Saturday, September 24, 2011

The structure of owl brains

As I'm coding the details of the Talking Owl chatterbots for the Talking Owl Project, I'm discovering something kind of cool about the way that the object/class organization of the code is panning out.

So far, it seems to me that each Talking Owl chatterbot (class Owl) requires four objects as separate components, each handling a different aspect of "mental processing" while the owl works through input from a user:

- A component of class LanguageParser, which represents an intermediate syntactic representation of a sentence in short-term memory, and contains the methods to transform a sentence into that syntactic representation.
- A component of class SynsetStore, which contains the lexical knowledge in long-term memory required to convert the syntactic representation into meanings.
- A component of class TripleStore, which contains general world knowledge stored in long-term memory.
- A component of class MentalModel, which represents active knowledge in working memory, and integrates the meanings derived from the input sentence with the world knowledge retrieved from long-term memory to build up a representation of the sentence and the inferences that can be drawn from it.
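To make that concrete, here is a minimal sketch of how the four components might compose inside an Owl, using Python for illustration. The class names come from the project, but the method names (parse, meanings_for, facts_about, integrate, inferences) are hypothetical stand-ins for whatever the real interfaces turn out to be:

```python
# A minimal sketch of an Owl's composition. Class names are from the
# project; all method names here are hypothetical illustrations.

class Owl:
    def __init__(self, parser, synsets, triples, model):
        self.parser = parser    # LanguageParser: short-term syntactic memory
        self.synsets = synsets  # SynsetStore: lexical long-term memory
        self.triples = triples  # TripleStore: general world knowledge
        self.model = model      # MentalModel: working memory and inference

    def hear(self, sentence):
        syntax = self.parser.parse(sentence)          # sentence -> syntax
        meanings = self.synsets.meanings_for(syntax)  # syntax -> meanings
        facts = self.triples.facts_about(meanings)    # recall related facts
        self.model.integrate(meanings, facts)         # build the mental model
        return self.model.inferences()                # what can we conclude?
```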

Why is this cool? Because this modularity roughly corresponds to real modularity in the brain!

This is a diagram of a Talking Owl Brain:

[Diagram: the four components (LanguageParser, SynsetStore, TripleStore, MentalModel) mapped onto their corresponding brain regions.]

The LanguageParser roughly corresponds to the function of Broca's area: syntactic processing of language. The SynsetStore corresponds roughly to the function of Wernicke's area: where lexical knowledge is stored. The TripleStore corresponds roughly to the hippocampus: long-term associative memory. And finally, the MentalModel corresponds roughly to (of course) the prefrontal cortex: the area where active memory representations are used to create mental models and perform reasoning!
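To give a flavor of what one of those long-term stores looks like, here is a toy TripleStore, assuming a plain subject-predicate-object design; the add and query methods are my invented illustration, not necessarily the project's real interface:

```python
from collections import defaultdict

class TripleStore:
    """Long-term associative memory as subject-predicate-object triples."""

    def __init__(self):
        self.by_subject = defaultdict(set)

    def add(self, subject, predicate, obj):
        # Store one fact, indexed by its subject for quick association.
        self.by_subject[subject].add((predicate, obj))

    def query(self, subject):
        # Retrieve everything known about a subject, association-style.
        return self.by_subject[subject]

store = TripleStore()
store.add("owl", "is-a", "bird")
store.add("bird", "can", "fly")
print(store.query("owl"))  # {('is-a', 'bird')}
```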

I don't know why I find this so satisfying. Maybe in some way it affirms for me the idea that these functional components of language comprehension really are true functional modules: whether the "Language Comprehension Machine" is programmed in an object-oriented language or evolved over millions of years, the outcome is the same!
