The rationale for attempting to mimic emotional intelligence in a computer is not immediately clear. Star Trek fans will remember the difficulties Commander Data experienced after the introduction of his emotion chip. In fact, classic Western views of intelligence often pitted emotion and reason against each other: emotion was seen as a disorganizing factor, harmful to reasoning and logic. With the recent introduction of the concept of emotional intelligence, however, the many positive contributions of emotional factors to intellectual functioning have been highlighted. Furthermore, understanding emotion (and, to a lesser extent, exhibiting it) may prove essential to any system designed to interact with human beings. Of course, implementing such a system represents an enormous challenge.
So why is understanding emotions crucial? The most direct reason, and the obvious one after a moment's introspection, is that emotion is inextricably tied to everything we say and do. In more concrete terms, consider the advantages of a computer that understood emotion. First, it could provide vastly improved interactions with users. More importantly, emotional intelligence would be an enormous leap forward for systems attempting to learn about people and the world in which they live.
Also of interest to AI research, and to psychology, is the idea of simulating emotion in computers. These simulations can be internal, external, or both, depending on the motivations of those designing the system.

External simulation, that is, exhibiting emotion purely for the benefit of those interacting with the system, is more difficult than it may seem. While psychologists have long known that emotion has many physical correlates - voice changes, blushing, pupil dilation, etc. - reproducing them proves difficult. Often the effect is exaggerated, so much so that the emotional display becomes obviously artificial. For an example of the type of work being done, click here for a link to Janet Cahn's master's thesis work on generating expression in synthesized speech.

There is also work being done on internal emotion, that is, programming a computer to actually 'experience' emotions. Emotion provides us with motivation and drive, with a set of personal preferences, with a uniqueness that is desirable in a sophisticated AI. In some cases, most notably MIT's 'Kismet', the two (internal and external emotions) are combined in a robot that seems both to possess and to display emotion. For more information, click here to access Kismet's homepage (this page is moderately technical). Both Cahn's work and Kismet are part of the MIT Media Lab's research program. Students are encouraged to explore the various research projects being pursued at MIT -- the lab is at the forefront of AI and emotional AI research.