Teaching AI How to Feel Could Make for Better Computers

Boosting empathy in software is a challenging task

  • Researchers are trying to build computers that can understand human emotions.
  • AI that can understand feelings might be better suited for tasks like working with patients.
  • But experts say there's a long way to go before your devices understand your moods.

Kilito Chan / Getty Images

Emotional AI could eventually help your devices understand you better.

A computer scientist is working to help artificial intelligence (AI) comprehend human emotions. Experts say that emotional AI capable of understanding how people feel could make for better computing.

"Even if the system didn't truly understand them in the traditional sense, it would have an immense impact in almost every aspect of life," Ajay Satpute, the director of the Affective and Brain Science Lab at the Institute for Experiential AI at Northeastern University, told Lifewire in an email interview. "We could help caregivers understand children. We could help children learn how to read emotions. We could accelerate learning in individuals with emotional perception issues, like Autism Spectrum Disorder."

AI That Gets You

Aniket Bera, a professor of computer science at Purdue University, is trying to build AI models and systems that are more humanlike and more adept at interacting with humans. His approach is to program devices to incorporate an understanding of nonverbal cues and communication.

"When a friend asks you how you are, you can say, 'I'm fine!' in an upbeat tone, and it means something completely different than if you say, 'I'm fine,' like Eeyore," Bera said in a news release. "Computers usually just pay attention to the content and ignore the context."

Bera and his colleagues are trying to develop emotional AI by observing and analyzing facial expressions, body language, involuntary and subconscious movements and gestures, eye movements, speech patterns, intonations, and different linguistic and cultural parameters. He said that training AI on many types of human input improves communication and better equips the AI to respond to humans in a more appropriate, even emotive, manner.

Stevens Institute of Technology professor Jordan Suchow said in an email interview that there's a tendency among technologists to think of emotions as being a physical property of a person's face that can be immediately detected and recognized in a photo. 

"But emotions are complex mental states that play out on the face in a way that can depend on the context—think of the feigned smile, the knowing glance, or the varied contortions that our face forms during speech," he added. 

Satpute pointed out that AI's ability to understand human emotions is likely more limited than we realize. He said that most AI-driven approaches to emotion focus on specific channels of information—the face, the voice, biological signals—and try to guess what emotion might be occurring.

"They assume that these signals can be used as definitive signatures of emotions and generalize them across contexts and individuals," Satpute added. "In reality, emotional responses and their signals are highly situation-dependent and vary among individuals and cultures." 

And there's also the question of whether AI truly understands at all. Satpute said that philosophers and cognitive scientists have long debated the limitations of AI in understanding or representing the meaning of anything. 

"If the human ability to perceive and experience emotions requires understanding the meaning of certain signals, which we know to be highly context-dependent, then getting AI to understand emotion may be one of AI's greatest challenges," he added. 

The Benefits of Emotional AI

Computers that understand feelings could be a boon for patients. Bob Rogers, the CEO of Oii.ai, told Lifewire in an email that in healthcare and mental health applications, it would be dangerous for a chatbot to deliver significant information to patients without sensitivity to that information's potential emotional impact. A diagnosis with a poor prognosis delivered in an insensitive way, he said, could easily cause harm.

A robot hand reaching toward a baby who is reaching toward the hand.

Westend61 / Getty Images

"The flip side is that in many healthcare applications, especially in mental health, the ability to understand complex nuances in emotional content can dramatically increase the effectiveness of diagnosis and treatment," Rogers added. 

It might be a long time, though, before computers can guess your mood. Ján Záborský, a communications manager at the biometric security firm Innovatrics, said his company developed a system to verify that the person in front of the camera is a live human.

"One takeaway from its development is that many people automatically start to smile as soon as they look into the mobile's selfie camera, regardless of their emotional state—this is just to illustrate why devising an emotion detection system is such a challenge," he added.
