id="info"


The machine knows how you feel: How AI can detect emotion

2016 is shaping up to be a year of significant advances in AI. One of the most important breakthroughs? AI understanding us.

Image: Arndt Vladimir

One of the biggest fears about the rise of AI is that computers will begin to understand our emotional state. But having our computers understand us might be exactly what gets us the help we need.

TechRepublic recently offered some predictions for AI in 2016, which could turn out to look like "2015 on steroids." According to Andrew Moore, dean of Carnegie Mellon's School of Computer Science, one of the biggest areas where AI will improve is in reading emotions.

"In addition to understanding written language and spoken language," said Moore, "it helps for computers to understand the emotional state of the person they're communicating with. There have been big breakthroughs in how they can understand that emotional state."

Here are three examples of how advances in AI and emotions are improving our lives, according to new studies from top research universities.

1. Educating children

Justine Cassell, professor at the Human-Computer Interaction Institute at Carnegie Mellon, studies how children learn through their interactions with computers. In her experiments, children work with a computer simulation of another child that can respond in a realistic way. (In the past, the other "person" was a simulation of a video game character.) The result? "There's a huge increase in the learning outcomes when the student is engaged with a simulated child that reacts realistically to their own emotional state," said Moore.

2. Treating depression

At the University of Pittsburgh, Jeff Cohn is examining how people respond to treatments for depression by having a computer watch them and monitor facial cues. Subjects came in once a week for a short conversation. High-resolution cameras can detect enough skin movement to determine which facial muscles are being used at any given time, and using FACS (the Facial Action Coding System), the researchers map slight facial movements, both deliberate and subconscious, to emotional states.
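FACS works by breaking expressions into numbered "action units" (AUs), each tied to a specific facial muscle movement; combinations of AUs are then associated with emotional states. The sketch below is purely illustrative: it assumes a hypothetical detector has already reported which AUs are active in a video frame, and it uses commonly cited AU-to-emotion pairings. It is not the Pittsburgh team's actual system.

```python
# Illustrative only: a toy mapping from FACS action units (AUs) to candidate
# emotions. The AU-emotion pairings follow commonly cited FACS conventions;
# the detector, scoring rule, and function names are hypothetical.

# Commonly cited AU combinations for a few basic emotions.
EMOTION_PROTOTYPES = {
    "happiness": {6, 12},        # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},     # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},  # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},  # brow lowerer + lid/lip tighteners
}

def score_emotions(active_aus):
    """Score each prototype by the fraction of its AUs observed in the frame."""
    return {
        emotion: len(prototype & active_aus) / len(prototype)
        for emotion, prototype in EMOTION_PROTOTYPES.items()
    }

# Example: AUs a hypothetical detector reported for one video frame.
detected = {1, 4, 15, 17}
print(max(score_emotions(detected).items(), key=lambda kv: kv[1]))
# -> ('sadness', 1.0)
```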

According to Moore, "we're at the stage where it would be practical to put this into handsets or tablets, so the tablet itself, subject to privacy questions, can actually get an idea of whether you're having a good time or bad time with whatever you're doing on the tablet, whether it's a video conference or playing a game or reading a book."

3. Diagnosing medical issues

"Many patients find it hard to fully describe everything that's going on in their minds when they're discussing their condition," said Moore.

To tackle this issue, Professor LP Morency at Carnegie Mellon's School of Computer Science is showing how robots that understand emotion, compared with robots that don't, can draw out information from patients that leads to better diagnoses.


About

Hope Reese is a Staff Writer for TechRepublic. She covers the intersection of technology and society, examining the people and ideas that transform how we live today.
