Computer learning to read lips to detect emotions | KurzweilAI


Open the pod bay doors, HAL.

Scientists in Malaysia are teaching a computer to interpret human emotions based on lip patterns.

The system could improve the way we interact with computers and perhaps allow disabled people to use computer-based communications devices, such as voice synthesizers, more effectively and more efficiently, says Karthigayan Muthukaruppan of Manipal International University.

The system uses a genetic algorithm that refines its candidate solutions with each iteration, matching irregular ellipse-fitting equations to the shape of a human mouth displaying different emotions.
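To make the idea concrete, here is a minimal sketch of a genetic algorithm fitting an ellipse to lip-contour points. It is not the team's code: it simplifies the "irregular ellipse" to an ordinary axis-aligned ellipse, and the function names, population size, and fitness measure are all assumptions made for illustration.

```python
# Hedged sketch (not the researchers' implementation): a genetic algorithm
# that fits an axis-aligned ellipse (cx, cy, a, b) to 2D lip-contour points
# by minimizing the deviation of the implicit ellipse equation from 1.
import math
import random

def fitness(params, points):
    """Lower is better: mean |((x-cx)/a)^2 + ((y-cy)/b)^2 - 1| over the contour."""
    cx, cy, a, b = params
    if a <= 0 or b <= 0:
        return float("inf")
    return sum(abs(((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 - 1)
               for x, y in points) / len(points)

def mutate(params, scale=0.5):
    """Perturb each ellipse parameter with small Gaussian noise."""
    return tuple(p + random.gauss(0, scale) for p in params)

def crossover(p1, p2):
    """Build a child by taking each parameter from either parent at random."""
    return tuple(random.choice(pair) for pair in zip(p1, p2))

def fit_ellipse_ga(points, pop_size=60, generations=200):
    # Seed the population with random candidates roughly centred on the data.
    xs, ys = [p[0] for p in points], [p[1] for p in points]
    cx0, cy0 = sum(xs) / len(xs), sum(ys) / len(ys)
    span = max(max(xs) - min(xs), max(ys) - min(ys)) or 1.0
    population = [(cx0 + random.uniform(-1, 1), cy0 + random.uniform(-1, 1),
                   random.uniform(0.1, span), random.uniform(0.1, span))
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=lambda p: fitness(p, points))
        elite = population[: pop_size // 4]           # keep the best quarter
        children = []
        while len(elite) + len(children) < pop_size:  # refill with offspring
            child = crossover(random.choice(elite), random.choice(elite))
            children.append(mutate(child))
        population = elite + children
    return min(population, key=lambda p: fitness(p, points))

if __name__ == "__main__":
    # Synthetic "lower lip" contour: points sampled from a known ellipse.
    true = (0.0, -2.0, 3.0, 1.2)
    pts = [(true[0] + true[2] * math.cos(t), true[1] + true[3] * math.sin(t))
           for t in [i * 2 * math.pi / 40 for i in range(40)]]
    print("recovered ellipse:", fit_ellipse_ga(pts))
```

In practice the contour points would come from lip detection in a face image rather than from a synthetic ellipse, but the loop structure of select, crossover, and mutate is the same.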

They have used photos of individuals from Southeast Asia and Japan to train a computer to recognize the six commonly accepted human emotions (happiness, sadness, fear, anger, disgust, and surprise) plus a neutral expression. The upper and lower lips are each analyzed as two separate ellipses by the algorithm.
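Once each lip is reduced to ellipse parameters, those parameters become a feature vector that any standard classifier can map to the seven expression labels. The article does not name the team's classifier, so the nearest-centroid rule below is purely illustrative, and the feature layout is an assumption.

```python
# Hedged sketch: classify an expression from lip-ellipse features.
# Assumed feature vector: the semi-axes of the upper- and lower-lip ellipses.
from collections import defaultdict

LABELS = ["happiness", "sadness", "fear", "anger", "disgust", "surprise", "neutral"]

def train_centroids(samples):
    """samples: list of (feature_vector, label); returns label -> mean vector."""
    sums, counts = {}, defaultdict(int)
    for vec, label in samples:
        if label not in sums:
            sums[label] = list(vec)
        else:
            sums[label] = [s + v for s, v in zip(sums[label], vec)]
        counts[label] += 1
    return {lab: [s / counts[lab] for s in sums[lab]] for lab in sums}

def classify(vec, centroids):
    """Return the label whose centroid is nearest in squared Euclidean distance."""
    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(vec, c))
    return min(centroids, key=lambda lab: dist(centroids[lab]))
```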

The team’s algorithm can successfully classify the six emotions and the neutral expression described above, the scientists say.

No word on whether the system will be deployed on a manned mission to Mars.
