LONDON, 27 June 2006 — A raised eyebrow, a quizzical look and a nod of the head are just a few of the gestures computers could soon use to read people’s minds.
An “emotionally aware” computer being developed by British and American scientists will be able to read an individual’s thoughts by analyzing a combination of facial movements that represent underlying feelings.
“The system we have developed allows a wide range of mental states to be identified just by pointing a video camera at someone,” said Professor Peter Robinson of Cambridge University in England.
He and his collaborators believe the mind-reading computer’s applications could range from improving people’s driving skills to helping companies tailor advertising to people’s moods. “Imagine a computer that could pick the right emotional moment to try to sell you something, a future where mobile phones, cars and websites could read our mind and react to our moods,” he added.
The technology is already programmed to recognize 24 facial expressions generated by actors. Robinson hopes to gather more data from the public, to determine whether someone is bored, interested or confused, or whether they agree or disagree, when the system is unveiled at a science exhibition in London on July 3.