New ‘emotional’ robots aim to read human feelings

The Omron Forpheus robot plays table tennis against a human at the International Consumer Electronics Show in Las Vegas on January 10. The consumer show features robots that are becoming more humanlike by acquiring “emotional intelligence” and empathy. (AFP)
Updated 12 January 2018


LAS VEGAS: The robot called Forpheus does more than play a mean game of table tennis. It can read body language to gauge its opponent’s ability, and offer advice and encouragement.
“It will try to understand your mood and your playing ability and predict a bit about your next shot,” said Keith Kersten of Japan-based Omron Automation, which developed Forpheus to showcase its technology.
“We don’t sell ping pong robots but we are using Forpheus to show how technology works with people,” said Kersten.
Forpheus is among several devices at this week’s Consumer Electronics Show that highlight how robots can become more humanlike by acquiring “emotional intelligence” and empathy.
Although this specialization is still emerging, the notion of robotic empathy appeared to be a strong theme at the huge gathering of technology professionals in Las Vegas.
Honda, the Japanese auto giant, launched a new robotics program called Empower, Experience, Empathy including its new 3E-A18 robot which “shows compassion to humans with a variety of facial expressions,” according to a statement.
Although empathy and emotional intelligence do not necessarily require a humanoid form, some robot makers have been working on form as well as function.
“We’ve been working very hard to have an emotional robot,” said Jean-Michel Mourier of France-based Blue Frog Robotics, which makes the companion and social robot called Buddy, set to be released later this year.
“He has a complex brain,” Mourier said at a CES event. “It will ask for a caress or it will get mad if you poke him in the eye.”
Other robots, such as Qihan Technology’s Sanbot and SoftBank Robotics’ Pepper, are being “humanized” by teaching them to read and react to people’s emotional states.
Pepper is “capable of interpreting a smile, a frown, your tone of voice, as well as the lexical field you use and non-verbal language such as the angle of your head,” according to SoftBank.
Developing emotional intelligence in robots is a difficult task, combining computer “vision” that interprets objects and people with software that can respond to them accordingly.
“Empathy is the goal: the robot is putting itself in the shoes of the human, and that’s about as hard as it gets,” said Patrick Moorhead, a technology analyst with Moor Insights & Strategy.
“It’s not just about technology, it’s about psychology and trust.”
Moorhead said this technology is still in the early stages but holds promise in some areas, noting that there is strong interest in Japan amid a lack of caretakers for the elderly population.
“In some ways it can be a bit creepy if you’re crying and the robot is trying to console you,” he said.
“If you have no friends, the next best thing is a friend robot, and introverts might feel more comfortable talking to a robot.”
One CES exhibitor offered a promise of going further than the current devices by developing an “emotion chip” which can allow robots to process emotions in a manner similar to humans.
“There has been a lot of research on detecting human emotions. We do the opposite. We synthesize emotions for the machine,” said Patrick Levy-Rosenthal, founder of New York-based Emoshape, which is producing its chip for partners in gaming, virtual and augmented reality and other sectors.
It could be used to power a humanoid robot, or other devices. For example, an e-reader could better understand a text to infuse more emotion in storytelling.
As for Forpheus, Kersten said the robot’s ability to help people improve their table tennis skills could have numerous applications for sports, businesses and more.
“You could sense how people are feeling, if they are attentive or in a good state to drive,” he said.
Another key application could be in health care, he said: “In an elderly patient facility, you can determine if someone is in distress and needs help.”


King Abdul Aziz City for Science and Technology unveils self-guided Black Shark boat at 38th GITEX Technology Week

The development of the Black Shark smart boat is part of a KACST initiative to localize and transform transport technology and logistics, to help achieve the aims of Vision 2030. (SPA)
Updated 20 October 2018


  • KACST’s self-driving trucks are equipped with electronic pairing technologies, which improve the shipping and distribution of goods and reduce human error

JEDDAH: King Abdul Aziz City for Science and Technology (KACST) has unveiled its Black Shark self-guided boat at the 38th GITEX Technology Week in Dubai. The vessel, which can carry out coastal surveillance and many other tasks, was developed in collaboration with Taqnia for Robotics and Smart Systems.
The development of the craft is part of a KACST initiative to localize and transform transport technology and logistics, to help achieve the aims of Vision 2030.
The boat includes sensor systems that allow it to monitor and create a 3D map of the 200-meter area surrounding it, and automated control technology that gives it the ability to navigate independently and avoid collisions without human input. It can also be equipped with a flexible range of weapons, acting as a firearms platform that uses gyroscopic self-balancing technology. It can survey beaches at a range of 15 kilometers, identify its precise location with a margin of error of less than 20 centimeters using differential GPS, and specify, monitor and track targets.
The Black Shark also has long-range radar that covers up to 150 kilometers, and a telecommunication system to track its location, monitor its status and connect to multiple domains through command centers that allow wireless communication and remote control. It is fitted with a digital camera powered by electro-optic and infrared technology that can produce HD-quality video, and also has night vision capability.
As part of its initiative to develop transport technology and logistics, KACST has also worked on automated control technology, including self-driving heavy-duty trucks, with the University of California, Berkeley. These trucks are equipped with electronic pairing technologies, which improve the shipping and distribution of goods, reduce human error, preserve resources, and reduce harmful emissions and fuel consumption.
The same technology can also, for example, transform a four-wheel-drive vehicle into a remote-controlled vehicle equipped with video cameras, infrared technology, a microphone and a control device wirelessly connected to a command center, where an operator can guide it using images from the video cameras.