‘Trash talk’ hurts even when it comes from a robot
Updated: Nov 21, 2019 10:30 IST
Discouraging words such as “I have to say you are a terrible player” can be disturbing for people even when uttered by a robot, according to a study presented here.
In the study, people who played a game with a commercially available humanoid robot known as Pepper performed worse when the machine discouraged them, and better when it encouraged them.
The researchers noted that some of the 40 study participants were technically sophisticated and fully understood that a machine was the source of their discomfort.
“One participant said, ‘I don’t like what the robot is saying, but that’s the way it was programmed so I can’t blame it,’” said lead author Aaron M. Roth, who conducted the study while he was a master’s student at Carnegie Mellon University (CMU) in the US.
The study, presented last month at the IEEE International Conference on Robot & Human Interactive Communication (RO-MAN) in New Delhi, found that, overall, human performance ebbed regardless of technical sophistication.
The study is a departure from typical human-robot interaction studies, which tend to focus on how humans and robots can best work together, the researchers said.
“This is one of the first studies of human-robot interaction in an environment where they are not cooperating,” said Fei Fang, an assistant professor in the Institute for Software Research at CMU.
The findings have enormous implications for a world where the number of robots and internet of things (IoT) devices with artificial intelligence capabilities is expected to grow exponentially, the researchers noted.
“We can expect home assistants to be cooperative, but in situations such as online shopping, they may not have the same goals as we do,” said Fang.
The researchers designed a study in which humans would compete against a robot in a game called “Guards and Treasures,” to analyse rationality.
This is a typical game used to study defender-attacker interaction in research on security games.
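In such security games, a defender commits to a probabilistic guarding strategy and a fully rational attacker picks the target with the highest expected payoff; a player's rationality can then be measured by how close their choices come to that best response. The sketch below illustrates this setup in Python under assumed, hypothetical payoffs (the function name and all numbers are illustrative, not the study's actual parameters):

```python
# Illustrative sketch of a tiny "guards and treasures" style security game,
# assuming a standard defender-attacker payoff model. All values below are
# hypothetical and only demonstrate the best-response calculation.

def attacker_best_response(coverage, rewards, penalties):
    """Return the target a perfectly rational attacker would pick.

    coverage[i]  -- probability that target i is guarded
    rewards[i]   -- attacker payoff if target i is unguarded
    penalties[i] -- attacker payoff if target i is guarded (usually negative)
    """
    expected = [
        c * p + (1 - c) * r
        for c, r, p in zip(coverage, rewards, penalties)
    ]
    return max(range(len(expected)), key=expected.__getitem__)

# Example: three treasures, one guard spread probabilistically across them.
coverage = [0.6, 0.3, 0.1]   # hypothetical defender mixed strategy
rewards = [10, 6, 3]         # attacker's value of each unguarded treasure
penalties = [-5, -5, -5]     # attacker's loss if caught

target = attacker_best_response(coverage, rewards, penalties)
# → 1: the mid-value treasure, because the highest-value one is guarded often
```

A participant whose picks match this best response more often is behaving more rationally, which is the quantity the researchers tracked across repeated rounds.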
Each participant played the game 35 times with the robot, receiving either encouraging words or dismissive remarks from it throughout.
Although the human players’ rationality improved as the number of games played increased, those who were criticised by the robot didn’t score as well as those who were praised, the researchers found.
It is well established that an individual’s performance is affected by what other people say, said Afsaneh Doryab, who was a scientist at CMU’s Human-Computer Interaction Institute (HCII) during the study.
However, the study shows that humans also respond to what machines say, said Doryab, now an assistant professor at the University of Virginia.
A robot’s ability to prompt responses could have implications for automated learning, mental health treatment and even the use of robots as companions, she said.
(This story has been published from a wire agency feed without modifications to the text. Only the headline has been changed.)