News Release

When AI companions for lonely people seem a bit too human

Most may prefer voice companions, not robots

Peer-Reviewed Publication

Ohio State University

COLUMBUS, Ohio – Imagine a future in which lonely people can interact with social bots based on artificial intelligence (AI) to get the conversations and connection they crave.

While it sounds intriguing, a small preliminary study suggests people may not be comfortable with AI companions that look and talk too much like real humans.

“We think it may seem a little too creepy to have these embodied robots that act and look almost human,” said Kelly Merrill Jr., lead author of the study and a doctoral student in communication at The Ohio State University.

“People seemed to be more comfortable with AI companions that were voice-based, more like smartphones and smart speakers like Alexa or Siri.”

Merrill conducted the study with Jihyun Kim of the University of Central Florida and Chad Collins of St. Johns River State College. Their results were published recently in the journal Communication Research Reports.

The researchers wanted to learn more about the effect that the social presence and warmth of AI companions have on people’s views of them.

The study involved 106 college students who participated online. The students were only told the study was aimed at understanding their perceptions about technology.

They watched a 5-minute clip that the researchers edited from an episode of the television show Black Mirror. While the focus of the actual episode (“Be Right Back”) was somewhat different, the edited clip was designed to show a lonely woman named Martha talking to an AI companion named Ash.

Some participants watched a clip in which Martha interacted with Ash only by voice. Others watched a clip in which Ash was represented as a realistic-looking robot that talked and interacted with Martha.

After watching the clip, participants were asked whether they thought an AI companion like Ash would be useful to lonely people, and whether they would recommend Ash to lonely people.

Participants rated Ash on social presence, which is how much he seemed to really be there with Martha. Ash was also rated on warmth, defined as a feeling of friendliness and intimacy.

One hypothesis was that people would prefer an AI companion that scored high on social presence and acted more like a human, Merrill said.

But in this study, that was only the case for the version of Ash that appeared just through voice. Participants who saw this version were more likely to recommend Ash as a companion if they rated it higher on social presence.

But those who viewed Ash as an actual human-looking robot were not more likely to recommend Ash if they thought he had more social presence.

The study didn’t ask participants why, but the researchers believe it had to do with what scientists call the “uncanny valley.”

“People become uneasy when they see robots that come close to seeming human, but are slightly off,” Merrill said.

“In the clips, the actor playing the robot version of Ash did a good job of seeming slightly mechanistic and not quite human. It creeps people out, and that may be why more social presence in the embodied Ash didn’t make people more likely to recommend the bot for lonely people.”

The study found that participants’ views of how warm Ash seemed (his friendliness and intimacy) had no effect on whether they would recommend him to lonely people. That was true regardless of whether they viewed the robot version of Ash or the voice version.

“It may be that people think an AI companion for lonely people would be good for casual conversation, but should not be a replacement for a more intimate and deep friendship,” Merrill said.

Merrill noted that this was a preliminary investigation and much more work needs to be done on the interaction of social presence, warmth and AI companions. But this work suggests people right now prefer the familiar.

“We already talk to disembodied AI through our smartphones and smart speakers, so we are used to that and comfortable with those kinds of interactions,” he said.
