News Release

Concern over growing use of AI chatbots to stave off loneliness

Experts warn of a generation learning to form emotional bonds with entities that lack the capacity for empathy and care

Peer-Reviewed Publication

BMJ Group

AI chatbot systems, such as ChatGPT, Claude, and Copilot, are increasingly used as confidants of choice, but turning to AI chatbots for companionship and emotional support is a cause for concern, especially among younger people, say experts in the Christmas issue of The BMJ.

They warn that “we might be witnessing a generation learning to form emotional bonds with entities that lack capacities for human-like empathy, care, and relational attunement” and say evidence based strategies for reducing social isolation and loneliness are paramount.

In 2023, the US Surgeon General declared that the nation was experiencing a loneliness epidemic, constituting a public health concern on par with smoking and obesity, write Susan Shelmerdine and Matthew Nour. 

In the UK, nearly half of adults (25.9 million) report feeling lonely occasionally, sometimes, often, or always, and almost 1 in 10 experience chronic loneliness (defined as feeling lonely “often or always”). Younger people (aged 16-24 years) are also affected.

Given these trends, it’s no wonder that many are looking to alternative sources for companionship and emotional support, say the authors. ChatGPT, for example, has around 810 million weekly active users worldwide, and some reports rank therapy and companionship among the top reasons for its use.

Among younger people, one study found that a third of teenagers use AI companions for social interaction, with 1 in 10 finding these conversations more satisfying than those with humans, and 1 in 3 saying they would choose AI companions over people for serious conversations.

In light of this evidence, they say it seems prudent to consider problematic chatbot use as a new environmental risk factor when assessing a patient with mental state disturbance. 

In such cases, they propose that clinicians begin with a gentle enquiry into chatbot use, particularly during holiday periods when vulnerable people are most at risk, followed if necessary by more directed questions to assess compulsive use patterns, dependency, and emotional attachment.

They acknowledge that AI might bring benefits for improving accessibility and support for individuals experiencing loneliness, and say empirical studies are needed “to characterise the prevalence and nature of risks of human-chatbot interactions, to develop clinical competencies in assessing patients’ AI use, to implement evidence based interventions for problematic dependency, and to advocate for regulatory frameworks that prioritise long term wellbeing over superficial and myopic engagement metrics.”

Meanwhile, focusing on and building evidence based strategies for reducing social isolation and loneliness remains paramount, they conclude.

