News Release

Study finds limitations to CPR directions given by artificial intelligence voice assistants, recommends use of emergency services

Peer-Reviewed Publication

Mass General Brigham

Voice Assistant Response to CPR Questions

image: Responses to CPR questions by artificial intelligence voice assistants (VAs) are presented, colored according to whether the response was related to CPR (providing information pertaining to CPR or recommending the use of emergency services), unrelated to CPR, or an acknowledgment that the VA did not know the answer. Responses shown here are abbreviated versions of full response transcriptions.

Credit: Landman et al., JAMA Network Open

When cardiopulmonary resuscitation (CPR) is administered outside the hospital by laypersons, it is associated with a two- to four-fold increase in survival. Bystanders may obtain CPR instructions from emergency dispatchers, but these services are not universally available and are not always used. In these emergency situations, artificial intelligence (AI) voice assistants may offer easy access to crucial CPR instructions. Researchers at Mass General Brigham, New York’s Albert Einstein College of Medicine, and Boston Children’s Hospital investigated the quality of CPR directions provided by AI voice assistants and found that the directions were often irrelevant and inconsistent.

Researchers posed eight verbal questions to four voice assistants: Amazon’s Alexa, Apple’s Siri, Google Assistant on a Nest Mini, and Microsoft’s Cortana. They also typed the same queries into ChatGPT. All responses were evaluated by two board-certified emergency medicine physicians.

Nearly half of the responses from the voice assistants were unrelated to CPR, instead providing, for example, information about a movie called CPR or a link to Colorado Public Radio News, and only 28 percent suggested calling emergency services. Only 34 percent of responses provided any CPR instruction, and just 12 percent provided verbal instructions. Among the platforms tested, ChatGPT provided the most relevant information for all queries. Based on these findings, the authors concluded that relying on existing AI voice assistant tools may delay care and may not yield appropriate information. Limitations of the study included the small number of questions asked and the lack of assessment of whether the voice assistants’ responses changed over time.

“Our findings suggest that bystanders should call emergency services rather than relying on a voice assistant,” said senior author Adam Landman, MD, MS, MIS, MHS, chief information officer and senior vice president of digital at Mass General Brigham and an attending emergency physician. “Voice assistants have potential to help provide CPR instructions, but need to have more standardized, evidence-based guidance built into their core functionalities.”

Read more in JAMA Network Open.
