News Release

AI algorithm to detect suicide risk takes next steps toward clinic with funding from NIH

Medical University of South Carolina researcher receives National Institute of Mental Health funding to refine an artificial intelligence algorithm that analyzes clinical notes to identify patients at risk of suicide.

Grant and Award Announcement

Medical University of South Carolina

Dr. Jihad Obeid

image: Dr. Jihad Obeid is co-director of the Biomedical Informatics Center at the Medical University of South Carolina.

Credit: Medical University of South Carolina

Jihad Obeid, M.D., co-director of the Biomedical Informatics Center at the Medical University of South Carolina (MUSC), has been awarded more than half a million dollars by the National Institute of Mental Health (NIMH) to refine an artificial intelligence (AI) algorithm that analyzes text in the electronic medical record to identify patients at risk of suicide.

Suicide is the 10th leading cause of death in the U.S., according to the American Foundation for Suicide Prevention, claiming an average of 47,000 lives each year, a problem worsened by the COVID-19 pandemic. Effective treatments are available for those at risk, but clinicians do not have a reliable way of predicting which patients are likely to attempt suicide.

Most current models for predicting suicide risk are based on tabulated or coded data in the electronic health record (EHR). However, 80% to 90% of the information in the EHR is found in the clinical notes. Until recently, it has been difficult to analyze those notes using computers.

Neural networks are the technology behind the branch of artificial intelligence known as deep learning. They have come into their own with the simultaneous advent of huge datasets, such as those available via the EHR, and greatly enhanced computing capacity. Neural networks can now use successive layers of artificial neurons to extract progressively more nuanced information from raw input data.
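As a loose illustration of the idea, a toy sketch (not the study's actual architecture or data): even a single logistic unit over bag-of-words counts can learn a risk signal directly from raw note text, and deep models stack many such layers. All notes and labels below are invented.

```python
# Toy illustration: learning a label from raw note text.
# A bag-of-words layer feeds a single logistic unit; real deep
# models stack many layers. All data here is invented.
import math

notes = [
    ("patient reports feeling hopeless and withdrawn", 1),
    ("expressed thoughts of self harm during visit", 1),
    ("routine follow up blood pressure well controlled", 0),
    ("annual physical no acute complaints today", 0),
]

# Vocabulary built from the toy notes themselves.
vocab = sorted({w for text, _ in notes for w in text.split()})

def featurize(text):
    words = text.split()
    return [words.count(w) for w in vocab]  # bag-of-words counts

weights = [0.0] * len(vocab)
bias = 0.0
lr = 0.5  # learning rate

def predict(x):
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))  # logistic activation

# Plain stochastic gradient descent on the logistic loss.
for _ in range(200):
    for text, label in notes:
        x = featurize(text)
        err = predict(x) - label  # gradient of the logistic loss
        bias -= lr * err
        for i, xi in enumerate(x):
            weights[i] -= lr * err * xi

for text, label in notes:
    print(text, "->", round(predict(featurize(text))))
```

After training, the unit separates the two kinds of toy notes; the study's deep networks do the analogous thing at scale, over real clinical narratives.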

In a 2020 article in the Journal of Medical Internet Research (JMIR) Medical Informatics, Obeid and his collaborators at the University of South Florida (USF) showed that these models, once trained, could identify patients at risk of intentional self-harm. 

“If deep learning models can be used to predict which patients are at risk for suicide based on clinical notes, then clinicians can refer high-risk patients early for appropriate treatment,” said Obeid.

In the article, Obeid and his collaborators describe the performance of an algorithm after it was trained on clinical notes from MUSC visits in which patients had been assigned an International Classification of Diseases (ICD) code relevant to intentional self-harm. To do so, they relied on MUSC’s research data warehouse, which was created with support from the South Carolina Clinical & Translational Research Institute and gives MUSC researchers access to patient electronic health record data once they have obtained the appropriate permissions.

The question was whether these trained models could “read” clinical notes on their own. Once trained, the models were indeed able to identify those charts automatically, with an accuracy of around 98.5%.

Although predicting future suicide risk proved more challenging when the algorithm was trained on historical clinical notes in the medical record, the algorithm still achieved an accuracy of almost 80% with relatively high sensitivity and precision. That is better than the accuracy achieved with most existing methodologies. 
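For readers unfamiliar with these metrics, a small worked example with invented counts (not the study's results) shows how accuracy, sensitivity and precision are derived from a model's confusion matrix:

```python
# Worked example of the reported metric types, using hypothetical
# counts (not the study's data). For 100 charts, suppose the model
# flags 40 as at-risk, of which 30 truly are (TP) and 10 are not (FP),
# and it misses 12 true cases (FN) while clearing 48 correctly (TN).
tp, fp, fn, tn = 30, 10, 12, 48

accuracy = (tp + tn) / (tp + fp + fn + tn)  # overall correctness
sensitivity = tp / (tp + fn)                # share of true cases caught
precision = tp / (tp + fp)                  # share of flags that were right

print(f"accuracy={accuracy:.2f} "
      f"sensitivity={sensitivity:.2f} precision={precision:.2f}")
```

With these made-up counts, accuracy is 0.78, sensitivity about 0.71 and precision 0.75; a model can score well on one metric while lagging on another, which is why the article reports all three.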

With funding from the recent grant, Obeid will further improve on the algorithm and validate its accuracy across institutions and against existing predictive models. 

Instead of ICD codes, however, the algorithm will be trained on data obtained not only from the EHR but also from the Centers for Disease Control and Prevention’s National Death Index, which will more clearly show which patients eventually die by suicide.

At MUSC, the algorithm’s accuracy at detecting and predicting suicide risk will be compared with that of the Mental Health Research Network model, an established means of predicting suicide risk based on structured data in the EHR.

At USF, collaborator Brian Bunnell, Ph.D., a clinical psychologist, will replicate the preliminary study conducted at MUSC, the results of which were reported in the JMIR article, training and then testing the algorithm, using USF EHR data. If the algorithm achieves similar accuracy at predicting suicide risk at USF, that will be an important step in validating its clinical usefulness across institutions.

Obeid and his colleagues will then use the data obtained from the studies funded by this one-year grant to apply for longer-term NIMH funding.

If they obtain the additional funding, they will focus increasingly on the implementation of the algorithm – how to take it from a research project to a clinical-decision aid. 

“So right now, it's all a research study, right? The algorithms haven't been implemented as clinical-decision support where you add those models into the clinical workflow,” explained Obeid. “For example, in a primary care setting, you could bring up an alert for clinicians to refer patients for treatment due to elevated risks of suicide.”

It will be important to guard against “alert fatigue” by explaining to clinicians how the algorithm works to detect risk, a concept known as explainable AI.

“If you're presenting a probability of risk to clinicians when they're seeing patients in the clinic, they are going to want to know why a patient is at risk,” said Obeid.

Explaining that risk and how it is determined will be important to encouraging clinician adoption of an eventual clinical-decision aid.

To remove barriers to implementation of the clinical-decision aid, Obeid and his collaborators will also explore the potential for transfer learning, the process of creating new AI models by fine-tuning previously trained neural networks for the task and setting at hand. Such training requires less time, cost and infrastructure than training models from scratch.
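A minimal sketch of the transfer-learning idea, using invented toy notes and a deliberately simple logistic model (not the project's actual networks): weights pretrained on one site's larger dataset are reused and briefly fine-tuned on a second site's smaller one.

```python
# Hedged sketch of transfer learning with a toy logistic model:
# reuse weights trained at "site A" and fine-tune briefly on
# "site B"'s smaller dataset. All notes and labels are invented.
import math

def featurize(text, vocab):
    words = text.split()
    return [words.count(w) for w in vocab]  # bag-of-words counts

def predict(weights, bias, x):
    z = bias + sum(w * xi for w, xi in zip(weights, x))
    return 1.0 / (1.0 + math.exp(-z))  # logistic activation

def train(data, vocab, weights, bias, epochs, lr=0.5):
    # Plain stochastic gradient descent on the logistic loss.
    for _ in range(epochs):
        for text, label in data:
            x = featurize(text, vocab)
            err = predict(weights, bias, x) - label
            bias -= lr * err
            for i, xi in enumerate(x):
                weights[i] -= lr * err * xi
    return weights, bias

# "Site A": more data, model trained from scratch.
site_a = [
    ("feeling hopeless and withdrawn lately", 1),
    ("thoughts of self harm reported", 1),
    ("routine follow up no concerns", 0),
    ("annual physical all normal", 0),
]
# "Site B": a small local dataset sharing much of the vocabulary.
site_b = [
    ("patient feeling hopeless again", 1),
    ("follow up visit no concerns", 0),
]

vocab = sorted({w for t, _ in site_a + site_b for w in t.split()})
w0, b0 = train(site_a, vocab, [0.0] * len(vocab), 0.0, epochs=200)

# Transfer: start from site A's weights, fine-tune far fewer epochs.
w1, b1 = train(site_b, vocab, list(w0), b0, epochs=20)

for text, label in site_b:
    print(text, "->", round(predict(w1, b1, featurize(text, vocab))))
```

Because the fine-tuned model inherits what site A already learned about the shared vocabulary, site B needs only a brief, cheap training pass, which is the practical appeal of the approach.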

“We hope that the outcome of this research will help health care providers to identify patients at risk more effectively, which in turn will improve suicide prevention,” said Obeid.


About MUSC

Founded in 1824 in Charleston, MUSC is the oldest medical school in the South as well as the state's only integrated academic health sciences center, with a unique charge to serve the state through education, research and patient care. Each year, MUSC educates and trains more than 3,000 students and nearly 800 residents in six colleges: Dental Medicine, Graduate Studies, Health Professions, Medicine, Nursing and Pharmacy. The state's leader in obtaining biomedical research funds, MUSC set a new high in fiscal year 2019, bringing in more than $284 million. For information on academic programs, visit

As the clinical health system of the Medical University of South Carolina, MUSC Health is dedicated to delivering the highest quality patient care available while training generations of competent, compassionate health care providers to serve the people of South Carolina and beyond. Comprising some 1,600 beds, more than 100 outreach sites, the MUSC College of Medicine, the physicians' practice plan and nearly 325 telehealth locations, MUSC Health owns and operates eight hospitals situated in Charleston, Chester, Florence, Lancaster and Marion counties. In 2020, for the sixth consecutive year, U.S. News & World Report named MUSC Health the No. 1 hospital in South Carolina. To learn more about clinical patient services, visit

MUSC and its affiliates have collective annual budgets of $3.2 billion. The more than 17,000 MUSC team members include world-class faculty, physicians, specialty providers and scientists who deliver groundbreaking education, research, technology and patient care.

Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.