Children's Hospital Colorado
Advances and Answers in Pediatric Health

Advancing child health and pediatric specialty care through clinical discovery, multidisciplinary research and innovation

Revolutionizing Suicide Prevention with AI



Can artificial intelligence models make it easier to predict when kids are at risk of suicide, and to intervene successfully?

Suicide is the leading cause of death among children ages 10 to 14 in Colorado, and the second leading cause of death among teens nationwide. Research has identified many factors contributing to this startling trend — including a history of depression, substance use and family conflict — but societal stigmas, lack of clinical resources and misunderstandings around mental health challenges have made it difficult for physicians to pinpoint the individual circumstances leading to suicidality.

AI-powered predictive analytics

Now, with the help of artificial intelligence (AI), Children’s Hospital Colorado psychiatrist Joel Stoddard, MD, MAS, is driving research that could finally predict suicide risk in kids. Dr. Stoddard serves as principal investigator at Children’s Colorado’s Emotion and Development Lab, which seeks to understand the neurological underpinnings of social emotional challenges in children and adolescents.

In the lab, Dr. Stoddard and his team use AI to learn which patterns are most likely to precede a suicide attempt. The AI tool then helps the team build models that predict future suicidal thoughts and behaviors, as well as the interventions most likely to save a patient's life.

“Pattern detection is really important. You can do it as a human, but it takes a long time — it’s like finding the needle in the haystack,” Dr. Stoddard says. “We’re actually able to predict people’s clinical outcomes much faster using data about them, which includes both clinical impressions and computer-derived measures of their behavior.”
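The kind of risk modeling described above can be illustrated with a minimal sketch. The example below trains a tiny logistic-regression classifier on made-up screening features; the feature names, data and thresholds are purely illustrative assumptions, not the lab's actual pipeline or data.

```python
import math

def train_logreg(rows, labels, lr=0.1, epochs=500):
    """Fit a small logistic-regression risk model by gradient descent."""
    n = len(rows[0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(rows, labels):
            z = b + sum(wi * xi for wi, xi in zip(w, x))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            g = p - y                         # gradient of log-loss w.r.t. z
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
    return w, b

def risk_score(w, b, x):
    """Probability-like score in (0, 1) for one patient's feature vector."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical features: [depression severity 0-1, prior attempt 0/1, sleep problems 0/1]
rows = [[0.9, 1, 1], [0.8, 1, 0], [0.7, 0, 1],
        [0.2, 0, 0], [0.1, 0, 0], [0.3, 0, 1]]
labels = [1, 1, 1, 0, 0, 0]   # toy outcome labels
w, b = train_logreg(rows, labels)
```

Once fitted, the model scores a new patient's feature vector, so a higher-risk profile yields a higher score than a lower-risk one; real clinical models would add many more features, calibration and validation.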

Enhancing suicide risk assessments with diverse data

The models created in Dr. Stoddard’s lab are powered by numerous real-world data sources. The first database comes directly from clinical screenings at Children’s Colorado, which use the Ask Suicide-Screening Questions (ASQ) to assess suicide risk quickly and effectively with just five questions. Children’s Colorado adopted the ASQ in 2017, but in 2022, Lauren Wood, PhD, and Shaela Moen, MPH, helped implement another standardized screening for depression from the Hospital Transformation Project — a Colorado state initiative aimed at improving quality of care. These suicide and depression screenings are now conducted during every interaction with patients over the age of 10, and the answers are incorporated into the patient’s electronic health record.
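The ASQ's decision logic is simple enough to sketch: four yes/no screening items, plus an acuity question asked only when any item is positive. The function below is a simplified illustration of that published pathway; the field names and the return labels are illustrative assumptions, not Children's Colorado's implementation.

```python
def asq_screen(answers, q5_now=None):
    """Classify a simplified ASQ screen.

    answers: dict mapping the four yes/no screening items to True/False.
    q5_now: answer to the acuity question ("thoughts of killing yourself
            right now?"), asked only when any screening item is positive.
    Returns "negative", "non-acute positive", or "acute positive".
    """
    if not any(answers.values()):
        return "negative"          # no screening item endorsed
    if q5_now:
        return "acute positive"    # endorsed item plus current ideation
    return "non-acute positive"    # endorsed item, no current ideation
```

In a real deployment, results like these would be written to the electronic health record alongside follow-up guidance, as the article describes.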

Since the AI tool is searching for patterns in data, every bit of clinical information makes its models more accurate. “We’re trying to amplify the power of predictive screening,” Dr. Stoddard says. “We don’t just want to predict what’s going to happen in the next month, but also for years to come.”

Moen and Dr. Wood have also helped facilitate Children’s Colorado’s partnership with Zero Suicide — a national framework for improving suicide care and prevention across healthcare settings. Collaborating with Zero Suicide enables Dr. Stoddard’s team to access data from the Colorado Department of Public Health and Environment, meaning the AI models incorporate information from real cases of children who’ve died by suicide.

Additionally, Dr. Stoddard and his team are beginning to look at data from the ABCD Study, the largest long-term study of brain development and child health in the United States, spanning more than 10,000 kids across 21 different research sites. Over the next year, Dr. Stoddard and his team aim to integrate key findings from the ABCD dataset into patients’ electronic health records. This data will appear in Epic alongside guidance for following up with patients after they’ve screened positive for depression and/or suicide — another result of the Zero Suicide implementation by Moen and Dr. Wood.

This will make it possible for a provider to look at a child’s chart, instantly assess a complex level of risk based on clinical and nationwide data, and view suggested treatment methods tailored to that patient’s risk factors, resulting in a level of youth suicide prevention care previously unattainable in clinical settings.

Predicting suicide with precision

Drawing a wider representation of life experiences from a diverse range of data sources is critical for reducing bias and improving accuracy in suicide prevention, while ensuring that the resulting interventions and treatments are more equitable. For instance, there may be higher rates of suicidality among teens from a certain ZIP code or kids with specific sleep habits — data that might not traditionally be considered in a suicide screening assessment.

“I’m interested in looking at the interactions and the relationships between these predictors,” Dr. Stoddard says. “This can help us understand whether there are hotspots or particular risks, and we can understand whether they change over time.”

While the team is currently refining algorithms and inputting clinical data manually, the hope is that within a year and a half, the system’s learning algorithm will be advanced enough to gather and learn from each new piece of data on its own, in real time, as clinicians document their patients’ care.

Future goals aside, Dr. Stoddard’s AI-powered predictions are already helping save youth from suicide across the state. “We’re using this data and modeling to teach the people who are making policy what’s important to consider when screening for suicidality,” he says.

In a world where AI is sometimes seen as a threat to important aspects of our humanity, this work shows how it can be used as a positive force to implement policies that save lives, fighting back against one of the greatest public health challenges of our time. “It’s not just computers learning — we’re integrating the best of clinical screening, a primary form of suicide prevention, with AI,” Dr. Stoddard says. “We’re helping humans and machines to work together.”