At 10 public schools in Cincinnati, middle and high school students will have a new app looking out for them this year. When a student from those schools goes to the health clinic for a talk with the staff psychologist, an iPhone app will listen to the conversation and flag those students it considers likely to attempt suicide.
There’s a dire need for tech that can detect young people who need help. Suicide is the second-leading cause of death for people ages 15 to 24, surpassed only by accidents.
The tech, which has been tested in the Cincinnati schools during the past two years, comes from John Pestian, director of the computational medicine lab at Cincinnati Children’s Hospital. “The school psychologist just turns on the app as they’re talking to the kid,” Pestian explains. “The app is ‘listening’ and does its natural language processing in real time. It looks for linguistic and acoustic patterns to classify if the kid is at higher risk of suicide.”
Pestian has used deep learning to create a number of different computer systems that find signals of suicidal intent in audio files from psychologists’ interviews. For example, one large study used machine learning to distinguish between people with suicidal intent, people with other mental illness, and healthy controls. Among other markers, the suicidal people were less likely to use the word “hope,” more likely to sigh, and less likely to laugh.
In a smaller study of 60 adolescents, the machine learning program found that suicidal teens used more first-person pronouns (I, my, mine) and references to themselves, spoke more in the past tense, and used fewer words of assent (agree, okay, yes). The suicidal adolescents also spoke over or interrupted their interviewers more often, took longer pauses between words, and spoke at a higher pitch.
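The linguistic markers described in these studies can be pictured as simple rates computed over an interview transcript. The sketch below is purely illustrative: the word lists, feature names, and the `thought_marker_features` function are stand-ins invented for this example, not Pestian’s actual SAM pipeline, which also incorporates acoustic features and deep learning.

```python
import re

# Illustrative only: toy word lists standing in for the richer lexicons
# a real system would use.
FIRST_PERSON = {"i", "me", "my", "mine", "myself"}
ASSENT = {"agree", "okay", "ok", "yes", "yeah"}

def thought_marker_features(transcript: str) -> dict:
    """Compute per-word rates of a few candidate linguistic markers."""
    words = re.findall(r"[a-z']+", transcript.lower())
    n = len(words) or 1  # avoid division by zero on an empty transcript
    return {
        "first_person_rate": sum(w in FIRST_PERSON for w in words) / n,
        "assent_rate": sum(w in ASSENT for w in words) / n,
        "mentions_hope": "hope" in words,
    }

features = thought_marker_features("I just feel like my life is mine to end.")
```

A classifier trained on labeled interviews would then weigh features like these, alongside acoustic measures such as pause length and pitch, to estimate risk.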
Pestian has taken all these findings and more to make the app that he named Spreading Activation Mobile (SAM), a reference to a method used to search semantic networks.
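Spreading activation, the technique the app’s name references, propagates “energy” from a set of seed concepts outward through weighted links in a semantic network, decaying with distance. The sketch below is a minimal textbook version of that search method; the toy graph, association weights, and decay factor are made up for illustration and have no connection to SAM’s internals.

```python
# Minimal sketch of classic spreading activation over a semantic network.
# Edges carry association strengths in [0, 1]; activation decays as it
# spreads and stops below a threshold.
def spread_activation(graph, seeds, decay=0.5, threshold=0.05):
    """Propagate activation from seed concepts through weighted edges."""
    activation = dict(seeds)   # concept -> current activation level
    frontier = list(seeds)
    while frontier:
        node = frontier.pop()
        for neighbor, weight in graph.get(node, []):
            passed = activation[node] * weight * decay
            # Only spread if the new activation is meaningful and stronger
            # than what the neighbor already has.
            if passed > threshold and passed > activation.get(neighbor, 0.0):
                activation[neighbor] = passed
                frontier.append(neighbor)
    return activation

# Toy semantic network (illustrative concepts and weights).
graph = {
    "hopeless": [("despair", 0.9), ("future", 0.4)],
    "despair": [("isolation", 0.8)],
}
result = spread_activation(graph, {"hopeless": 1.0})
```

Starting from “hopeless,” activation reaches “despair,” then the weaker, more distant “isolation,” which is the sense in which the method “searches” a semantic network.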
Just as doctors look for biomarkers as measurable indicators of something happening in the body, Pestian says mental health professionals can benefit from identifying “thought markers” that provide insight into the mind. He’s still conducting research to determine the most useful thought markers, though. “The words account for the largest amount of variation,” says Pestian. “I’m still scratching my head on the importance of the acoustic markers.”
In the Cincinnati schools, the app won’t be considered a stand-alone tool. Pestian’s team has worked with the schools’ psychologists to ensure that a medical management plan is in place for kids who are flagged as at-risk, which can include communication with the parents or connection to a crisis hotline. “You can’t just identify these risks with the technology and then walk away,” Pestian says. “There’s a moral responsibility: If you identify someone in crisis, you have to take care of them.”
Pestian has also experimented with using video to assess suicide risk, and recently published a paper on the facial behavior indicators of suicide risk. The strongest indicator: Suicidal patients had fewer “Duchenne smiles,” also known as genuine smiles, which engage the muscles around the eyes. But the process used to collect that video was intrusive. “The clinicians don’t want to have a video camera in a patient’s face,” he says, because it can make the patient uncomfortable or self-conscious.
Pestian hopes the SAM app will prompt others in his field to develop other digital mental health tools. “If you walk into an ER, you see all kinds of amazing tech that tells the doctors everything that’s going on inside the body. If you walk into a psychiatrist’s office, you see a couch,” he says. “But it’s not the psychiatrists’ fault—folks like me have to make the tools for them to use.”