
A Machine Might Just Read Your Mind and Predict If You're Suicidal


A psychology professor says his software can figure out if a person is suicidal. But does it really work?


A few years ago, Marcel Just was trying to figure out how to find real-life applications for his research. Just, a professor of psychology at Carnegie Mellon and director of the Center for Cognitive Brain Imaging, had spent the majority of the past decade teaching computer programs how to identify thoughts. He had found, with the help of a functional magnetic resonance imaging (fMRI) machine, that each emotion we feel has a specific signature in the brain and lights up in uniquely identifiable ways. Just had trained a piece of software to follow these patterns and recognize about 30 concepts and emotions.

"We asked whether we could identify what a person was thinking from the machine learning patterns," Just explained. "The machine learning data was figured out with various kinds of concepts; eventually it learned how to map between patterns and concepts."

"From that research," he added, "we realized it was possible to tell what a person was thinking."


In other words, Just had created artificial intelligence that could read your thoughts.

Around this time, Just spoke at the University of Pittsburgh's medical school. David Brent, a professor of psychiatry who specializes in children, approached him.

"Do you think this could be used to identify suicidal thoughts?" Brent asked.

It hit Just then: What if artificial intelligence could predict what a suicidal person was thinking? And maybe even help prevent a suicide?

Earlier this week, Just, Brent, and a few other colleagues published a landmark paper in the journal Nature Human Behaviour reporting that artificial intelligence could identify, with an astonishing 91% accuracy, whether a person was considering suicide.

The experiment is remarkable for more than one reason. There's, of course, the idea of using a machine trained on neural patterns to identify people who might be considering suicide. It's a subset of the population that is typically difficult to pinpoint and help before it's too late: intervention usually depends on suicidal people telling someone of their intent, and on that confidant actually acting on the disclosure to get them help.

"From that research, we realized it was possible to tell what a person was thinking."

Just and Brent recruited 34 individuals from local clinics: 17 who had reported having suicidal thoughts, and 17 who hadn't. (The researchers used the Adult Suicidal Ideation Questionnaire, which measures suicidal ideation, to sort the two groups.) The research team put the 34 individuals through an fMRI machine and had them think about words representing different "stimulus" concepts, ranging from positive ones (praise, bliss, carefree, kindness) to negative ones (terrible, cruelty, evil) to suicide-related ones (fatal, funeral, death).

The last word, death, produced the most telling brain signatures in Just's study. In those who were suicidal, it lit up a spot of angry crimson at the front of the brain; control participants, meanwhile, showed only specks of red amid a sea of blue. "These people who are suicidal had more sadness in their representation of death, and more shame as well," he said.

Machine learning of neural representations of suicide and emotion concepts identifies suicidal youth

So Just and Brent set to work teaching a machine which concepts were most associated with suicide and which weren't. "If you trained the machine on 10 people in their anger signature, and put the 11th person on the scanner, it should be able to tell if the 11th person is angry or not," Just said of how the machine was put to the test. The machine then had to figure out whether each held-out person was suicidal or not.
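
The procedure Just describes is what machine-learning researchers call leave-one-out cross-validation: train on every participant but one, test on the one held out, and repeat until everyone has had a turn as the test case. Here's a minimal sketch of that loop, assuming synthetic stand-in data and scikit-learn's stock tools; the study's real inputs were fMRI voxel activation patterns, and its actual classifier may have differed from the simple one used here.

```python
# Minimal sketch of leave-one-out cross-validation: train on all
# subjects but one, test on the held-out subject, repeat for everyone.
# The data below are synthetic stand-ins for fMRI voxel patterns.
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(0)
n_subjects, n_voxels = 34, 200                # 34 participants, toy voxel count
X = rng.normal(size=(n_subjects, n_voxels))   # one activation pattern per subject
y = np.array([1] * 17 + [0] * 17)             # 1 = suicidal-ideation group, 0 = control

correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = GaussianNB().fit(X[train_idx], y[train_idx])             # train on 33 subjects
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])  # test on the 34th

# With random data this hovers near 50% (chance); the study reported 91%
# on real scans, i.e. about 31 of 34 participants classified correctly.
print(f"Leave-one-out accuracy: {correct / n_subjects:.0%}")
```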

The results are strong, even if the sample size is relatively small: after running all 34 people through this procedure, Just and his research team were able to say with a 91% success rate which of the individuals had displayed suicidal thoughts, correctly classifying about 31 of the 34.

That's, in short, amazing. But it's not perfect. What about the other 9%, the roughly three participants the machine got wrong? "It's a good question," he said of the gap. "There seems to be an emotional difference [we don't understand]" that the group hopes to test in future iterations of the study.

Another problem with the study as it stands? The fact that it relies on an fMRI machine in the first place. "Nobody used machine learning in the early days," Just said. "This [artificial intelligence] uses multiple volume elements, 'voxels,' to figure out neural representation." If that sounds expensive, it is. And that expense makes any potential therapy more difficult to access, a criticism of brain-scanning studies that Ed Yong covered at The Atlantic: "When scientists use medical scanners to repeatedly peer at the shapes and activities of the human brain, those brains tend to belong to wealthy and well-educated people. And unless researchers take steps to correct for that bias, what we get is an understanding of the brain that's incomplete, skewed, and ... well ... a little weird."

Is this a universal way to identify suicidal individuals? We're not sure, and we should be cautious about calling these results conclusive.

A subtler nuance of the study deserves note: the very real possibility that artificial intelligence can do something that strongly resembles reading your mind. We like to conceive of thoughts as amorphous, unique to our own headspace and experience. What might tickle one person's fancy might not do the same for another; what brings one individual shame won't bother someone else. But core feelings like happiness, shame, and sadness seem to look almost identical from person to person.

Does this mean we can potentially eradicate suicide, though? Just is hesitant to make that leap, though he thinks this is a huge step toward understanding what he terms "thought disorders." "We can look at the neural signature and see how it's changed," he said, "see what this person is thinking, whether it's unusual." Most realistically, Just thinks, this is a first step toward developing a targeted therapy for suicidal individuals: if we know the specific thought processes that are symptomatic of suicidal ideation, we can learn how to spot at-risk individuals and help them.

"This isn't a wild pie in the sky idea," Just said. "We can use machine learning to figure out the physicality of thought. We can help people.”
