Artificial intelligence is renowned for its pattern recognition capabilities. That is why Google’s video analysis platform can interpret text, images, and sound and classify them accordingly. It is also what allows AIs to predict heart attacks better than doctors and recommend a new place to eat based on your tastes or a new connection you might know on social media.

It is via pattern recognition that AIs are so skillful at prediction, often well beyond human capabilities.

While many professionals still fear humanity's demise at the hands of artificial intelligence, right now AI is saving lives by helping to predict and prevent suicide.

In America alone, over 44,000 people die by suicide each year, and for each of those deaths, an estimated 25 more people attempt to take their own life.

Among the most prominent companies using AI to lower suicide rates is Facebook.

After an increasing number of suicides on Facebook Live video streams, the company implemented a way for viewers to alert Facebook that the person on camera was in danger. When a user reports that a post or video may indicate a suicide risk, Facebook offers mental health resources to the person in need. For live video, Facebook presents options such as reaching out to a friend or to the National Suicide Prevention Lifeline while the person is still on air.

By immediately providing the endangered person with help, Facebook hopes to decrease the number of suicides through early intervention.

But Facebook is going a step further: once an at-risk person has been identified, it aggregates the surrounding data and uses it to train an artificial intelligence on pattern recognition, so the system can flag other users who may be at risk, or becoming higher risk, for suicide or self-harm.

This new AI algorithm flags phrases commonly associated with suicide, such as posts mentioning pain and suffering and responses from friends asking, "Are you ok?"
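To make the idea concrete, here is a minimal, purely illustrative sketch of phrase-based flagging. This is not Facebook's actual system, which relies on trained machine-learning classifiers over many more signals; the phrase lists and function names below are hypothetical.

```python
# Illustrative toy example of phrase-based risk flagging.
# The phrase lists here are hypothetical, not a clinical resource.
RISK_PHRASES = ["pain", "suffering", "can't go on"]
CONCERN_REPLIES = ["are you ok", "are you okay", "do you need help"]

def flag_post(post_text, replies):
    """Return True if a post and its replies match simple risk patterns."""
    post_hit = any(phrase in post_text.lower() for phrase in RISK_PHRASES)
    reply_hit = any(
        phrase in reply.lower()
        for reply in replies
        for phrase in CONCERN_REPLIES
    )
    # Flag only when both the post and a concerned reply match,
    # mirroring the combined signals described above.
    return post_hit and reply_hit

print(flag_post("So much pain lately", ["Are you OK?"]))  # True
print(flag_post("Great day at the beach", ["Nice!"]))     # False
```

A real system would replace these hard-coded lists with a classifier learned from labeled examples, which is what lets it generalize beyond any fixed set of phrases.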

Facebook is also working with Crisis Text Line, the National Eating Disorders Association, and the National Suicide Prevention Lifeline so that users can receive immediate support in a crisis.

But Facebook isn’t the only one.

According to Wired, an AI trained on data from medical records can predict suicide attempts with 80–90% accuracy up to two years in advance.

Other social media platforms are also doing their due diligence to support those in mental health crises.

Instagram offers a similar tool: if someone spots a post that may suggest a user is at risk of suicide or self-harm, they can report it, and Instagram will send that user a message checking in and offering support and crisis line numbers.

By recognizing these patterns, whether by AI or by humans, we can help prevent suicide.

If you or a friend is in need, you can also contact these numbers:


National Suicide Prevention Lifeline
1-800-273-TALK (8255) (They also offer text chat)

Crisis Text Line
Text NAMI to 741-741

National Sexual Assault Hotline
800-656-HOPE (4673)

National Domestic Violence Hotline
800-799-SAFE (7233)

What can you do to help prevent suicide? Let us know in the comments below!