Facebook is rolling out new artificial intelligence tools and algorithms to better spot users at risk of suicide and connect them with help.
Facebook is rolling out a new set of tools designed to improve its suicide prevention efforts, the company announced in a blog post on Wednesday. The tools will help users in real time on Facebook Live, provide increased support through Messenger, and use artificial intelligence (AI) to assist in reporting posts from at-risk users, the post said.
The social media giant has offered suicide prevention tools for roughly a decade, built in partnership with Save.org, the National Suicide Prevention Lifeline, Forefront, and the Crisis Text Line, the post said. The new features aim to provide even more resources to users in distress.
The move comes after a series of suicides were broadcast on the social media site. In January 2017, a Miami teen committed suicide while live-streaming video on Facebook Live. The incident happened only a few days after a Los Angeles man broadcast his suicide as well.
"There is one death by suicide in the world every 40 seconds, and suicide is the second leading cause of death for 15-29 year olds," the post said. "Experts say that one of the best ways to prevent suicide is for those in distress to hear from people who care about them."
These new tools are designed to do just that—connect Facebook users who may be considering self-harm or suicide with their friends or a professional. Facebook has long allowed users to report concerning posts, but now it will also be able to flag such posts on its own, without waiting for a report.
The reporting process will be improved, the post said, by using AI and pattern recognition to identify posts that share characteristics with posts previously reported for suicide. This will make the option to report posts about "suicide or self injury" more prominent to other users who come across them. Facebook will also use AI to "identify posts as very likely to include thoughts of suicide," the post said, and reach out to the poster with resources and help, even if no other user has reported the post.
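Facebook has not published how its system works, but the general idea—scoring a new post against patterns seen in previously reported posts—can be illustrated with a deliberately simple sketch. Everything below (the example phrases, the scoring function, the threshold, and the function names) is invented for illustration and bears no relation to Facebook's actual models, which would rely on trained classifiers rather than word overlap.

```python
# Toy illustration only: flag posts whose wording overlaps with a
# (hypothetical) corpus of previously reported posts. Not Facebook's system.
from collections import Counter

# Hypothetical examples standing in for previously reported posts.
REPORTED_EXAMPLES = [
    "i can't go on anymore",
    "nobody would miss me",
    "i want to end it all",
]

def _tokens(text):
    """Lowercase the text and strip basic punctuation from each word."""
    return [w.strip(".,!?").lower() for w in text.split()]

# Frequency profile of words seen across the reported examples.
REPORTED_PROFILE = Counter(w for p in REPORTED_EXAMPLES for w in _tokens(p))

def risk_score(post):
    """Fraction of the post's words that also appear in reported examples."""
    words = _tokens(post)
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in REPORTED_PROFILE)
    return hits / len(words)

def should_surface_report_option(post, threshold=0.5):
    # A production system would use trained language models and many more
    # signals; a fixed overlap threshold is purely for demonstration.
    return risk_score(post) >= threshold
```

In this sketch, a post like "I want to end it all" scores high and would surface the report option, while unrelated text scores near zero; a real system would instead learn these patterns from labeled data.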
The same AI tools built for posts will also work in Facebook Live, allowing viewers of a live video to contact the broadcaster directly or report the video, the post said. The person sharing the video will also see resources on screen, including the option to contact a friend or a help line.
In Facebook Messenger, users will be able to directly connect with crisis support partners such as the Crisis Text Line, the National Eating Disorder Association, and the National Suicide Prevention Lifeline in real time.
This kind of announcement shows the positive side of AI and its potential to help people. Beyond supporting individuals in distress, AI tools like natural language processing and sentiment analysis could also be applied to customer service, human resources, and employee training.
The 3 big takeaways for TechRepublic readers
- Facebook is rolling out new tools, powered by AI, that will help detect potentially suicidal users and get them connected to help.
- The AI features will apply to both Facebook posts and Facebook Live, and users on Facebook Messenger will be able to directly connect with professional organizations such as the National Suicide Prevention Lifeline in real time.
- This announcement points to the positive applications of AI, which could also improve other aspects of daily life and work.