
Facebook artificial intelligence to help spot suicidal users

Facebook has unveiled new tools that use artificial intelligence to identify members who may be at risk of taking their own lives and to help prevent suicides.

While the site already has self-harm prevention features, they rely on users to spot and report friends’ problematic posts. Now, the company is testing AI tech that can detect comments that are “likely to include thoughts of suicide.” Flagged comments can then be reviewed by the company’s Community Operations teams, opening up a new way for troubled users to get help.

Facebook’s AI is also making it easier for users to help friends in trouble. Using pattern recognition, it will scan posts and, where warranted, make “suicide or self injury” reporting options more prominent. For now, however, the AI detection and the reporting options, whether surfaced to friends or to Facebook employees, are running as a “limited test” in the US.
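Facebook hasn’t published the details of its classifier, but the general idea described here, scoring a post’s text for risk-related language and routing high-scoring posts to reviewers or more prominent reporting options, can be sketched in a few lines. The snippet below is purely illustrative: the training phrases, the model choice and the 0.5 threshold are assumptions for demonstration, not Facebook’s actual system.

```python
# Hypothetical sketch of text-based risk flagging (NOT Facebook's system).
# Training phrases, model choice and threshold are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set: 1 = concerning, 0 = not concerning.
train_texts = [
    "I can't go on anymore, nothing matters",
    "I just want it all to end",
    "Had a great day at the beach with friends",
    "Excited about the new job starting Monday",
]
train_labels = [1, 1, 0, 0]

# Bag-of-words features plus logistic regression: a simple pattern-recognition
# baseline for deciding whether a post should be surfaced for human review.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(train_texts, train_labels)

def flag_for_review(post: str, threshold: float = 0.5) -> bool:
    """Return True if the post should be routed to a human review queue."""
    risk_score = model.predict_proba([post])[0][1]
    return risk_score >= threshold

print(flag_for_review("I don't think I can keep going"))
```

In practice a production system would use far richer models and training data, but the flow of classify, threshold, then hand off to human reviewers is the same pattern the article describes.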

It marks the first use of AI technology to review messages on the network since founder Mark Zuckerberg announced last month that he also hoped to use algorithms to identify posts by terrorists, among other concerning content.

Facebook has also created new Messenger tools in collaboration with the Crisis Text Line, the National Eating Disorder Association, the National Suicide Prevention Lifeline and other organizations. That’ll help at-risk users or concerned friends contact knowledgeable groups over chat either directly from the organization’s page or via Facebook’s suicide prevention tools. The Messenger program is also in the testing phases, but Facebook will expand it “over the next several months” so that organizations can ramp up to increased message volumes.

Finally, the social network has integrated suicide prevention tools into Facebook Live. If users see a troubling livestream, they can reach out directly to the person and report it to Facebook at the same time. Facebook will “also provide resources to the person reporting the live video to assist them in helping their friend,” the company wrote. Meanwhile, the person sharing the video will see resources that let them reach out to a friend, contact a help line or see tips.

“Some might say we should cut off the livestream, but what we’ve learned is cutting off the stream too early could remove the opportunity for that person to receive help,” said Facebook researcher Jennifer Guadagno.
