SAN FRANCISCO — Facebook added suicide prevention tools to its live video service Wednesday in hopes of bringing real-time help to users who may harm themselves while broadcasting on the social media platform.
The new Facebook Live tools will allow users to report dangerous behavior and suicidal comments directly to Facebook, which will then intervene with help.
The social media giant also rolled out new suicide prevention tools for its messaging app, as well as artificial intelligence (AI)-assisted reporting for posts.
“Facebook is in a unique position — through friendships on the site — to help connect a person in distress with people who can support them,” the California-based company said in a statement announcing the updates.
If a Facebook user recording live video makes a suicidal comment, suggests self-harm or does anything else that might make their audience worry, viewers will be able to flag the video.
From there, Facebook will send resources to the user recording the video, including the options to contact a help line, reach out to a friend or read materials about suicide prevention.
Facebook users were already able to report troubling posts — a process that just got easier thanks to automation. The company will use AI to identify posts that might contain suicidal messages, then make the option to flag them as such more prominent.
The same AI system can read and report posts on its own, allowing Facebook staffers to reach out to users who might need help, even if a human hasn’t reported the content.
Additionally, Facebook is working with several suicide prevention organizations to provide more help through its messaging service, letting users message those organizations directly for help as an alternative to phone-based hotlines and text lines.
At least seven social media suicides
The Facebook Live suicide prevention update comes after several users have taken their lives while streaming video on the site.
The option to broadcast live video rolled out to all users in April. The service, which lets users record themselves and their surroundings in real time, has been used to stream everything from high school football games to the aftermath of police-involved shootings.
Dan Reidenberg, the executive director of the suicide prevention site Save.org, said there have been at least seven cases of live-streamed suicide on Facebook and other social media sites since their inception.
The cases broadcast on Facebook Live include the October death of a Turkish man who shot himself after he and his girlfriend broke up; a 14-year-old Florida girl who hanged herself in front of about 1,000 viewers on Jan. 22, some of whom bullied her and mocked her as the video rolled; and, a day later, an aspiring actor who shot himself in California.
Ire and support
After the series of Facebook Live suicides, critics blasted the company, alleging a wide array of failures.
After the Florida teenager hanged herself, her school district superintendent at least partially blamed Facebook for her death, saying that the company could have done a better job policing for cyberbullying before and during her suicide.
Other advocates have questioned why it’s taken Facebook so long to remove graphic videos from the site — in one case, the footage of a 12-year-old Georgia girl’s death remained online for two weeks.
Allowing Facebook users to broadcast suicide live online — and allowing others to keep watching the footage for days or weeks after — could potentially encourage copycat acts, experts warned.
But Facebook said the decision to remove a video isn’t so black and white. Live streams of people threatening suicide, but not acting on those threats, are especially complex.
“Some might say we should cut off the live stream, but what we’ve learned is cutting off the stream too early could remove the opportunity for that person to receive help,” Facebook researcher Jennifer Guadagno told TechCrunch.
That’s where the beefed-up reporting tools come in. Suicide prevention advocates have so far lauded the updates.
“Facebook’s approach is unique. Their tools enable their community members to actively care, provide support, and report concerns when necessary,” said John Draper, the director of the National Suicide Prevention Lifeline.