Facebook Wants to Help Prevent Suicide
On Wednesday, Facebook announced it would use artificial intelligence to identify users at risk of suicide and help prevent suicide deaths. The news comes after multiple people livestreamed their deaths on the platform.
To help users at risk of suicide, Facebook created artificial intelligence (AI) that can identify posts suggesting self-harm or suicidal thoughts. According to BuzzFeed News, the AI scans a post and its comments, compares it with past posts that were flagged as requiring attention, and, if uncertain, passes the post along to Facebook's community team for review. Users whose posts are flagged will then be contacted by Facebook and shown a screen of support and mental health resources.
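Facebook has not published how its classifier works, but the flow BuzzFeed News describes can be sketched in a few lines. Everything below is an invented stand-in for illustration: the keyword-based scorer, the confidence cutoffs, and the Post type are assumptions, not Facebook's actual system.

```python
# Illustrative sketch of the reported triage flow; the scorer, thresholds,
# and data types here are invented stand-ins, not Facebook's actual system.
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    comments: list = field(default_factory=list)

# Stand-in for a model trained on previously flagged posts.
RISK_PHRASES = ("want to die", "hurt myself", "end it all")

def risk_score(post):
    """Return a crude 0-1 risk estimate from the post text and its comments."""
    corpus = " ".join([post.text] + list(post.comments)).lower()
    hits = sum(phrase in corpus for phrase in RISK_PHRASES)
    return hits / len(RISK_PHRASES)

# Assumed cutoffs; the real system's confidence thresholds are not public.
CONFIDENT, UNCERTAIN = 0.66, 0.33

def triage(post):
    score = risk_score(post)
    if score >= CONFIDENT:
        return "show support and mental health resources"  # contact the user
    if score >= UNCERTAIN:
        return "escalate to community team"  # uncertain: human review first
    return "no action"

if __name__ == "__main__":
    post = Post("I just want to end it all", comments=["please talk to someone"])
    print(triage(post))  # -> escalate to community team
```

The key design point in the reported flow is that only posts the model is confident about lead directly to outreach; borderline cases go to human reviewers first.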
"The
AI is actually more accurate than the reports that we get from people that are
flagged as suicide and self injury" Facebook Product Manager Vanessa
Callison-Burchold told BuzzFeed News. "The people who have posted that content 'that AI reports' are more likely to be sent resources of support versus people
reporting to us"
Facebook’s latest suicide prevention tools were created with help from the National Suicide Prevention Lifeline and the Mental Health Association of New York City (MHA-NYC).
Facebook will also partner with suicide prevention organizations to offer chat-based suicide hotlines through Facebook Messenger, as well as suicide prevention resources for Facebook Live users.
"It’s
important that community members, whether they’re online or offline, don’t feel
that they are helpless bystanders when dangerous behavior is occurring" John
Draper, Ph.D., Director of the National Suicide Prevention Lifeline, said in a
press release.
This is not the first time Facebook has taken steps to prevent self-harm. In October, Instagram, which is owned by Facebook, released a suicide and self-harm reporting tool. Instagram’s tool was modeled after Facebook’s reporting system, which was released in April 2016.