Facebook is doing its part to help prevent suicides by introducing more tools that let users intervene with a friend who may be contemplating taking their own life. Since 2011, the social media giant has allowed users to report potentially suicidal posts, but only by sending Facebook a screenshot or direct link to the post in question. Now the process is more streamlined, with the ability to report a post directly.
Entrepreneur reports that once a post is reported, users are given three options: directly message the potentially suicidal person, contact another friend for support, or get professional help through a suicide hotline.
Each reported post will be screened by trained staff. If the post is verified as a threat, Facebook will contact the user directly and suggest help from the National Suicide Prevention Lifeline.
Facebook posted a blog article about the new policy changes, in which it explains how flagged posts will be handled:
“We have teams working around the world, 24/7, who review any report that comes in. They prioritize the most serious reports, like self-injury, and send help and resources to those in distress.”
The initiative has been called into question by many users, who say it raises ethical concerns and the risk of oversight issues for Facebook.
The social media company has been known to manipulate content to affect users’ attitudes, so handling suicide strikes some as treacherous waters.
There is also the inevitable chance of mistakes. Even with professionally trained staff, some posts will be overlooked and others taken too seriously.