Facebook has enabled users to report suicidal content for some time, but as of this week the social network is stepping up its suicide prevention efforts by partnering with the Substance Abuse and Mental Health Services Administration and the National Suicide Prevention Lifeline. Facebook has 800 million users, and thousands of them have reported posts from friends containing references to “self-harm” or “suicidal content.” As the site has grown, and a handful of suicides appear to have been foreshadowed by status updates, Facebook is taking things a step further.

From CBS News:

When reports like these come through, Facebook’s safety team reviews the content and sends it to Lifeline. Facebook sends an email to the user who’s been reported, which includes Lifeline’s phone number and a link to start a confidential chat session. After that, it’s up to the recipient to respond. Facebook also sends whoever filed the report an email to let them know the site has responded. The social network realizes the system isn’t foolproof for emergencies, so it encourages users to call law enforcement if harmful behavior appears imminent.

The new service, available 24 hours a day, will also let users report suicidal behavior even when it isn’t connected to a post on the site. However, as with anything else communicated through Facebook, information shared through this service is not protected by medical privacy laws and is ultimately public.

Is Facebook taking a step in the right direction? Or is preventing suicide through social networking a lost cause?
