It looks like Facebook has decided to make itself useful this week by launching a tool to help its users rather than exploit them. In a world where teen suicide has nearly become an epidemic, Facebook has launched a feature that allows users to anonymously report a friend's potentially suicidal behaviour.

Any piece of content that seems like it may be a sign of depression or suicidal thoughts can be reported by selecting "suicidal content" from the harmful behaviour option. Facebook will then email the distressed user a link and phone number to chat with a representative from the National Suicide Prevention Lifeline. The idea is to give those in need a chance to get help when they are too afraid to seek it out themselves.
The only fear I have is that it will end up being used improperly. Now that quoting depressing song lyrics has become a regular occurrence among heartbroken teenagers, will those posts be reported or left alone? Will other teenagers have the wherewithal to detect suicidal behaviour in their friends? Will school guidance counsellors teach students how to use the tool properly? These are questions that need to be answered if Facebook users are to use this tool effectively. Let's hope they will be.
[VIA]