Vanguard

Facebook works with its users to prevent suicide

How a new reporting function may save lives

Every day, approximately 100 Americans end their lives by suicide. According to statistics gathered by the American Foundation for Suicide Prevention, over 36,000 people in the United States take their own lives each year.

A new tool recently implemented on Facebook will hopefully help prevent some of these deaths. With over 800 million users on Facebook, this new application could prove to be a true lifesaver.

Over the past year, Facebook has made several new improvements to its website in order to provide more thorough care and protection to its users.

Previously, users only had the option to block postings or report them as spam, but the new system is far more specific.

Users can now select a friend’s posting and report it as “self-harm.” Once a user submits a formal report of a potentially harmful post, it is processed by Facebook employees before further action is taken.

Should the report be deemed accurate, the user who posted the worrisome comment will be contacted promptly by Facebook. The message relayed is one of encouragement to get professional help and it includes information to connect them to Lifeline, the national suicide crisis line.

Though Facebook is taking great steps to ensure a happy and healthy online environment for its users, it is admittedly also part of the problem. Cyber-harassment has become the wave of the future for bullies.

According to the Cyberbullying Research Center, over half the adolescents, teenagers and young adults in the U.S. experience cyberbullying each year. About 20 percent of those are bullied consistently, usually by the same handful of aggressors.

Unfortunately, well over half of those dealing with cyberbullying will never mention it to a parent, guardian or authority figure.

As the most popular social network online, Facebook is unintentionally teaming up with cyberbullies. Facebook can try to provide methods of protection, but every day, slanderous comments will be made that can contribute to breaking down the self-esteem of its users.

Posts can be deleted, but words cannot be unsaid.

Without intending to, Facebook provides an easy open format where bullies can attack their victims without even facing them.

More and more, people are expressing suicidal thoughts through Facebook. The most infamous account to date occurred in September 2010, when Tyler Clementi, a freshman at Rutgers University, posted “Jumping off the gw bridge sorry,” before committing suicide by jumping off the George Washington Bridge.

In November, authorities from Pittsburg, Calif., reported that a man had posted his suicide note publicly on Facebook before he murdered his wife and in-laws, and then took his own life.

Fortunately, Facebook has realized that it is part of the problem, and therefore should contribute to being part of the solution.

The application they have designed works simply. Users submit a report about a friend who may be in danger, and Facebook contacts that person and immediately offers several methods of help while maintaining the anonymity of the friend who submitted the report.

Aside from providing the Lifeline suicide prevention hotline phone number, a new chat function is now available to assist with 24-hour help. It gives troubled users the ability to seek help without even placing a phone call.

Many people who have sought help from Lifeline have mentioned that the thought of a direct phone call can be overwhelming. The live instant message function allows access to help from a professional crisis counselor without the concerns of direct contact, providing a vital tool that has not been seen before.

Facebook had previously considered creating a system that could automatically scan posts for signs of suicidal intent. However, an algorithm to detect such verbal warnings reliably would be hugely inaccurate and practically impossible to formulate.

This new tool is not to be taken lightly and must be used with great discretion.

Suggesting that someone may be contemplating suicide should be carefully considered before submitting a formal warning to administrators. Facebook is trusting that an individual’s close friend and family members will be able to identify serious problems and threats from harmless complaining or jokes.

Suicide is a serious and delicate subject, and the effects of a suicide are far-reaching. It is a tragedy that touches every human being that person is in contact with. Every friend, family member and co-worker in that person’s life feels devastated, confused and deeply hurt by the loss. It is truly impossible to encapsulate the hardships that develop from the intentional loss of life.

Though Facebook is providing this new doorway to help, it is the awareness alone that is most important.

A personal message, phone call or any form of contact from a concerned friend speaks far louder than an automated message. The greatest success is seen in cases that are approached immediately, at the first signs of trouble. The embarrassment of reaching out to someone who seems seriously depressed, only to find the assumption was wrong, is far less detrimental than any harm coming to that person’s life.

Hopefully, if this new application accomplishes anything, it will make all online users more alert to potentially life-threatening posts.

It is necessary that we form an online community that is attentive and concerned for the well-being of others. There is no worse feeling in the world than being left to wonder, “Could I have done something more?”

If we continue to be conscious of our own actions and observant of potential distress signs from our friends, lives could be improved or even saved.
