Facebook Claims It Is Doing More To Stop Self-Harm And Suicide
To mark World Suicide Prevention Day, Facebook shared three additional measures it is taking to prevent self-harm and suicide. Beyond the changes it made last year, the company says it is adding a health and well-being expert to its safety policy team. Facebook also plans to share CrowdTangle, its social media monitoring tool, with select academic researchers who will study how Facebook and Instagram can further improve suicide prevention. And the company is adding Orygen’s #chatsafe guidelines to Facebook’s Safety Center and to the Instagram resources shown when someone searches for self-injury or suicide content.
Facebook has been working on suicide prevention measures for quite some time. In 2017, it rolled out AI-powered suicide prevention tools. In 2018, Instagram began hiding self-harm images behind sensitivity screens. Instagram also keeps self-harm material out of the Explore section, and it has taken steps to ban content that might promote eating disorders.
According to Facebook, one of the best ways to prevent suicide is for people to hear from family and friends who care about them. “Facebook has an exclusive role in facilitating those types of links,” the company said. Unfortunately, the platform can also become a source of negativity, and measures like these are essential to protect at-risk users.
On a related note, Facebook confirmed in August that it is hiring a small team of journalists to select the top stories for its forthcoming News Tab section. Reports have since revealed more about the rules those editors must follow. According to the reports, Facebook sent staff an internal memo with guidelines for curators, including prioritizing stories with on-the-record sources over those relying on anonymous ones. The team will also give priority to the media outlet that reports a given story first.