Leak reveals Facebook’s rules for controversial content

Thousands of pages of internal documents from Facebook have been leaked, revealing the rules and regulations the social media giant uses to decide what can be shared on its platform.

Among the rules detailed in documents obtained by the Guardian are those covering nudity, violence and threats – all things that Facebook has been accused of letting slide in the past.

A threat to kill the US President would be deleted, but similar remarks against an ordinary person would not be viewed as credible unless further comments showed signs of a plot.

“[Facebook] is not a traditional technology company. It’s not a traditional media company. We build technology, and we feel responsible for how it’s used,” Monika Bickert, Facebook’s head of global policy management, told the newspaper.

Perhaps the most contentious issue is that of violence. Facebook has come under fire in recent months for allowing videos of rape and suicide – often involving minors – to be shared on the site.

In one instance, a 12-year-old girl live-streamed her suicide on Live.me. YouTube took the video down almost immediately, but it took administrators two weeks to remove it from Facebook, where it received thousands of shares and comments.

In another instance, Facebook appeared to react too soon, removing live-streamed footage of the police shooting of Philando Castile. The firm later called the removal a “technical glitch”.

The two episodes illustrate the tension Facebook faces between shielding viewers from potentially disturbing content and censoring material that may prove useful.

To combat the issue, the social media site has adopted a range of guidelines: videos depicting self-harm are allowed, as long as there is an “opportunity to help the person”.

Videos of suicide, however, are never allowed.

Footage of child and animal abuse (as long as it is non-sexual) can remain, in an effort to raise awareness and possibly help those affected.

“We have a really diverse global community and people are going to have very different ideas about what is okay to share. No matter where you draw the line there are always going to be some grey areas,” Ms Bickert said.

Aside from footage of actual violence, Facebook must also decide how to respond to threats of it – what it calls “credible threats of violence”.

The social media site has an entire rulebook for what is considered “credible” and what is not.

Statements like “someone shoot Trump” will be deleted by the website, but comments like “let’s go beat up fat kids” or “I hope someone kills you” will not.

A leaked Facebook document states that violent threats are “most often not credible” until specific statements make it clear that the threat is “no longer simply an expression of emotion but a transition to a plot or design”.

Less dangerous, but just as controversial, are Facebook’s rules regarding nudity.

The site has sparked outrage in the past by deleting photographs of users’ mastectomies, and even re-posts of iconic Vietnam War photos, for “violating community guidelines”.

The site now makes allowances for “newsworthy exceptions”, like the famous Vietnam War photo of a naked young girl hit by napalm, and for “handmade art”. Digitally made art showing sexually explicit content is not allowed.

The publication of the rules is sure to spark concern from privacy and free speech advocates alike. But some within Facebook say it’s not the content guidelines that are the problem, but the ever-growing number of content creators.

“Facebook cannot keep control of its content,” one source told The Guardian. “It has grown too big, too quickly.”

Source: The Independent