Report: Facebook shares rules for censoring violence, sex

A picture illustration shows a Facebook logo reflected in a person's eye, March 13, 2015. (Reuters file photo)

LONDON -- Facebook Inc has created a rule book for moderators to use when censoring the posts of its nearly 2 billion users, responding to global criticism that it has failed to prevent the circulation of images of violence, sex, hate speech and other controversial material, The Guardian reported.

In one widely reported case, on April 24 a Thai man posted videos of himself hanging his 11-month-old daughter from the rooftop of an abandoned building in Phuket, and the footage was not removed for 24 hours.

Facebook relies on thousands of human moderators to review potentially offensive posts, including videos of death, violence, sexual material, abuse and threatening speech. The Guardian said it obtained copies of thousands of slides and pictures that Facebook shared with moderators last year as guidelines. According to the newspaper, many moderators feel overwhelmed by the volume of posts to be reviewed and confused by apparent contradictions in Facebook's policies.

The moderators have about 10 seconds to decide whether to remove material from the site, according to The Guardian.

The report said that Facebook’s policies include the following guidelines:

  • Videos of violent death may be allowed if they are used to raise awareness of issues such as mental health.
  • Images of child abuse are removed if they are shared with "sadism and celebration"; otherwise, they can remain on the site marked as "disturbing".
  • Images of animal abuse are allowed but may need to be classified as "disturbing".
  • Violent threats against political figures such as US President Donald Trump, or against members of religious groups, are to be removed, but less specific language, such as "let's beat up fat kids" or "kick a person with red hair", can remain on Facebook because it is not considered credible.

Facebook told The Guardian that it is difficult to reach consensus on standards for a service with nearly 2 billion users, because people have different views on what is appropriate to share.

Earlier this month, Facebook said it was hiring an additional 3,000 people to monitor images on the site. The move came after the company faced criticism when a murder and a suicide were broadcast on the social network.
