The Guardian recently obtained Facebook's detailed content moderation guidelines and published them. The documents cover the most controversial issues, such as violence, hate speech, and self-harm, and were drawn from more than 100 internal training manuals, spreadsheets, and flowcharts seen by the newspaper.
The documents shed light on Facebook's methods and stance when dealing with disturbing or sensitive content. The guidelines show that Facebook tries to strike a balance between taking down controversial content and supporting "free speech", which explains why the platform so often finds itself embroiled in moderation controversies.
For example, the internal guidelines show that Facebook allows pictures of non-sexual child abuse to be shared so that the child can be identified and rescued, but not if they are posted with sadism or celebration. Similarly, it allows images of animal cruelty to be shared to spread awareness, but takes them down if the content is "extremely upsetting", such as footage of mutilation or of an animal being burned alive.
In one of its stranger policies, Facebook is also comfortable with people live-streaming suicide attempts, because it "doesn't want to censor or punish people in distress who are attempting suicide." The guidelines also state that "experts have told us what's best for these people's safety is to let them live-stream as long as they are engaging with viewers."
The social media platform takes a similar approach with videos of violent deaths, reasoning that they can raise awareness. Meanwhile, certain violent written statements — including some advocating violence against women — are allowed to stand, because Facebook's guidelines require what it deems "credible calls for action" before violent statements are removed, according to the report.
Other details from the guidelines show that anyone with more than 100,000 followers is considered a public figure.
Facebook also changed its policy on nudity following the outcry over its decision to remove an iconic Vietnam War photograph depicting a naked child screaming. It now allows "newsworthy exceptions" under its "terror of war" guidelines. (Images of child nudity in the context of the Holocaust, however, are not allowed on the site.)
Although moderators have this set of pre-defined guidelines to follow, the Guardian reported several instances where they are left in limbo, unsure what action to take on content that falls into overlapping or ambiguous categories.