Facebook’s confusing hate speech policy detailed in leaked documents

‘Migrants are dirt’ will get removed, but ‘migrants are dirty’ is fine

German newspaper Süddeutsche Zeitung has obtained what it says are internal documents used to guide content moderation on Facebook. Excerpts from the documents, which the paper published on Friday, shed light on how the world’s largest social network defines hate speech and other offensive content — something that Facebook has long been reluctant to disclose. A separate report from SZ, published on Thursday, detailed operations at a Berlin office where more than 600 people work to moderate content on Facebook, earning barely more than Germany’s minimum wage.

Lawmakers in Germany and other European countries have pressured Facebook to more swiftly remove racist and xenophobic content, much of which has been directed at migrants. German authorities have argued that Facebook must curb hateful content at a time of rising anti-migrant sentiment, but critics of the crackdown have warned of “creeping censorship,” raising concerns over how Facebook would define hate speech.

The documents published by SZ provide some insight into the company’s approach. According to the newspaper, Facebook strictly prohibits content that targets a person based on characteristics such as race, national origin, religion, or sexual orientation — factors that the company defines as a “protected category.” The documents also outline sub-categories that receive extra protection, such as youth and senior citizens, and include hundreds of examples meant to cover a range of permutations and contexts.

The documents allow for content that attacks a religion or a country, though attacks on individuals based on religion or nationality are removed. But the line is a bit blurrier for migrants, despite the fact that many who have sought asylum in recent years are from majority-Muslim countries like Syria. From the SZ report:

For instance, saying “fucking Muslims” is not allowed, as religious affiliation is a protected category. However, the sentence “fucking migrants” is allowed, as migrants are only a “quasi protected category” – a special form that was introduced after complaints were made in Germany. This rule states that promoting hate against migrants is allowed under certain circumstances: statements such as “migrants are dirty” are allowed, while “migrants are dirt” isn’t.

A Facebook spokesperson declined to confirm the authenticity of the documents published by SZ this week. “Facebook is no place for the dissemination of hate speech, racism or calls for violence,” the spokesperson said in a statement to The Verge. “We evaluate reported content seriously and do our best to get it right. And as we learn from experts, we continue to refine the way we implement our policies to keep our community safe, especially for people that may be vulnerable or under attack.”

Under pressure from German authorities, Facebook, Twitter, and Google last year agreed to remove hate speech within 24 hours, though a report this month said the tech firms are still failing to comply with similar commitments made to the European Union. German lawmakers said this week that they are considering tougher legislation that would oblige Facebook and other tech companies to rapidly remove both hate speech and fake news.

According to SZ, Facebook has outsourced content moderation in Germany to a company called Arvato, and the work has taken a toll on those responsible for removing hate speech, child pornography, and fake news. Those who spoke to the newspaper said that workers at the bottom of the chain are expected to review 2,000 posts a day, while those at the top have around eight seconds to determine whether a video should be removed. And as previous reports on Facebook moderation have described, the work can be psychologically taxing.

One employee told SZ: “I’ve seen things that made me seriously question my faith in humanity.”