    How Facebook stems the deluge of pornography, violence, and cruelty

    Gawker has investigated the process behind Facebook's content moderation, which relies on outsourced workers in developing countries to make the final decision on whether flagged images and posts breach the network's abuse standards.

    Ever wondered what happens to an image or post after you've hit the report button on Facebook? An article published by Gawker has tracked down members of the outsourced teams responsible for policing Facebook's "Abuse Standards" guidelines. Unlike Facebook's vague "Community Standards," the document given to moderators is exacting, drawing definitive lines and leaving very little discretion in the hands of the people viewing the content. Facebook's views on sex and violence seem to mirror the FCC's regulation of US broadcast media, where most sexual content is inexcusable but violence and gore, up to a point, get carte blanche.

    Facebook subcontracts its screening to a company named oDesk, which employs roughly 50 people in countries where labor is cheap, including Turkey, the Philippines, Mexico, India, and Morocco. This team manually checks every piece of content flagged by users, earning around $1 an hour, rising to roughly $4 an hour with commission.

    Gawker interviewed a number of moderators (both past and current), including one man who lasted only three weeks before quitting. He told the interviewer that he was forced to see "Pedophilia, Necrophilia, Beheadings, Suicides, etc. I left [because] I value my mental sanity." He said that oDesk had warned him that "the job was not for the light of heart," but he believes he underestimated "just how disturbing it'd be."