
Facebook pledges to improve oversight of contractor firms amid rising criticism


The Verge detailed harrowing work conditions at Facebook contractor Cognizant


Illustration by Alex Castro / The Verge

Facebook says it’s working on an improved compliance and audit process for the third-party contractors it hires to perform content moderation, following an exposé by The Verge detailing harrowing working conditions at the Phoenix, Arizona facility of a Facebook contractor called Cognizant.

“We are putting in place a rigorous and regular compliance and audit process for all of our outsourced partners to ensure they are complying with the contracts and care we expect,” writes Justin Osofsky, Facebook’s vice president of global operations, in a message to employees over the weekend, later published to the company’s website. “This will include even more regular and comprehensive focus groups with vendor employees than we do today.”

The Verge’s Casey Newton found that some employees of Cognizant — one of many companies around the globe that provide the human labor behind the social network’s moderation process — worked in a chaotic environment and often experienced severe trauma from looking at hate speech and violent media all day long.

The initial report found that contractors often suffer from post-traumatic stress disorder and sometimes come to adopt the fringe and conspiratorial beliefs pushed in the posts they moderate. The same contractors deal with strictly managed breaks and unrealistic performance benchmarks, and many burn out because of the emotional and mental toll the job takes on their personal lives. Some fear that disgruntled former employees will return to the office seeking violent retribution, and Cognizant grants moderators just nine minutes per day of “wellness time” to step away from their screens if they feel particularly traumatized after watching videos of murder or sexual exploitation.

Cognizant contractors are paid just $15 an hour, for an annual salary of $28,800, while the median Facebook full-time employee earns more than $240,000 a year in total compensation.

A subsequent Bloomberg report published this morning confirms that many of these issues are not isolated to Cognizant, but endemic to the US moderation industry. (Numerous publications — including Wired, The Guardian, and Motherboard — have also reported on Facebook moderation working conditions overseas.) After hearing that contractors at Accenture, another moderation firm, were not allowed to leave the building or answer personal phone calls during the workday, Facebook’s own employees began expressing concern on internal message boards, Bloomberg reports.

Facebook doesn’t say it’s doing anything drastically different in response to these findings. The auditing systems Osofsky describes are aimed at preventing harsh working conditions, but they were largely in place before The Verge’s report on Cognizant, and there’s no indication conditions will meaningfully improve in response to the news. Facebook has also made clear it will continue to rely on tens of thousands of contractors at firms like Accenture and Cognizant.

Moderation contractors make just $15 an hour and are subject to severe trauma

Managers at Facebook reportedly feel contract labor is the only way to properly screen the volume of user-uploaded content posted every day in multiple languages around the world, at least until the company can deploy more sophisticated artificial intelligence. According to Bloomberg’s Sarah Frier, the reliance on contract labor may also stem in part from legal risk: full-time employees would be better positioned to sue Facebook over psychological trauma suffered on the job.

Osofsky does say Facebook is also planning to standardize contracts with third-party firms to “ensure consistent terms of care and resiliency programming across all areas of business.” The company also plans to host a “partner summit” in April to discuss mental wellness and other negative aspects of the moderation job that have garnered bad press for Facebook.

“We will reinforce our expectations for our partners, provide in-depth sessions on wellness and resiliency, quality, and training, and address some of the recent concerns and reinforce our standards on wellness and support,” Osofsky writes.

In one particularly telling part of The Verge’s investigation, content moderators complained that Facebook forced them to receive policy updates through its internal, algorithmically controlled Workplace tool, even during high-stress breaking news moments. This resulted in conflicting messages from managers. Facebook now says it will refine the way it communicates policy updates and training materials, and will look into ways to “improve communication channels particularly with sites managed by partners.”