Former Facebook moderators worried for the upcoming US election

They spoke out about the conditions they faced while working for the company

Photo by Michele Doying / The Verge

When Viana Ferguson was a content moderator for Facebook, she came across a post that she immediately recognized as racist: a photo of a white family with a Black child that had a caption reading “a house is not a home without a pet.” But she had a hard time convincing her manager that the picture was not just an innocent photo of a family.

“She didn’t seem to have the same perspective, there was no reference I could use,” Ferguson said. She pointed out that there was no pet in the photo, but the manager also told her, “Well, there’s also no home in the picture.”

Ferguson said it was one of several examples of the lack of structure and support Facebook moderators face in their day-to-day jobs, the vast majority of which are performed through third-party consultancies. Ferguson spoke on a call organized by a group that calls itself the Real Facebook Oversight Board, along with Color of Change, a progressive nonprofit that led the call for a Facebook advertiser boycott over the summer, and Foxglove, a UK-based technology justice nonprofit.

“Clickbait still rules, lies and hate still travels on Facebook like a California wildfire.”

“In 2020 on the world’s largest social network, clickbait still rules, lies and hate still travels on Facebook like a California wildfire,” said Cori Crider, co-founder of Foxglove. “Things are still so bad that in two days, Mark Zuckerberg will testify once again to the Senate about what Facebook is doing to address this problem and protect American democracy.”

Crider said Facebook points to its massive workforce of content moderators as evidence it takes the issues seriously. “Content moderators are the firefighters on the front lines guarding our elections,” she said. “They’re so critical to Facebook’s work that Facebook has hauled them back into their offices during the pandemic and kept them in the offices.”

The challenges of working as a Facebook moderator, both in the US and overseas, have been well documented. Years of consistent complaints about the toll of viewing traumatic content for hours on end led the company to agree to pay $52 million to current and former US-based moderators to compensate them for mental health issues developed on the job.

Former moderator Alison Trebacz said on the call that she remembered the day after the 2017 mass shooting at Las Vegas’ Mandalay Bay casino, when her work queue was full of videos of injured and dying shooting victims. But to mark a video as “disturbing,” moderators had to verify that a person was completely incapacitated, something that was nearly impossible to do in a timely way. “We end up as moderators and agents trying to make these big decisions on popular content without having full direction and guidance within five minutes of the event happening,” she said.

As part of her job, Trebacz said, she and other moderators regularly had to view graphic content, and she felt mentally drained by the nature of the work. She was paid $15 an hour and said that while she was there, from 2017 to 2018, there was little mental health support. The company used nondisclosure agreements, which barred moderators from talking about their jobs with people outside the company, adding to the overall stress of the work. The moderators are independent contractors, and most don’t receive benefits or sick leave, noted Jade Ogunnaike of Color of Change.

“When companies like Facebook make these grand statements about Black Lives Matter, and that they care about equity and justice, it is in direct contrast to the way that these content moderators and contractors are treated,” Ogunnaike said.

The group wants Facebook to make moderators full-time employees with the same rights as other Facebook staff, and to provide them with adequate training and support. While the company relies on artificial intelligence to help root out violent and problematic content, that approach isn’t sufficient to address more nuanced instances of racism like the one Ferguson described.

“If Facebook wants valuable feedback from the people doing the bulk of the work, they would benefit by bringing them in house.”

But Trebacz pointed out that human moderators aren’t going away; rather, they’re becoming even more important. “If Facebook wants valuable feedback from the people doing the bulk of the work, they would benefit by bringing them in house.”

Ferguson said she saw a sharp uptick in hate speech on Facebook following the 2016 US presidential election. She said the platform was ill-equipped to handle newly emboldened people posting more and more hateful content. If a moderator removed a piece of content later found not to be against Facebook rules, they could be disciplined or even fired, she added.

Trebacz said she hoped Facebook would provide more real-time communication with moderators about content decisions and that more decisions would be made preemptively rather than reactively. But she said she expects the next few weeks will be “outrageously difficult” for current content moderators.

“I think it’s going to be chaos,” she said. “Truly.”

Facebook did not immediately reply to a request for comment Monday. The Wall Street Journal reported Sunday that the company is bracing for possible chaos around next week’s election, with plans to deploy internal tools it has used in at-risk countries. The plans may include slowing the spread of posts as they begin to go viral, altering the News Feed algorithm to change what content users see, and changing the rules for what kind of content should be removed.