Keith Utley loved to help.
First, he served in the Coast Guard, where he rose to the rank of lieutenant commander. He married, had a family, and devoted himself utterly to his two little girls. After he got out of the military, he worked as a moderator for Facebook, where he purged the social network of the worst stuff that its users post on a daily basis: the hate speech, the murders, the child pornography.
Utley worked the overnight shift at a Facebook content moderation site in Tampa, FL, operated by a professional services vendor named Cognizant. The 800 or so workers there face relentless pressure from their bosses to better enforce the social network’s community standards, which receive near-daily updates that leave its contractor workforce in a perpetual state of uncertainty. The Tampa site has routinely failed to meet the 98 percent “accuracy” target set by Facebook. In fact, with a score that has been hovering around 92, it is Facebook’s worst-performing site in North America.
The stress of the job weighed on Utley, according to his former co-workers, who, like all Facebook contractors at the Tampa site, must sign a 14-page nondisclosure agreement.
“The stress they put on him — it’s unworldly,” one of Utley’s managers told me. “I did a lot of coaching. I spent some time talking with him about things he was having issues seeing. And he was always worried about getting fired.”
On the night of March 9th, 2018, Utley slumped over at his desk. Co-workers noticed that he was in distress when he began sliding out of his chair. Two of them began to perform CPR, but no defibrillator was available in the building. A manager called for an ambulance.
The Cognizant site in Tampa is set back from the main road in an office park, and between the dim nighttime lighting and discreet exterior signage, the ambulance appears to have had trouble finding the building. Paramedics arrived 13 minutes after the first call, one worker told me, and when they did, Utley had already begun to turn blue.
Paramedics raced Utley to a hospital. At Cognizant, some employees were distraught — one person told me he passed by one of the site’s designated “tranquility rooms” and found one of his co-workers, a part-time preacher, praying loudly in tongues. Others ignored the commotion entirely, and continued to moderate Facebook posts as the paramedics worked.
Utley was pronounced dead a short while later at the hospital, the victim of a heart attack. Further information about his health history, or the circumstances of his death, could not be learned. He left behind a wife, Joni, and two young daughters. He was 42 years old.
On Monday morning, workers on the day shift were informed that there had been an incident, and they began collecting money to buy a card and send flowers. But some site leaders did not initially tell workers that Utley had died, and instructed managers not to discuss his death, current and former employees told me.
“Everyone at leadership was telling people he was fine — ‘oh, he’ll be okay,’” one co-worker recalled. “They wanted to play it down. I think they were worried about people quitting with the emotional impact it would have.”
But the illusion shattered later that day, when Utley’s father, Ralph, came to the site to gather his belongings. He walked into the building and, according to a co-worker I spoke to, said: “My son died here.”
In February, I wrote about the secret lives of Facebook contractors in America. Since 2016, when the company came under heavy criticism for failing to prevent various abuses of its platform, Facebook has expanded its workforce of people working on safety and security around the world to 30,000. About half of those are content moderators, and the vast majority are contractors hired through a handful of large professional services firms. In 2017, Facebook began opening content moderation sites in American cities including Phoenix, Austin, and Tampa. The goal was to improve the accuracy of moderation decisions by entrusting them to people more familiar with American culture and slang.
Cognizant received a two-year, $200 million contract from Facebook to do the work, according to a former employee familiar with the matter. But in return for policing the boundaries of free expression on one of the internet’s largest platforms, individual contractors in North America make as little as $28,800 a year. They receive two 15-minute breaks and a 30-minute lunch each day, along with nine minutes per day of “wellness” time that they can use when they feel overwhelmed by the emotional toll of the job. After regular exposure to graphic violence and child exploitation, many workers are subsequently diagnosed with post-traumatic stress disorder and related conditions.
My initial report focused on Phoenix, where workers told me that they had begun to embrace fringe views after continuously being exposed to conspiracy theories at work. One brought a gun to work to protect himself against the possibility of a fired employee returning to the office seeking vengeance. Others told me they are haunted by visions of the images and videos they saw during their time on the job.
Conditions at the Phoenix site have not improved significantly since I visited. Last week, some employees were sent home after an infestation of bed bugs was discovered in the office — the second time bed bugs have been found there this year. Employees who contacted me worried that the infestation would spread to their own homes, and said managers told them Cognizant would not pay to clean their homes.
“Bed bugs can be found virtually every place people tend to gather, including the workplace,” Cognizant said in a statement. “No associate at this facility has formally asked the company to treat an infestation in their home. If someone did make such a request, management would work with them to find a solution.”
Facebook executives have maintained that the working conditions described to me by dozens of contractors do not accurately reflect the daily lives of the majority of its workers. But after publishing my story about Phoenix, I received dozens of messages from other contractors around the world, many of whom reported having similar experiences. The largest single group of messages I received came from current and former Facebook contractors in Tampa. Many of them have worked closely with employees at the Phoenix site, and believe working conditions in Florida are even more grim.
In May, I traveled to Florida to meet with these Facebook contractors. This article is based on interviews with 12 current and former moderators and managers at the Tampa site. In most cases, I agreed to use pseudonyms to protect the employees from potential retaliation from Facebook and Cognizant. But for the first time, three former moderators for Facebook in North America agreed to break their nondisclosure agreements and discuss working conditions at the site on the record.
Employees told me that pressure from managers to improve the site's performance has taken a toll on the workforce. Cognizant’s contract with Facebook is coming up for renewal, and with the entire company struggling to hit the 98 percent accuracy target, there are widespread concerns internally that Cognizant will lose Facebook’s business.
Contractors told me that Cognizant had lured them away from less demanding jobs by promising regular schedules, bonuses, and career development, only to renege on all three.
They described a filthy workplace in which they regularly find pubic hair and other bodily waste at their workstations. Employees said managers laugh off or ignore sexual harassment and threats of violence. Two discrimination cases have been filed with the Equal Employment Opportunity Commission since April.
They said marijuana use is so prevalent that the site manager jokingly complained at an all-hands meeting that he had gotten a contact high walking in the door.
More than anything else, the contractors described an environment in which they are never allowed to forget how quickly they can be replaced. It is a place where even Keith Utley, who died working alongside them, would receive no workplace memorial — only a passing mention during team huddles in the days after he passed. “There is no indication that this medical condition was work related,” Cognizant told me in a statement. “Our associate’s colleagues, managers and our client were all saddened by this tragic event.” (The client is Facebook.)
Utley’s family could not be reached for comment. Employees who began working after he died told me they had never heard his name.
“We were bodies in seats,” one former moderator told me. “We were nothing to them — at all.”
Shawn Speagle was 23 and employed at an online education company working with English language learners when he visited a Cognizant job fair. A recruiter there described to him a role in which Speagle would primarily help businesses analyze engagement on their Facebook pages. He might have to do some content moderation, the recruiter said, but Speagle entered the interview believing he was about to embark on a new career in high technology — one that he hoped would eventually lead to a full-time role at Facebook.
Cognizant offered Speagle $15 an hour to do the job full time — a marked improvement over his previous job, which was seasonal. Only after he began training did he realize that the job would not, in fact, involve helping businesses with Facebook marketing. Instead, two weeks after Speagle was put onto the production floor, a manager told him he and a colleague would be reviewing graphic violence and hate speech full time.
“For our associates who opt to work in content moderation, we are transparent about the work they will perform,” a Cognizant spokesman said in response. “They are made aware of the nature of the role before and during the hiring process, and then given extensive and specific training before working on projects.”
But had his managers asked, they would have learned that Speagle had a history of anxiety and depression, and that he might not be well suited to the role. No one did.
“They just said me and [my colleague] were very meticulous and had a lot of promise to move up to the SME position,” Speagle said, referring to the subject matter experts who make $1 more per hour in exchange for answering moderators’ questions about Facebook policy. “They said Facebook is basically shoving all of their graphic violence content to us, that they didn’t want it anymore. So they had to move more people to cover it. And that’s all that we saw, every single day.”
Speagle vividly recalls the first video he saw in his new assignment. Two teenagers spot an iguana on the ground, and one picks it up by the tail. A third teenager films what happens next: the teen holding the iguana begins smashing it onto the street. “They beat the living shit out of this thing,” Speagle told me, as tears welled up in his eyes. “The iguana was screaming and crying. And they didn’t stop until the thing was a bloody pulp.”
Under the policy, the video was allowed to remain on Facebook. A manager told him that by leaving the video online, authorities would be able to catch the perpetrators. But as the weeks went on, the video continued to reappear in his queue, and Speagle realized that police were unlikely to look into the case.
Speagle had volunteered at animal shelters in the past, and watching the iguana die on a regular basis rattled him. “They kept reposting it again and again and again,” he said, pounding the table as he spoke. “It made me so angry. I had to listen to its screams all day.”
Cognizant’s Tampa facility opened in a maze-like office park in the summer of 2017, about two months after the Phoenix facility came online. It operates out of a single-story building next to a pond fed by two storm drains. On most days, an alligator emerges from one of the drains to bask in the sun.
Before the office opened, the company began advertising work on Indeed and other job sites, using opaque titles such as “social media analyst.” Initially, applicants are not told they will be working for Facebook — only a “large social media company.”
Cognizant was not always straightforward with applicants about the nature of the work in Tampa. Marcus*, who worked in management, told me that a recruiter had persuaded him to leave a more conventional job with the promise of a regular schedule, performance bonuses, and a good work-life balance. Once he joined, though, he was made to work nights, and the bonuses never materialized.
Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried. I should quit, he thought to himself, but I know there’s people at the site that need me. He ultimately stayed for a little over a year.
Cognizant calls the part of the building where contractors do their work “the production floor,” and it quickly filled with employees. The minimum wage in Florida is $8.46, and at $15 an hour, the job pays better than most call center work in the area. For many content moderators — Cognizant refers to them by the enigmatic title of “process executive” — it was their first real job.
In its haste to fill the workplace, Cognizant made some odd staffing decisions. Early on, the company hired Gignesh Movalia, a former investment advisor, as a moderator. Cognizant conducts background checks on new hires, but apparently failed even to run a basic web search on Movalia. Had they done so, they would have learned that in 2015 he was sentenced to 18 months in prison for his involvement in a $9 million investment fraud scheme. According to the FBI, Movalia had falsely claimed to have access to shares of a fast-growing technology startup about to begin trading on the public market.
The startup was Facebook.
Movalia was eventually fired, but employees I spoke with believed his tenure exemplified Cognizant’s approach to hiring moderators: find bodies wherever you can, ask as few questions as possible, and get them into a seat on the production floor where they can start working.
The result is a raucous workplace where managers send regular emails to the staff complaining about their behavior on the site. Nearly every person I interviewed independently compared the Tampa office to a high school. Loud altercations, often over workplace romances, regularly take place between co-workers. Verbal and physical fights break out on a monthly basis, employees told me. A dress code was instituted to discourage employees from wearing provocative clothing to work — “This is not a night club,” read an email to all employees obtained by The Verge. Another email warned employees that there had been “numerous incidents of theft” on the property, including stolen food from the office refrigerator, food from vending machines, and employees’ personal items.
Michelle Bennetti and Melynda Johnson both began working at the Tampa site in June 2018. They told me that the daily difficulty of moderating content, combined with a chaotic office environment, made life miserable.
“At first it didn’t bother me — but after a while, it started taking a toll,” Bennetti told me. “I got to feel, like, a cloud — a darkness — over me. I started being depressed. I’m a very happy, outgoing person, and I was [becoming] withdrawn. My anxiety went up. It was hard to get through it every day. It started affecting my home life.”
Johnson was particularly disturbed by the site’s sole bathroom, which she regularly found in a state of disrepair. (The company says it has janitors available every shift in Tampa.) In the stalls, signs posted in response to employee misbehavior proliferated. Do not use your feet to flush the toilet. Do not flush more than five toilet seat covers at one time. Do not put any substances, natural or unnatural, on the walls.
“And obviously the signs are there for a reason, because people are doing this,” said Johnson, who worked at the site until March. “Every bit of that building was absolutely disgusting. You’d go in the bathroom and there would be period blood and poop all over the place. It smelled horrendous all the time.”
She added: “It’s a sweatshop in America.”
The work day in Tampa is divided into five shifts, and desks are shared between employees. Contractors I spoke with said they would frequently come to work and find their workstation for the day in dire condition — encountering boogers, fingernails, and pubic hairs, among other items. The desks would be cleaned whenever Facebook made one of its regular planned visits to the site. At other times, employees told me, the office was filthy.
Florida law does not require employers to offer sick leave, and so Cognizant workers who feel ill must instead use personal leave time. (They are granted five hours of personal leave per pay period.) Missing work is one of the few reasons Cognizant regularly fires its contractors. And so to avoid receiving an “occurrence,” as the company calls unapproved absences, contractors who have exhausted their break time come to work sick — and occasionally vomit in trash cans on the production floor.
A worker named Lola* told me that health problems had resulted in her receiving so many occurrences she was at risk of being fired. She began going into work even when she felt ill to the point of throwing up. Facebook contractors are required to use a browser extension to report every time they use the restroom, but during a recent illness, Lola quickly took all her allotted breaks. She had previously been written up for going to the bathroom too many times, she said, and so she felt afraid to get up from her desk. A manager saw that she was not feeling well, and brought a trash can to her desk so she could vomit in it. So she did.
“Then I was crying at my desk,” Lola said. “I was like, ‘I can’t go on.’ My co-workers said, ‘Just go home.’ I said ‘I can’t, because I’m going to get an occurrence.’” She stayed at her desk and cried.
Employees told me about other disturbing incidents at the Tampa site. Among them:
- An employee who used a colostomy bag had it rupture while she was at work, spilling some waste onto the floor. Senior managers were overheard mocking her. She eventually quit.
- An employee who threatened to “shoot up the building” in a group chat was placed on paid leave and allowed to return. He was fired after making another similar threat. (A Cognizant spokesperson said the company has security personnel on site at all hours. “Our goal is to ensure that our employees feel assured that they work in a safe environment,” he said.)
- Another employee broadcast himself on Facebook Live talking about wanting to bash a manager’s head in. A manager determined that he was making a joke, and he was not disciplined.
In April, two women who work at the Tampa site filed complaints with the US Equal Employment Opportunity Commission alleging that they had been sexually harassed by two of their male co-workers. According to the complaint, the men regularly discussed anal sex in the office. When the women were not receptive to the discussion, one of the men said he “was going to start a YouTube channel and record himself shooting up the place,” according to the complaint. On April 3rd, the Hillsborough County Sheriff’s Office came to the site to interview the women. According to the officer’s report, one of the men had been photographed following one of the women home.
A Cognizant spokesman told me that the employee has been suspended while the claims are being investigated. But some workers say they are still concerned.
“Every time I get an email or a phone call from my clients, I worry that there’s been a shooting — and I know that’s their worry as well,” said KC Hopkinson, an attorney who represents several current and former Cognizant employees in Tampa. “They go in there every morning asking, ‘what am I going to see today? And am I going to make it home tonight?’”
Hopkinson told me that her clients who have reported incidents to human resources are generally either ignored or retaliated against, a claim that was echoed to me by several other employees there. In some cases, the site’s human resources staff has followed workers who filed complaints to the bathroom, and questioned them about what they were doing for the few minutes they were inside. (“We take allegations such as this very seriously,” a company spokesman told me. “Cognizant strives to create a safe and empowering workplace.”)
“I wouldn’t want my worst enemy to work there,” Hopkinson said. “It’s a terrible, terrible environment.”
For the six months after he was hired, Speagle would moderate 100 to 200 posts a day. He watched people throw puppies into a raging river, and put lit fireworks in dogs’ mouths. He watched people mutilate the genitals of a live mouse, and chop off a cat’s face with a hatchet. He watched videos of people playing with human fetuses, and says he learned that they are allowed on Facebook “as long as the skin is translucent.” He found that he could no longer sleep for more than two or three hours a night. He would frequently wake up in a cold sweat, crying.
Early on, Speagle came across a video of two women in North Carolina encouraging toddlers to smoke marijuana, and helped to notify the authorities. (Moderator tools have a mechanism for escalating issues to law enforcement, and the women were eventually convicted of misdemeanor child abuse.) To Speagle’s knowledge, though, the crimes he saw every day never resulted in legal action being taken against the perpetrators. The work came to feel pointless, never more so than when he had to watch footage of a murder or child pornography case that he had already removed from Facebook.
In June 2018, a month into his job, Facebook began seeing a rash of videos that purportedly depicted organs being harvested from children. (They did not.) So many graphic videos were reported that they could not be contained in Speagle’s queue.
“I was getting the brunt of it, but it was leaking into everything else,” Speagle said. “It was mass panic. All the SMEs had to rush in there and try to help people. They were freaking out — they couldn’t handle it. People were crying, breaking down, throwing up. It was like one of those horror movies. Nobody’s prepared to see a little girl have her organs taken out while she’s still alive and screaming.” Moderators were told they had to watch at least 15 to 30 seconds of each video.
Speagle helps to take care of his parents, who have health problems, and was afraid to quit Cognizant. “It was tough to find a job down here in this market,” he said. To cope with the stress, he began binge-eating pastries from the vending machines, and eventually put on a significant amount of weight. He sought out the on-site counselor for support, but found him unhelpful.
“He just flat-out told me: ‘I don’t really know how to help you guys,’” Speagle said. The counselor he spoke with had been substituting for the regular counselor, who had more training. Cognizant also offers a 24/7 hotline, full healthcare benefits, and other wellness programs. But the experience soured Speagle on the site’s mental health resources. Other times, when he was having a particularly bleak day in the queue, a manager would hand him a bucket of Legos and encourage him to play with them to relieve the stress as he worked. Speagle built a house and a spaceship, but it didn’t make him feel better.
By last fall, Speagle told me, he was sleeping only an hour or two each night. The lack of sleep, coupled with depression, made it difficult for him to exercise. He began lashing out at his parents. Meanwhile, at work, he felt micromanaged by his team leaders, who pressured him to moderate more posts.
“I felt like I was trapped inside my own body,” he said. “I couldn’t, for the life of me, get up from my desk, or I would be yelled at to stay in my desk. So I was trapped at my desk and in my body. I was so scared.”
Cognizant periodically purges large numbers of staff members in what have come to be known as “red bag days” for the red bags that managers give to the newly fired to collect their belongings. Sometimes the dismissals are related to job performance, and sometimes employees aren’t given any explanation at all. Speagle was laid off as part of a red bag day last October.
In February, he went to a psychiatrist, who diagnosed him with PTSD. He is currently in treatment. Meanwhile, he has gone back to school to get his teaching certificate. Seeing so many children harmed on Facebook made him want to make a positive contribution to the lives of young people, he said.
“I really wanted to make a difference,” Speagle told me of his time working for Facebook. “I thought this would be the ultimate difference-making thing. Because it’s Facebook. But there’s no difference being made.”
I asked him what he thought needed to change.
“I think Facebook needs to shut down,” he said.
Last week, I visited the Tampa site with a photographer. It had received a deep cleaning the night before I visited, according to two employees I spoke with, and the bathroom sparkled. As I walked the floor with the site manager and a Facebook spokeswoman, I noted that most rooms smelled of cleaning products.
Work stopped while we were there to ensure we did not see any Facebook user’s personal information. Moderators, mostly in their 20s and 30s, chatted at their desks, or shot baskets at one of the miniature hoops around the building. The site’s senior managers, who employees say are normally cloistered in their offices, made a show of walking the production floor and chatting with their subordinates.
Every few feet, a wall decal or poster offered an inspirational platitude. Exhortations to always try your hardest and maintain a positive attitude were punctuated with other signs that came across as slightly more sinister. “No news is good news,” read one. “Our reputation depends on you,” read another.
We saw an activity room where workers are invited to participate in yoga sessions, and a break room presided over by a small Buddha holding an electric candle. Across the room from the Buddha, coloring books were fanned out on a table beside windows overlooking the alligator pond.
The tour ended about an hour after we arrived.
“That was a dog-and-pony show,” an employee named Bob told me over the phone the next day. “That was completely staged. We’re out there playing games, and the senior management are out there interacting with people — it’s all a facade.”
Facebook sees a similar facade when it visits the site, he said.
The person responsible for managing Facebook’s growing contractor workforce is Arun Chandra, whose title is vice president of scaled support. Chandra arrived at Facebook last November after a long career at HP, where he helped to oversee the company’s global supply chain. In his new role, he told me, he hopes to gradually improve contractors’ standard of living while also working to ensure they become more effective at their jobs.
“I’m trying to address the macro picture, and move the bigger things forward in the right way,” said Chandra, who struck me as energetic and deeply sincere. “We’ll never solve 100 percent, but I’m trying to show I can solve 80 to 90 percent of the larger problems.”
Chandra has visited more than a dozen of the company’s far-flung partner sites in the United States and abroad, and has plans to visit them all. When he arrives, he likes to pull rank-and-file contractors into rooms and ask them about working conditions without their managers around. He told me that in the Philippines, content moderation has become an attractive career track, and that everywhere he goes, he meets moderators who take great pride in their work. “The level of enthusiasm people have is amazing,” he said.
This spring, Chandra organized a summit of around 200 leaders from content moderation sites around the world — an event he plans to hold twice a year, with another coming this fall. Up until now, vendors have had different policies and programs for promoting workers’ mental health. At the summit, they agreed to share information about their approaches — effectively agreeing to stop competing on the basis of who does a better job taking care of workers.
“We have to run a very large-scale platform. We have to take care of the community. And that means we have to get a whole lot of work done,” Chandra said. “But that is not at the expense of [contractors’] well-being.”
Chandra plans to launch a new audit program later this year to promote better working conditions. That will include more surprise visits — an effort to get around the dog-and-pony-show phenomenon I observed last week. He also plans to stop evaluating vendors solely on whether they achieve a 98 percent accuracy rate — instead, he said, Facebook will develop a balanced “scorecard” approach to measuring vendors’ performance. Chandra intends for worker well-being to be part of that score, though Facebook has not yet determined how it will be measured.
In May, Facebook announced that it will raise contractor wages by $3 an hour, make on-site counselors available during all hours of operation, and develop further programs for its contractor workforce. But the pay raises are not due to take effect until the middle of 2020, by which time many, if not most, of the current Tampa workforce will no longer work there. Turnover statistics could not be obtained. But few moderators I have spoken with make it to two years on the job — they either are fired for low accuracy scores, or quit over the working conditions. And so while the raises will be a boon to a future workforce, the contractors I spoke to are unlikely to benefit.
Nor will the many contractors who have already left the job. As in Phoenix, former employees of the Tampa site described lasting emotional disturbances from their work — harm for which neither Facebook nor Cognizant offers any support.
I asked Chandra whether Facebook should hire more content moderators in house, rather than relying on big staffing companies. He told me that Facebook’s business changes so quickly that it might not be possible. But he did not rule it out.
“I completely get the debate,” he said. “If anything I’m very empathetic to the entire conversation, having spent a lot of time with these people. I don’t think we have a better answer right now.”
In the meantime, Facebook is building a “global resiliency team” tasked with improving the well-being of both full-time employees and contractors. Chris Harrison, who leads the team, told me that he aspires to build a wellness program that begins at the point of hiring. He wants to screen employees to gauge their psychological fitness — a move that might prevent someone like Shawn Speagle from being assigned to a queue filled with graphic violence — but says Facebook is still working to understand whether this is possible under employment law.
Harrison plans to make “resiliency” — the art of bouncing back after seeing something awful — a key part of contractor training. He helped to develop new tools for moderators that can automatically blur out faces in disturbing videos, turn them grayscale, or mute the audio — all things that can reduce the psychological harm to the moderator viewing them.
Eventually, Harrison hopes Facebook will offer post-employment counseling to moderators who suffered psychological harm on the job. “Of course we should do that,” he said. But the idea is still in the earliest discussion stages, he said. “There’s just so many layers of complexity globally. It’s really, really hard to pull it off in a legally compliant way.”
I asked Harrison, a licensed clinical psychologist, whether Facebook would ever seek to place a limit on the amount of disturbing content a moderator is given in a day. How much is safe?
“I think that’s an open question,” he said. “Is there such thing as too much? The conventional answer to that would be, of course, there can be too much of anything. Scientifically, do we know how much is too much? Do we know what those thresholds are? The answer is no, we don’t. Do we need to know? Yeah, for sure.”
“If there’s something that were to keep me up at night, just pondering and thinking, it’s that question,” Harrison continued. “How much is too much?”
If you believe moderation is a high-skilled, high-stakes job that presents unique psychological risks to your workforce, you might hire all of those workers as full-time employees. But if you believe that it is a low-skill job that will someday be done primarily by algorithms, you probably would not.
Instead, you would do what Facebook, Google, YouTube, and Twitter have done, and hire companies like Accenture, Genpact, and Cognizant to do the work for you. Leave to them the messy work of finding and training human beings, and of laying them all off when the contract ends. Ask the vendors to hit some just-out-of-reach metric, and let them figure out how to get there.
At Google, contractors like these already represent a majority of its workforce. The system allows tech giants to save billions of dollars a year, while reporting record profits each quarter. Some vendors may turn out to mistreat their workers, threatening the reputation of the tech giant that hired them. But countless more stories will remain hidden behind nondisclosure agreements.
In the meantime, tens of thousands of people around the world go to work each day at an office where taking care of the individual person is always someone else’s job. Where at the highest levels, human content moderators are viewed as a speed bump on the way to an AI-powered future.
In such a system, offices can still look beautiful. They can have colorful murals and serene meditation rooms. They can offer ping pong tables and indoor putting greens and miniature basketball hoops emblazoned with the slogan: “You matter.” But the moderators who work in these offices are not children, and they know when they are being condescended to. They see the company roll an oversized Connect 4 game into the office, as it did in Tampa this spring, and they wonder: When is this place going to get a defibrillator?
(Cognizant did not respond to questions about the defibrillator.)
I believe Chandra and his team will work diligently to improve this system as best they can. By making vendors like Cognizant accountable for the mental health of their workers for the first time, and offering psychological support to moderators after they leave the company, Facebook can improve the standard of living for contractors across the industry.
But it remains to be seen how much good Facebook can do while continuing to hold its contractors at arm’s length. Every layer of management between a content moderator and senior Facebook leadership offers another chance for something to go wrong — and to go unseen by anyone with the power to change it.
“Seriously Facebook, if you want to know, if you really care, you can literally call me,” Melynda Johnson told me. “I will tell you ways that I think that you can fix things there. Because I do care. Because I really do not think people should be treated this way. And if you do know what’s going on there, and you’re turning a blind eye, shame on you.”
Have you worked as a content moderator? We’re eager to hear your experiences, especially if you have worked for Google, YouTube, or Twitter. Email Casey Newton at email@example.com, or message him on Twitter @CaseyNewton. You can also subscribe here to The Interface, his evening newsletter about Facebook and democracy.
Update June 19th, 10:37AM ET: This article has been updated to reflect the fact that a video that purportedly depicted organ harvesting was determined to be false and misleading.