Facebook’s former chief security officer Alex Stamos on protecting content moderators

He discusses Facebook and democracy with Casey Newton at SXSW

Facebook’s former chief security officer Alex Stamos joined The Verge’s Casey Newton onstage at SXSW to discuss the difficult issues that plague Facebook and democracy.

You can listen to their discussion in its entirety on The Vergecast right now. Below is a lightly edited excerpt from this interview between Stamos and Newton about how to better serve the mental health of content moderators.

Casey Newton: Let me ask about content moderation. I recently wrote a story about content moderators, and some of the working conditions they encounter are really rough. How should Facebook and other platforms think about moderation going forward? Should these employees be paid more? Should this be a full-time job?

Alex Stamos: Yeah. So you can always pay people more.

I think specifically for content moderation, you have to think about what the mental health impacts are. I thought your story was great and really helped outline those impacts. I didn’t have any content moderators working for me. I had a child safety team and a counterterrorism team, and the emotional and psychological impact on the people is pretty extreme. We were able to do a bunch of things and support them because they were full-time employees.

The fundamental issue here is that almost all of the companies use third parties like Accenture to provide their content moderators, and you can’t provide intense mental health support through the contractual barrier of a different employer. Accenture is never going to do more than what is minimally contractually required to help their moderators. So in the long run this will reduce the need for growth of content moderation. But my guess would be that 10 years from now, Facebook still has the same number of content moderators doing different kinds of things. I think you’re going to have to bring most of them in-house so [Facebook] can provide them with support, especially the people who work on the really high-risk stuff like harassment, bullying, and the child safety material.

Terrorism work and looking all day at beheading videos really starts to mess you up, and there’s actually been some psychology research on the people who work in these fields. The effects are sometimes gender-biased: men can become violent, and there’s more violence at home for people who work in these kinds of jobs. Women sometimes become cutters. It has been reasonably well studied, because you have the same problem with police officers, social service workers, and other people who have to deal with these horrible things. So I think the company does have a responsibility to help.

As it turns out, the laws here make it hard. Companies can’t just employ psychologists or psychiatrists because of HIPAA and a law called ERISA. Because of the way the laws work, you can’t just hire a PhD and have them on staff to see people as doctors. You have to come up with a whole insurance plan, and then you have to offer that insurance plan to all of your employees. It’s kind of messed up. If Congress is looking for things to do, making small changes to employment law to make it easier to get mental health services to people would actually be positive.

We did that kind of stuff, but you can’t do any of that for contractors because it becomes what’s called a co-employment situation. So legally you just can’t do it. And I think that’s why they’ve got to bring them in-house.
