
Former child porn moderators claim Microsoft failed to treat their PTSD


Ex-employees were also responsible for reviewing murders, bestiality, and other gruesome content


Two former Microsoft employees who were responsible for monitoring child pornography and other criminal material have filed a lawsuit against the company, The Daily Beast reports, alleging that they were not provided with psychological support to treat post-traumatic stress disorder (PTSD).

The employees, Henry Soto and Greg Blauert, were part of Microsoft’s Online Safety Team, where they were charged with reviewing material that had been flagged as potentially illegal. According to the lawsuit, Soto’s job involved viewing “horrible brutality, murder, indescribable sexual assaults,” and other content “designed to entertain the most twisted and sick-minded people in the world.” Blauert had to “review thousands of images of child pornography, adult pornography and bestiality that graphically depicted the violence and depravity of the perpetrators,” according to the complaint.

Both men say they suffered “vicarious trauma” and symptoms associated with PTSD, including nightmares, anxiety, and hallucinations. When they complained about their health, Microsoft offered a “Wellness Program,” but the suit alleges that the therapist involved with the program was not qualified to treat their symptoms. Program supervisors also advised them to cope by taking smoke breaks and walks, while Blauert was told to play more video games, according to the complaint.

In a statement to The Guardian, a Microsoft spokesperson said that the company “disagrees” with the allegations, and that it “takes seriously its responsibility to remove and report imagery of child sexual exploitation and abuse being shared on its services, as well as the health and resiliency of the employees who do this important work.”

The spokesperson also said that Microsoft uses technology to “reduce the realism of the imagery” that workers view, and that it places limits on the time employees spend moderating content. The Online Safety Team reviews material flagged by algorithms or reported by users to confirm its illegality before passing it along to law enforcement authorities, as required by law.

The lawsuit seeks damages for both men and recommends changes that Microsoft could make to protect the health of other workers, including more time off, regular psychological consultations, and a wellness program for spouses. Previous reports have suggested that moderators at Facebook, YouTube, and other tech companies have suffered similar psychological trauma.