Opening statements from Mark Zuckerberg, Sundar Pichai, and Jack Dorsey have been published ahead of Thursday’s misinformation hearing in the House — and they show all three CEOs taking on an unusually sensitive issue for tech platforms. All three statements are worth reading, with Dorsey focusing on internal tools like Birdwatch and Pichai warning of the dangers of a full repeal of Section 230 of the Communications Decency Act.
But the most detailed proposal came from Zuckerberg, who talked at length about his preferred changes to Section 230. Instead of repealing the law entirely — as President Biden called for during the campaign — Zuckerberg’s proposal would make Section 230 conditional on companies maintaining a system to remove illegal content.
As Zuckerberg describes it in his written statement:
We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content. Instead of being granted immunity, platforms should be required to demonstrate that they have systems in place for identifying unlawful content and removing it. Platforms should not be held liable if a particular piece of content evades its detection — that would be impractical for platforms with billions of posts per day — but they should be required to have adequate systems in place to address unlawful content.
The standards for retaining 230 protections could be set by a third party, Zuckerberg goes on to say, and would exclude demands around encryption and privacy “that deserve a hearing in their own right.” That distinguishes Zuckerberg’s proposal from previously introduced 230 bills like the EARN IT Act, which critics say would effectively condition protections on building a long-sought encryption backdoor. Zuckerberg’s proposal is closer to the PACT Act, which conditions the protections on transparency disclosures and other measures but focuses less on the removal of illegal content.
Broadly, it’s unusual for companies to propose rules for how they would like to be regulated, but it’s less unusual for Zuckerberg, who has previously written at length in favor of new data portability and content moderation rules.
This is the most detailed proposal for Section 230 that Zuckerberg has yet put forward, and it’s one that would require few material changes for Facebook itself. Facebook already maintains significant systems for identifying and removing illegal or otherwise objectionable content. Still, the proposal might address some of the most urgent objections to Section 230, which often focus on smaller sites entirely dedicated to malicious activity.
The issue is particularly urgent for groups like the National Center for Missing and Exploited Children (NCMEC), which struggle with websites that don’t scan or moderate for child abuse imagery.
“There are a lot of companies, especially some of the very large companies, that engage in really tremendous voluntary measures,” NCMEC’s Yiota Souras told The Verge earlier this year. “But there are a lot of companies that don’t, and there is no legal requirement for them to use any kind of detection or screening.”
The hearing is scheduled to begin at 12PM ET on Thursday, March 25th.