There’s finally momentum in Congress to make serious changes to Section 230 — and not everyone’s happy about it. Last year’s antitrust hearings have given way to a full-court press on regulating big companies like Facebook and Google, and many in Congress see peeling back Section 230 as an easier way forward than GDPR-style privacy regulation or a full-scale antitrust breakup.
What is Section 230?
Section 230 of the Communications Decency Act, which was passed in 1996, says an “interactive computer service” can’t be treated as the publisher or speaker of third-party content. This protects websites from lawsuits if a user posts something illegal, although there are exceptions for copyright-infringing and sex trafficking-related material.
Sen. Ron Wyden (D-OR) and Rep. Chris Cox (R-CA) crafted Section 230 so website owners could moderate sites without worrying about legal liability. The law is particularly vital for social media networks, but it covers many sites and services, including news outlets with comment sections — like The Verge. The Electronic Frontier Foundation calls it “the most important law protecting internet speech.”
So on Monday, we hosted an event exploring what those regulations might look like, starting with a keynote on tech regulation from Sen. Amy Klobuchar (D-MN). After that, a panel of three experts — Vimeo’s Michael Cheah, Wikimedia’s Amanda Keton, and writer and strategist Sydette Harry — dug into the details of Section 230 and how the things they care about on the internet would be affected if the law were repealed.
It made for a strange combination. On one side, Sen. Klobuchar urged the tech world not to dismiss changes to Section 230 out of hand and to leave room for the idea that some kind of regulation or antitrust action might actually make the industry better. On the other side, the panel pleaded that any changes to Section 230 be tailored to specific problems. But since no one can quite agree on which problems need to be addressed, that’s a difficult balance to strike.
If that sounds complicated, it is. The rise of online platforms like Facebook and YouTube has created several problems at once, and this kind of tangle is the inevitable result. There’s the antitrust problem, the misinformation problem, the hate speech problem, and half a dozen other issues. They’re all related, but fixing one will sometimes make others worse. Lawmakers are increasingly aware of the problems with online platforms, but balancing them in an actual piece of legislation will be a spectacularly difficult task.