Section 230 of the Communications Decency Act is widely criticized, widely praised, and widely misunderstood. The policy allows basically every major website — from YouTube to Wikipedia — to exist in its current form. Depending on who you ask, this is either a wonderful development or a complete disaster. That’s made Section 230 a fixture of recent internet policy debates, particularly at the US Department of Justice, where there is a growing interest in changing the law.
The Justice Department publicly kicked off that process today, assembling three panels of experts to lay out reasons for changing or preserving Section 230. Attorney General Bill Barr emphasized that this wasn’t a policy-making workshop, but the panels still hinted at which arguments the US government finds most compelling. And while this might sound like a low bar, they were actually arguments about the law — not the weird fantasy rules that dominate similar debates in Congress and the mainstream press. That made it an unusually vivid window into the way prosecutors and lawmakers think about Section 230 and how to change it.
Here are the five points that stood out the most.
The biggest battlegrounds were harassment and child abuse
Section 230 has been invoked for a lot of bad content — libel, shady gun sales, even defective dog collars. But today’s workshop centered on three particularly ugly issues: non-consensual pornography, harassment, and child sexual abuse material.
What is Section 230?
Section 230 of the Communications Decency Act, which was passed in 1996, says an “interactive computer service” can’t be treated as the publisher or speaker of third-party content. This protects websites from lawsuits if a user posts something illegal, although there are exceptions for pirated and prostitution-related material.
Sen. Ron Wyden (D-OR) and Rep. Chris Cox (R-CA) crafted Section 230 so website owners could moderate sites without worrying about legal liability. The law is particularly vital for social media networks, but it covers many sites and services, including news outlets with comment sections — like The Verge. The Electronic Frontier Foundation calls it “the most important law protecting internet speech.”
The Justice Department panelists included lawyer Carrie Goldberg, who started a high-profile fight with Grindr over a horrific harassment campaign; University of Miami professor Mary Anne Franks, who helped draft the first “revenge porn” law; and Yiota Souras of the National Center for Missing and Exploited Children. All laid out — sometimes in graphic detail — ways that abusive partners and sexual predators have weaponized the web.
Some of the wonkier and less dramatic cases got short shrift. Panelists only briefly mentioned a brewing fight over how Section 230 covers online marketplaces, for instance — although it has huge implications for sites like Amazon and Airbnb.
The tight focus helped ground an abstract legal debate in human terms. At one point Souras objected to a flippant mention of “death by ten thousand duck-bites” — a reference to websites being inundated with legal complaints under a weakened law. “We need to be careful with this terminology,” she argued. “I know there is a business cost to that, but there is a person who has been harmed online behind every single one of those ‘duck bites.’”
But we’ve seen heart-wrenching issues get cynically coopted to pass bad laws before. The FOSTA-SESTA law, which cut Section 230 protections for prostitution-related material, was billed as a fight against human trafficking while glossing over its very real collateral damage among sex workers. Interestingly, FOSTA-SESTA’s impact didn’t get discussed extensively during the panels — although Souras said its passage has roughly correlated with a drop in child abuse material.
The debate focused on ‘Big Tech,’ garbage websites… and not much in between
In a short opening speech, Attorney General Bill Barr called the Section 230 workshop an outgrowth of antitrust investigations into big tech companies. “Not all of the concerns raised about online platforms clearly fall within antitrust,” he explained — so Section 230 changes might fill in some regulatory gaps.
“just because you’re small doesn’t mean you’re automatically good”
Panelists largely echoed that framing, focusing on how giants like Google or Facebook were failing at moderation. But they also periodically referenced the other end of the spectrum: small sites devoted to noxious content like “revenge porn.” These sites test the limits of Section 230. At best, they’re encouraging abuse with a wink and a nod. At worst, they’re actively participating in the abuse — Hunter Moore, who founded the infamous website Is Anyone Up, was convicted of hiring a hacker to get nude photos. As industry group Tech:NYC’s founder Julie Samuels noted in one panel, these fall outside the normal “Big Tech” debate lines: “just because you’re small doesn’t mean you’re automatically good.”
But beyond periodic complaints from Samuels and a few others, critics didn’t really address the potential challenges for medium-sized sites like Reddit or Craigslist — which don’t have the financial resources or lobbying power of Facebook or Google. “Section 230 is not just for ‘Big Tech,’” argued Patrick Carome, who has defended a long list of Section 230 cases. If sites can only operate with armies of moderators or sophisticated automation, that’s functionally an advantage for the biggest and wealthiest companies.
Encryption might be on the chopping block — but nobody quite admitted it
“There simply has to be a compromise in how encryption gets rolled out”
The Justice Department has tentatively supported a bill called the EARN IT Act, which many see as a Trojan horse for encryption bans. Today’s workshop didn’t allay that concern. Barr referenced how Section 230 might hurt “efforts to combat lawless spaces online,” warning that platforms could use the policy to lock out law enforcement. And Assistant Attorney General Beth Williams, who moderated a panel, specifically asked how encryption could hurt efforts to find child sexual abuse material. “There simply has to be a compromise in how encryption gets rolled out,” responded Souras.
But the Justice Department has been asking for concessions on encryption for years, and it’s still not clear what such a compromise might look like. In response to the same question, CCIA president Matt Schruers broadly expounded on “balancing” encryption with law enforcement access, but more as a general principle than a legal doctrine.
The vagueness isn’t exactly surprising. The EARN IT Act doesn’t even mention encryption, and even without the issue, there’s plenty of disagreement on how to change Section 230.
Debating legal fixes could be a mess
A lot of big tech policy fights can be summarized as one big, clear demand. Pass a net neutrality law. Repeal mass surveillance rules. Stop a bad intellectual property bill.
But the Section 230 debate is harder to pin down. Should anybody be able to sue a website for hosting illegal content? Should state prosecutors just have more power? Do only certain kinds of websites get protections?
Carveouts vs. bargaining chips
Neil Chilson, a fellow at the Charles Koch Institute, grouped reform proposals into two categories. One is a “carveout” approach that strips protection from certain categories of content — like FOSTA-SESTA did for sex work-related material. The other is a “bargaining chip” system that ties liability protection to meeting certain standards — like the EARN IT Act, which (as its name suggests) makes sites prove they’re fighting child sex abuse.
These are vastly different visions for the internet, even before you define what the categories and standards are. It’s easy to articulate a flat opposition to changes. But even some of Section 230’s biggest proponents, like panelist and legal scholar Jeff Kosseff, are open to tweaking its language. The clearest rhetorical strategy might focus on what kind of terrible thing you want to scrub off the internet — however that’s accomplished.
The “political bias” fight looks like a pointless publicity circus
A handful of conservative politicians have promoted the notion that Section 230 should (or already does) require websites to be politically “neutral platforms.” Last year, Sen. Josh Hawley (R-MO) sponsored a proposal for making sites earn the approval of a government committee before getting liability protections — effectively turning tech policy into a cudgel to punish companies with opposing political views.
Thankfully, the Justice Department seems to have another approach in mind. This proposal earned one brief, slightly mocking aside during the nearly four-hour workshop. Barr complained that decreasing competition was hurting the “diversity of political discourse,” but he didn’t tie that to Section 230 changes. Neither did panel moderators from the Justice Department. Policing Facebook’s political slant might be a crowd-pleasing goal for politicians and pundits, but it simply wasn’t a serious conversation topic. Neither was the popular misconception that Section 230 defines websites as “publishers” or “platforms” and polices them differently.
This created space to address more nuanced points. Barr, for example, tried to explain why the Justice Department cares so much about Section 230 reform despite the existing exemption for federal criminal prosecutions. (“Federal criminal prosecution is powerful, but necessarily, it’s a limited tool that addresses only the most serious conduct,” and civil liability can “work hand in hand” with it to offer more recourse for victims.) Several panelists asked for more evidence that Section 230 had actually incentivized good moderation — or whether, in Souras’ words, that goal is “kind of aspirational.”
You can reasonably disagree with these claims. But unlike a lot of the broadsides against Section 230, they’re arguments that can actually be disputed — not just debunked as nonsense.