This is a living guide to Section 230: what it is, what it isn’t, why it’s controversial, and how it might be changed. This guide will be updated as events warrant.
What is Section 230?
Section 230 of the Communications Decency Act, which was passed in 1996, says an “interactive computer service” can’t be treated as the publisher or speaker of third-party content. This protects websites from lawsuits if a user posts something illegal, although there are exceptions for copyright violations, sex work-related material, and violations of federal criminal law.
Sen. Ron Wyden (D-OR) and Rep. Chris Cox (R-CA) crafted Section 230 so website owners could moderate sites without worrying about legal liability. The law is particularly vital for social media networks, but it covers many sites and services, including news outlets with comment sections — like The Verge. The Electronic Frontier Foundation calls it “the most important law protecting internet speech.”
It’s increasingly controversial and frequently misinterpreted, however. Critics argue that its broad protections let powerful companies ignore real harm to users. On the other hand, some lawmakers incorrectly claim that it only protects “neutral platforms” — a term that’s irrelevant to the law.
Similar legislation exists in the European Union and Australia.
What’s the relationship between Section 230 and the First Amendment?
In the United States, the First Amendment prohibits the government from restricting most forms of speech, which would include many proposals to force tech companies to moderate content. A law that required companies to moderate content based on the political viewpoint it expresses, for example, would likely be struck down as unconstitutional.
Private companies can also create rules to restrict speech if they so choose. This is why Facebook and Twitter ban hate speech, for example, even though it is legally permitted in the United States. These moderation rules are protected by the First Amendment as well.
This issue is distinct from discussions over whether platforms should be liable for what their users post, though it often gets lumped in with the 230 discussion.
How has Donald Trump tried to change Section 230?
In May 2020, President Donald Trump released an executive order targeting Section 230 and social media. (He reportedly drafted the order a year earlier, but it was tabled following confusion from regulators and legal experts, until a feud with Twitter revived the idea.) The order asked regulators to redefine Section 230 more narrowly, bypassing the authority of Congress and the courts. It also pushed agencies to collect complaints of political bias that could justify revoking sites’ legal protections.
Trump has broadly backed Republican efforts to change the law in Congress. Following Joe Biden’s election, he went further and pushed for abolishing Section 230 entirely — threatening to veto the National Defense Authorization Act unless it repealed the law, and tying repeal to the ongoing push for $2,000 direct stimulus payments.
How might Joe Biden change Section 230?
President-elect Joe Biden has been less vocal than Trump about Section 230, but he’s no fan of the law either. In January 2020, Biden proposed revoking Section 230 completely. “The idea that it’s a tech company is that Section 230 should be revoked, immediately should be revoked, number one. For Zuckerberg and other platforms,” Biden said. “It should be revoked because it is not merely an internet company. It is propagating falsehoods they know to be false.”
Biden hasn’t advanced a specific Section 230 agenda since the election. In December 2020, however, a Biden advisor suggested “throwing out” Section 230 and developing new legislation — saying the rule allowed children to view disturbing material online.
How has Section 230 been modified over the years?
In April 2018, Trump signed into law the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA), a bill that purports to fight sex trafficking by reducing legal protections for online platforms. (It’s also sometimes referred to as the Stop Enabling Sex Traffickers Act, or SESTA, after an earlier version of the bill.)
FOSTA carves out a new exception to Section 230, stating that Section 230 doesn’t apply to civil and criminal charges of sex trafficking or to conduct that “promotes or facilitates prostitution.” The rule applies retroactively to sites that violate it.
What effect has FOSTA-SESTA had?
Following the passage of the bills, websites began to censor parts of their platforms — not because they were currently hosting prostitution ads, but because of the faint possibility that some third party could do so in the future. The laws are why Craigslist no longer has a Personals section. Now, sex workers say that they have broadly been forced offline, making their work far less safe. Prostitution-related crime in San Francisco alone — including violence against workers — more than tripled.
Democrats have called for a study of the harms created for sex workers by the law. There is little to no evidence that the law has had much of an effect on reducing online sex trafficking.
What other changes are US legislators proposing?
In February 2020, the US Department of Justice held a day-long workshop to discuss ways Section 230 could be further amended. The department examined cases in which platforms have enabled the distribution of nonconsensual pornography, harassment, and child sexual abuse imagery.
Proposals to reform the law generally fall into two categories. One is a “carveout” approach that removes protections from certain categories of content — like FOSTA-SESTA did for sex work-related material. The other is a “bargaining chip” system that ties liability protection to meeting certain standards — like the proposed Eliminating Abusive and Rampant Neglect of Interactive Technologies Act (EARN IT), which, as its name suggests, would make sites demonstrate that they are fighting child sex abuse. (This would have the likely side effect of weakening encryption for private messaging.) This approach is often bundled with broader data privacy and tech regulation proposals, which are covered in more detail in a separate guide.
To date, legislators have paid less attention to online marketplaces like Airbnb, which also benefit from the liability shield created by Section 230.
So far, the only bill to pass out of committee is the EARN IT Act, which was amended into a milder version before advancing.
What changes are congressional Democrats proposing?
Democrats have largely been concerned with getting platforms to remove more content because of the harms associated with hate speech, terrorism, and harassment. To facilitate this, they’ve helped introduce several bipartisan proposals to erode Section 230.
Sen. Richard Blumenthal (D-CT) was a sponsor of the EARN IT Act and is a frequent critic of Section 230’s protections. Sen. Brian Schatz (D-HI) has proposed an alternative called the Platform Accountability and Consumer Transparency (PACT) Act, which focuses on requiring websites to transparently report how they moderate content.
What changes are congressional Republicans proposing?
The most serious Republican effort to rewrite Section 230 has come not from Congress, but from the Department of Justice. In June 2020, Attorney General William Barr released a series of recommendations for how Section 230 might be reformed, playing off a string of workshops earlier in the year. The recommendations include new restrictions on cyberstalking and terrorism, which would likely result in more proactive moderation efforts, along with measures intended to punish arbitrary or discriminatory moderation. Barr’s proposal would only grant immunity for moderation decisions that are “done in accordance with plain and particular terms of service and accompanied by a reasonable explanation” — a far narrower scope than the current law.
Barr’s recommendations would need to be passed by Congress to have any legal force, but so far, they’re the best blueprint congressional Republicans have for what mainline conservative 230 reform might look like.
A smaller faction of Republicans has focused entirely on restricting moderation immunity, punishing platforms that moderate in a biased or otherwise discriminatory way. Sen. Josh Hawley (R-MO) has also proposed a bill that would bind platforms to a “duty of good faith,” entitling users to significant monetary damages if they were able to show in court that the platform had breached its duty.
More extreme versions of that approach include Rep. Paul Gosar (R-AZ)’s Stop the Censorship Act, which sought to prevent platforms from removing content that they found “objectionable.” (That would mean they could only remove posts that violated the law.) Introduced in 2019, Hawley’s Ending Support for Internet Censorship Act would have required platforms’ content moderation teams to be certified as politically “neutral” by a bipartisan panel in order to retain their liability protections.
Neither proposal has so far advanced. Republicans are also behind the EARN IT Act described above.
What do tech companies think the government should do?
Among tech platforms, Facebook has led the call for more regulation. In February 2020, CEO Mark Zuckerberg said the company ought to be regulated as something in between a telecommunications company and a newspaper. That same day, Facebook released a white paper laying out the approach it would prefer regulators take.
The approach rests on a handful of core assumptions: that platforms are global and thus subject to many different laws and competing cultural values; that they are intermediaries for speech rather than traditional publishers; that they will change constantly for competitive reasons; and that they will always get some moderation decisions wrong. (There’s another assumption buried in that last one: that they will never hire enough people to screen content in advance or in real time.)
Facebook argues that the government could hold tech platforms accountable for certain key metrics: keeping violating posts below a certain number of views, for example, or setting a mandatory median response time for removing them. But it notes that any of these efforts could create perverse incentives. If platforms are required to remove certain posts within 24 hours, for example, they are likely to simply stop looking at older posts while they focus on posts that are still within the 24-hour window.
What happens next?
Section 230 reform may take a different direction after Biden’s inauguration, but it’s likely to remain on the table, and Republicans will likely continue to push for their own changes to the law.
Section 230 will probably be modified again. The big questions are when — and how.
Update May 28th, 3:45PM ET: Updated with the details of President Trump’s proposed executive order.
Update June 18th, 1:30PM ET: Updated with details of the Barr recommendations and Hawley’s “duty of good faith” bill.
Update December 21st, 10:45AM ET: Updated with details of the Biden administration’s Section 230 plans.
Update December 29th, 4:41PM ET: Updated with Trump’s efforts to package Section 230 repeal with enhanced direct stimulus payments.