
Anti-exploitation bill advances in Senate despite free speech concerns

Critics say the EARN IT Act would ‘make it far riskier for platforms to host user-generated content’


Photo by BILL O’LEARY/POOL/AFP via Getty Images

In a markup hearing on Thursday, the Senate Judiciary Committee advanced a controversial new bill targeting Section 230 protections for content involving online sexual exploitation. In a majority vote, the EARN IT Act was favorably referred for a floor vote, despite vigorous objections from human rights and privacy groups.

Among other provisions, the bill would strip companies of immunity for the knowing transmission of child sexual abuse material (CSAM) on their platforms. Currently, companies are protected from civil cases by Section 230.


Human rights groups have compared the bill to FOSTA — an anti-sex trafficking bill passed into law in 2018 — saying EARN IT will restrict online speech while doing little to address the underlying problems. In a group letter to the committee, more than 60 human rights organizations (including the EFF and ACLU) called on the committee to abandon the effort.

“By opening providers up to significantly expanded liability, the bill would make it far riskier for platforms to host user-generated content,” the letter reads.

Critics also worry that the bill could open the door to a de facto ban on end-to-end encryption. The current version of EARN IT says encryption shall not “serve as an independent basis for liability of a provider,” but it can still be taken into account when courts consider whether a company has taken reasonable efforts to root out exploitation on its networks.

Within the committee, lawmakers characterized the bill differently. Sen. Richard Blumenthal (D-CT), a lead sponsor of the measure, described the bill as “a narrow carveout to Section 230” and dismissed concerns over encryption as “a red herring.” Blumenthal also detailed separate incidents in which victims saw non-consensual pornography (also called “revenge porn”) shared on Twitter and Reddit, respectively, and faced few options without involving federal law enforcement.

What is Section 230?

Section 230 of the Communications Decency Act, which was passed in 1996, says an “interactive computer service” can’t be treated as the publisher or speaker of third-party content. This protects websites from lawsuits if a user posts something illegal, although there are exceptions for pirated and prostitution-related material.

Sen. Ron Wyden (D-OR) and Rep. Chris Cox (R-CA) crafted Section 230 so website owners could moderate sites without worrying about legal liability. The law is particularly vital for social media networks, but it covers many sites and services, including news outlets with comment sections — like The Verge. The Electronic Frontier Foundation calls it “the most important law protecting internet speech.”

It’s increasingly controversial and frequently misinterpreted, however. Critics argue that its broad protections let powerful companies ignore real harm to users. On the other hand, some lawmakers incorrectly claim that it only protects “neutral platforms” — a term that’s irrelevant to the law.

“The EARN IT Act will stop exploitation before it starts,” Blumenthal said in an opening statement. “It fosters the next generation of technology to stop exploitation, and it does hold tech companies [accountable] when they fail survivors and they enable these predators to spread child sexual abuse material.”

Most online platforms scan aggressively for CSAM, reporting material to the National Center for Missing and Exploited Children (NCMEC) on a voluntary basis. Hosting services like Google Drive and Dropbox also scan their platforms against hashes generated by NCMEC to ensure CSAM can’t be stored privately in the cloud or included as an attachment to an email. (Apple recently tabled a complex plan that would impose similar scans in iCloud storage.) Despite these efforts, CSAM continues to circulate in less regulated online spaces, particularly those hosted outside the US.
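The hash-matching approach described above can be sketched in miniature. The snippet below is an illustrative simplification only: real scanning systems match uploads against NCMEC-provided hash sets that are not public, and they typically rely on perceptual hashes (such as Microsoft's PhotoDNA) that survive resizing and re-encoding, rather than the exact cryptographic hash used here. The hash list and function names are hypothetical.

```python
import hashlib

# Hypothetical known-hash set for illustration. (This entry is just the
# SHA-256 digest of the string "test", not real flagged content.)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(data: bytes) -> bool:
    """Exact-match check of an upload against the known-hash set.

    Real services use perceptual hashing, so near-duplicates (recompressed
    or resized copies) still match; an exact hash like this one does not
    catch any modified copy.
    """
    return sha256_digest(data) in KNOWN_HASHES

print(is_flagged(b"test"))        # True: digest is in the set
print(is_flagged(b"other file"))  # False: unknown digest
```

The gap between exact and perceptual matching is exactly why a single re-encode defeats this naive version, and why purpose-built systems are used in practice.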

Outside of CSAM, systems for blocking non-consensual pornography are far less developed, although platforms like Facebook and Pornhub have tested limited portals that would allow victims to flag specific photos. Unlike CSAM, other kinds of non-consensual pornography continue to circulate in mainstream online spaces, and there is no centralized database for flagging and removing them, as Blumenthal’s anecdotes show.

Critics of the bill acknowledge the urgency of these problems but say the EARN IT Act won’t help fight exploitation. In particular, legal experts worry that compelling companies to disclose information to NCMEC would create significant problems for existing prosecutions. If that reporting is legally compelled, courts could rule that tech companies are acting on behalf of law enforcement and are thus subject to constitutional restrictions under the state actor doctrine, including the Fourth Amendment’s limits on warrantless searches. In essence, Congress can’t compel private companies to do anything that would be unconstitutional for law enforcement to do on its own.

“It’s turning social media companies into an arm of the government,” says Carl Szabo, a vice president and general counsel at NetChoice. The result would be new restrictions on how tech companies can scan for CSAM. State laws in Illinois and South Carolina directly prohibit legal requirements for such information sharing.

First introduced in 2020, the EARN IT Act cleared the Judiciary Committee that July but stalled in the broader Senate. This time, the committee moved to mark up the bill without a hearing, a move critics see as an effort to sidestep debate over its contents.

“They recognize that the more time people have to talk about this bill, the more time they have to identify its fundamental problem,” Szabo said.