All Stories Tagged: Speech

On today’s internet, the boundaries of acceptable speech are set by a few massive platforms, including Facebook, Twitter, Instagram, YouTube, and a handful of others. If those companies find something unacceptable, it can’t travel far — a restriction that’s had a massive impact on everyone from copyright violators to sex workers. At the same time, vile content that doesn’t violate platform rules can find shockingly broad audiences, leading to a chilling rise in white nationalism and violent misogyny online. After years of outcry, platforms have grown more willing to ban the worst actors online, but each ban comes with a new political fight, and companies are slow to respond in the best of circumstances. As gleeful disinformation figures like Alex Jones gain power — and the sheer scale of these platforms begins to overwhelm moderation efforts — the problems have only gotten uglier and harder to ignore. And the hard questions of moderation are only getting harder.

Supreme Court to hear case on how the government talks to social media companies

Murthy v. Missouri could change how platforms deal with covid misinfo, election threats, and more.

External Link
The post office is not an interactive computer service.

Legal blogger Eric Goldman covers a weird (and rightfully smacked down) attempt at getting Section 230 immunity for nonconsensually distributing nude pictures through the mail — including some brief, useful observations on the state of nonconsensual pornography law both on- and offline.


External Link
DeSantis vetoes bill that would have barred teens under 16 from using social media, but teases a “superior” one.

He explained that the veto was “because the Legislature is about to produce a different, superior bill.” The Florida governor added that he expects that new bill “will be signed into law soon.”

On X, Speaker Paul Renner later posted that the state Senate would hear a new bill, HB 3, on Monday, which “will empower parents to control what their children can access online.”


External Link
“He didn’t shotgun a beer, crunch the can against his head, rip his robe off, and scream ‘First Amendment forever, motherfuckers!’ But that was the vibe.”

Corbin Barthold at The Daily Beast has a good piece on Brett Kavanaugh’s role at yesterday’s Supreme Court arguments: the only one in the room (besides NetChoice’s own lawyer) treating government censorship as a unique and serious concern.

Kavanaugh called the states out for trying to turn the First Amendment upside down. “In your opening remarks,” he told Florida’s solicitor general, “you said the design of the First Amendment is to prevent ‘suppression of speech.’ And you left out… three words…: by the government.”


You can check out any time you want...

Amy Coney Barrett is bringing up the Hotel California clause yet again, and I’m actually grateful — she points out that the plain language of the law seems to specifically say you can’t ban Texan users. Nielson says it’s a conditional rule. “If you choose to do business in Texas, then this provision kicks in,” but “if you don’t want to do business in Texas at all” you’re okay to geofence the state. “You can’t darn well discriminate” against Texas users if you operate there, he says.

Barrett pushes back — what does that mean? “You have to have customers in Texas,” Nielson says, although he acknowledges a court hasn’t really defined the boundaries.


Why does Texas’ solicitor general keep insisting the phone carrier industry is competitive?

Or in his words, “intensely competitive” — which he says doesn’t change the fact that phone carriers could be considered common carriers, so the existence of multiple social networks shouldn’t save them from regulation either. But there are basically three mobile networks in the country! It’s not an outright national monopoly, but it’s a pretty consolidated space.


The Hotel California clause rears its head again.

Nielson says it’s not an accurate read of the law, but Roberts expresses apprehension about whether it would really be possible to pull out of the state in a way that satisfies its requirements. “I don’t see how they can wall off Texas,” he says.

Nielson suggests Facebook could geofence off everyone in the state and avoid selling Texas users’ data, which (he says) would make it possible to also reasonably ban Texas-based users from the site.
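
A purely illustrative aside: the kind of geofence Nielson is describing is conceptually simple, even if the legal question isn’t. Below is a minimal, hypothetical Python sketch of IP-based blocking; the lookup_region helper is an invented placeholder standing in for a real geolocation database, not anything Facebook has described.

# Hypothetical sketch of the geofence Nielson floats: turn away requests that
# geolocate to Texas before any Texas user data is collected, stored, or sold.
BLOCKED_REGIONS = {"US-TX"}  # Texas, in ISO 3166-2 form

def lookup_region(ip_address: str) -> str:
    """Placeholder for an IP-to-region lookup (e.g., a GeoIP database)."""
    raise NotImplementedError

def handle_request(ip_address: str) -> str:
    # Refuse service entirely so no in-state user data is ever processed.
    if lookup_region(ip_address) in BLOCKED_REGIONS:
        return "403 Forbidden: this service is not available in your region"
    return "200 OK"

Whether walling off the state like that would itself count as discriminating “based on… a user’s geographic location in this state” is exactly the question the justices keep circling.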


“What platforms does Texas’ law cover?”

Amy Coney Barrett suggests HB 20’s scope is more limited than Florida’s equivalent law, saying it only covers the “classic social media sites” like Facebook, not platforms like Etsy. Nielson agrees with her, and weirdly nobody brings up Wikipedia — whose operators have expressed concern they’d be covered by the law.


Texas solicitor general Aaron Nielson brings up Zephyr Teachout and Tim Wu’s support.

Nielson is defending HB 20 now. A group of legal scholars, he points out, worry that striking down the Texas law could make tech company regulation in general impossible — although even they call the Texas law “dangerous.”


Does the Texas moderation law really have a ‘Hotel California’ clause?

Clement mentions what he calls a “Hotel California” provision of HB 20, which he interprets as a ban on companies pulling out of Texas if they can’t meet its legal burdens. Brown Jackson questions that interpretation, sounding less than convinced it’s the right read of the law.

You can read the rule itself below — check out the third item in the “based on” list, which covers geographic location.


“Sec. 143A.002. CENSORSHIP PROHIBITED. (a) A social media platform may not censor a user, a user’s expression, or a user’s ability to receive the expression of another person based on: (1) the viewpoint of the user or another person; (2) the viewpoint represented in the user’s expression or another person’s expression; or (3) a user’s geographic location in this state or any part of this state.”


Alito: All your metaphors are wrong.

A lot of today’s fight has been about metaphors, and Alito is questioning whether some of them make sense — a newspaper in NetChoice’s view, and a common carrier like a telegraph company in the states’. Clement points out that the court has dealt with internet regulation specifically before, in cases like Reno v. ACLU, which struck down most of the Communications Decency Act. Unfortunately, that doesn’t really clear up the metaphor question.


“This is an absolute requirement to respond to every takedown.”

Clement fields a question from the court on why Texas’ requirement to explain social media takedowns is more problematic than EU laws requiring some level of consistent moderation and explanation — he argues that Texas’ individual response provision would be “incredibly burdensome.”


Kavanaugh: “When I think of Orwellian, I think of the state.”

Several justices (both liberal and conservative) have seemed sympathetic to the idea that private companies can engage in harmful censorship, with Alito referring to the possibility as “Orwellian.” But Kavanaugh keeps pushing back on the premise. “We don’t want the state interfering” with these private entities, he says, even if they’re powerful.

Prelogar gently disagrees, saying social networks can seriously affect speech rights. “We are not suggesting that governments are powerless to respond” to concerns about platform censorship, she says — just not through laws like Florida’s.


Gorsuch has a confusing take on the history of Section 230.

Speaking to US Solicitor General Elizabeth Prelogar, he suggests Section 230 is conditioned on the idea that web platforms are common carriers. That’s a weird take on Section 230, which is regularly applied to small blogs and online newsletters — and was passed partly to avoid punishing services that moderated content to be “family friendly” rather than acting as neutral conduits. Prelogar, for what it’s worth, disagrees with Gorsuch.


NetChoice: “If it’s not the government, you really shouldn’t label it censorship.”

Brown Jackson brings up a common critique of NetChoice’s position: the internet has become the public square, and large social media companies play a huge gatekeeper role, so why not regulate them? NetChoice’s Clement argues that web platforms rise (he mentions TikTok) and fall (sorry, X) quickly enough that it’s clear there are lots of avenues for speech online.


Clarence Thomas is still trying to make this Section 230 argument happen.

Section 230 has been brought up by other justices in passing, but Thomas keeps harping on whether striking down this law would mean Section 230 is unjustifiable. He’s bringing up some questions that are interesting but also pretty tangential to the core case, like whether “deep learning algorithms” are platform speech. (Clement says... probably.) “What is the algorithm saying?” Thomas asks.


Coney Barrett: “It’s not obvious to me that [DMs and Gmail] can’t qualify as common carriers.”

Amy Coney Barrett expresses concern that striking down the Florida law could prevent any regulation of web platforms as common carriers. Clement’s response is effectively that the law is so bad it should simply be struck down now and any future questions sorted out later, but other justices break in and question that logic.


Kagan: Going after “bigness” alone isn’t a First Amendment violation.

So why can’t a law focus on specifically regulating some of the world’s biggest communications services? Clement vociferously disagrees, citing Minneapolis Star Tribune Co. v. Commissioner among other cases. We’re getting into the most complicated question of the case here: how far should the Supreme Court go in protecting “Big Tech” companies from regulation overall, not just striking down these particular laws?


Clarence Thomas is mad at Section 230 again.

After reminding us he’s older than the internet, Thomas argues that sites have been claiming Section 230 protections as neutral “conduits,” accusing them of hypocrisy for saying they hold editorial standards now. NetChoice’s Clement disagrees — saying they’re simply not supposed to be treated as the publishers of specific user-generated content.


Etsy isn’t political? Ohh, boy.

Etsy keeps coming up as an example of a smaller user-generated content platform that could get hit by Florida’s law, and Whitaker keeps suggesting that its moderation is fundamentally non-speech-related and therefore irrelevant to the debate. As Platformer lays out, that’s simply not true — Etsy’s moderation of speech around Israel and Palestine has in fact been incredibly fraught.


Amy Coney Barrett: “If you have an algorithm do it, is that not speech?”

Coney Barrett asks whether automated decisions are fundamentally different from conventional editorial judgments. Whitaker tries to bring up Twitter v. Taamneh, in which sites said their automation in a particular case (involving terrorism) didn’t present a viewpoint.

Coney Barrett smacks that down, saying whatever happened in that case, it’s clearly not the argument sites are making here. Whitaker won’t let it go and keeps saying algorithms are “neutral” ways to organize information — but the justice seems skeptical.


Kavanaugh: “Do you agree ‘by the government’ is what the First Amendment is targeting?”

Brett Kavanaugh notes Whitaker’s opening statement doesn’t mention that the First Amendment is conventionally focused on government suppression of speech, not private speech decisions. Whitaker says there’s still a larger First Amendment interest in protecting freedom of speech from censorship by other parties.


Justices keep pointing out that lots of online moderation involves editorial judgment calls, and there’s a huge variety of sites online.

Clarence Thomas complains that there’s a lack of specificity in the discussion about what’s covered, and Samuel Alito pushes on whether the law could regulate “expressive” conduct that should deserve First Amendment protection.

Overall, justices are (understandably) focusing a lot on whether these companies are really presenting themselves as “open for business” to all comers or whether they’re making newspaper-like judgments — Kagan asks why banning these editorial-style judgments is not, as she puts it, a “classic First Amendment violation.”


Justice Kavanaugh: Could the government regulate movie theaters and bookstores like it could Facebook?

Brett Kavanaugh asks Florida attorney Whitaker about one ongoing question from critics: if the First Amendment doesn’t stop the government from forcing speech onto websites, would that undercut the rights of all kinds of other businesses that deal in speech?

Whitaker says no, but Justice Ketanji Brown Jackson picks up the question — asking exactly how the law would pick its targets consistently.


The Verge
Justice Sotomayor: “This is so, so broad, it’s covering almost everything.”

“But the one thing I know about the internet is that its variety is infinite,” Sotomayor continues. “So at what point at a challenge like this one does the law become so generalized, so broad, so unspecific really that you bear the burden of coming in and telling us exactly what the sweep is?”

Florida solicitor general Henry Whitaker disagrees that the law is overbroad, saying it only regulates websites that “host user-generated content.” Sotomayor pushes back — bringing up sites like Etsy that are far smaller than the Facebooks and YouTubes of the world and focus specifically on particular kinds of content. “They’re going to have to censor” to maintain those limits, Sotomayor says. Why shouldn’t they be able to do that?


We’re an hour away from the Supreme Court’s politically complicated online moderation fight.

Texas and Florida’s anti-moderation laws are explicitly pro-Republican, but as my colleague Lauren laid out last week, the lawsuits over them could have bigger tech regulation fallout. While we’re waiting for oral arguments to start, check out The Atlantic and The New Republic for a couple of pieces on the states’ counterintuitive bipartisan appeal, plus some countertakes from Lawfare and Techdirt too.

Arguments begin at 10AM ET — you can listen in on the Supreme Court site directly, and I like the annotated C-SPAN feed too.


External Link
Problems that impact people: too much of a bummer for Threads and Instagram.

The Washington Post has been pushing Meta on what its recent decision to stop recommending “political” content and “social topics” on Threads and Instagram means, and it’s culminated in this illuminating quote from Meta spokeswoman Claire Lerner:

“Social topics can include content that identifies a problem that impacts people and is caused by the action or inaction of others, which can include issues like international relations or crime.”

So there you have it! It’s yet another indication that Meta is trying to keep anything too contentious off its platforms — and it’s a definition that might not sit well with anyone who’s interested in even the mildest forms of activism online.


External Link
The BBC explores pushing further into the fediverse.

Six months after kicking off an initial Mastodon trial that saw it launch its own instance on the federated platform, the UK’s public broadcaster is not just extending the experiment by another six months; it’s also “planning to start some technical work into investigating ways to publish BBC content more widely using ActivityPub.”

It feels like a promising sign for the future of the fediverse.
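
To make the ActivityPub part slightly more concrete, here’s a rough, hypothetical sketch of what “publishing content using ActivityPub” usually means in practice: wrapping an item in a Create activity addressed to the public collection, which federated servers such as Mastodon instances can then surface. The actor and URLs below are invented placeholders, not anything the BBC has announced, and real delivery (signed POSTs to followers’ inboxes) is omitted.

# Hypothetical shape of an ActivityPub "Create" activity wrapping an article.
# Placeholder actor and URLs only; production publishing also requires
# HTTP-signed delivery to followers' inboxes, which is omitted here.
article_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://social.example/actors/newsroom",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Article",
        "name": "Example headline",
        "url": "https://news.example/articles/example",
        "content": "Summary or full text of the article goes here.",
        "attributedTo": "https://social.example/actors/newsroom",
    },
}

The format itself is the easy part; presumably the harder piece of the BBC’s “technical work” is the delivery and moderation plumbing around publishing at that scale.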


External Link
A really good paper on AI, law, and child abuse.

Child sexual abuse material is a well-known exception to the First Amendment, but the law around AI-generated simulations of it is vastly more complicated. Lawfare’s new analysis addresses some of my longstanding questions — along with others I hadn’t even thought to ask, including the status of tools that accidentally train on CSAM. It’s long, but if you’re interested in how AI will test the criminal justice and legal systems, it’s absolutely worth the read.