2015 was the year 'censorship' lost all meaning

What is free speech now, and who can limit it?

This year, in lieu of the traditional "Best Of" lists, we thought it would be fun to throw our editors into a draft together and have a conversation. This is the year that censorship was either a big deal or lost all meaning. We got an error code for governments blocking web pages, a presidential candidate suggested we "close up" the internet, we spent a lot of time wondering whether college campuses were limiting free speech, and we got into a gigantic debate over whether Reddit’s stricter policies constituted censorship. And of course we heard about whether all the things Gamergate and similar internet bottom-feeders hated last year — Twitter block lists, comment moderation, saying a game was sexist — were still turning the internet into a censorious wasteland. We brought together Adi Robertson and Russell Brandom to discuss whether anyone knows what censorship means now.

Adi Robertson: When I think back to older internet censorship fights, the ones I remember are about laws — like the Communications Decency Act or the suit over First Amendment protection for video games. But that's incomplete. The government has a kind of big, abstract control over the internet, but the most immediate power is in the hands of ISPs and web platforms like Google and Facebook. They can’t outright stop you from saying something, but they can make it really hard.

So okay, I’m down with talking about ISP or Google takedowns as censorship. But now it feels like we’re at some illogical extreme where censorship is just any person discouraging any other person from saying something. Turning off comments is censorship. Blocking people on Twitter is censorship. Changing a product based on user feedback is censorship. Criticism is censorship.

All social norms are basically censorship

The thing is, I sort of get the logic behind it. If you think of the internet as a big system for generating attention and social pressure, then you’re not worrying about a government or science fiction mega-corp literally putting you in jail. You’re worried about either losing access to that attention or getting the weight of that social pressure (i.e. the internet mob) thrown at you, no matter who's behind it. The problem is that by this definition, all social norms are censorship. So where do we draw the line for what you can put on the internet?

Russell Brandom: It’s tricky, although like pornography, I like to think I know censorship when I see it. When articles about a certain topic start abruptly disappearing from Facebook, it’s hard to call it anything else. That’s not covered by the First Amendment, but it’s still erasing an idea from what we think of as a public space. Maybe Facebook counts as a privately owned public space, à la Zuccotti Park?

Maybe Facebook is like Zuccotti Park

On the other end of the spectrum, we saw stuff like BlockTogether, the Twitter extension that lets you share block lists with friends. For people targeted by the Gamergate hordes, it was a really important tool for keeping trolls out of their mentions feed, which was itself a form of heckler’s-veto censorship. Even better, it was all decentralized. Instead of a single decision being made by Twitter Inc., it was the product of thousands of different people voting with their block buttons. If you believe the EFF crowd, this is how it’s supposed to work.

Of course, it’s still arguably a form of censorship — and it turned out people on the block lists weren’t swayed by abstract points about decentralization. A friend of mine told me he was worried he’d argue with the wrong person and end up blocked by half the writers in New York. You can’t please everyone!

Adi: Which is exactly the problem. The very fact that many earlier free speech debates were about laws means that we’ve sort of forced our conversation into the mold of a legal issue. We want the government to be extremely cautious about drawing distinctions between speech it likes and speech it doesn’t. If Congress says, "I’d like people to do this" instead of "People should be legally forced to do this," or "I won’t listen to you" instead of "You can’t speak," that can still have a chilling effect. But when you’re talking about Twitter users or forum moderators, you can’t cover those nuances with the same blanket rules.

"Only governments can censor" is still too simplistic

One of the big responses to Gamergate calling everything censorship has been saying that "only governments can censor, period," and anything platforms do is up to them. That’s technically true, but as some people have pointed out, it’s a pretty extreme libertarian position for groups that generally think corporate control can still be coercive and bad. But the alternative seems to be Reddit users’ idea that it’s totally impossible to separate rules and norms — so if something like white supremacy is technically permissible, it’s an intrinsically valuable viewpoint that we need to listen to. I think we should be wary of asking gigantic walled gardens like Facebook and Reddit to enforce rules about bad speech. It’s not contradictory to say that if you run a clubhouse inside one of those gardens, you should still kick out the guy who keeps calling black people apes.

But it’s a lot harder to draw the line around what’s ethical to say than what we want people to be capable of saying. I think it’s a conversation we started to have around publishing this year, especially with Hulk Hogan’s lawsuit against Gawker. As a journalist, it’s hard not to want the broadest possible legal protections for reporting. But there’s also a certain social responsibility that comes with journalism, and a lot of people called out Gawker for violating it. And when something like Gawker’s Condé Nast escort exposé came out, it wasn’t the legal backlash that hurt them as much as the social backlash.

Russell: If nothing else, we’re living in a golden age of moral complaints about the media. On one hand, you have cases like Gawker’s escort story, where genuine moral problems seem to have slipped through the standard code of conduct for journalism. At the same time, you have a lot of complaints that go too far on the other side, like the growing chorus of people who seem to think it’s a moral failing of the media to cover Donald Trump at all. (What do you call it when journalists decide a given candidate is too radical to even discuss?)

And in the middle of all of it, you have hacked data. Around this time last year, I might have told you the Sony leaks were a one-off — but after Hacking Team, Ashley Madison, and Patreon, it’s pretty clear anonymous data dumps are here to stay. In the case of Ashley Madison, the leaks were the tail end of an extortion scheme, but it didn’t stop outlets (including this one) from jumping on the data once it was out there. Covering it up because we didn’t like where it came from would have felt like... well, censorship. I don’t think we’ll find a way out of that bind in 2016, but we’ll definitely need a new way of thinking about it.