Ev Williams last came to Austin for the South by Southwest Interactive Festival eight years ago. The entrepreneur, who previously founded Blogger and sold it to Google, was then a regular attendee at SXSW, where he relished an opportunity to gather with like-minded technologists who believed in the potential of the web. In 2007, he and his co-founders came to Austin to promote their latest project — a fast-paced, text-based social platform called Twitter — and it exploded in popularity, eventually becoming a global phenomenon.
“Fifteen years ago, when we were coming here to Austin to talk about the internet, it was this magical place that was different from the rest of the world,” said Williams, now the CEO of Medium, at a panel over the weekend. “It was a subset” of the general population, he said, “and everyone was cool. There were some spammers, but that was kind of it. And now it just reflects the world.” He continued: “When we built Twitter, we weren’t thinking about these things. We laid down fundamental architectures that had assumptions that didn’t account for bad behavior. And now we’re catching on to that.”
If the last decade of SXSW celebrated the promise of social media, the next years may well be dominated by the reckoning. Questions about the unintended consequences of social networks pervaded this year’s event. Academics, business leaders, and Facebook executives weighed in on how social platforms spread misinformation, encourage polarization, and promote hate speech. It wasn’t the first time SXSW had examined the darker side of social media — several sessions grappled with targeted harassment in 2016 — but it felt like the most sustained reflection on the subject to date. “I don’t think it’s a losing battle,” said Alex Hardiman, head of news products at Facebook, “but I think it’s a really hard one.”
The idea that the architects of our social networks would face their comeuppance in Austin was once all but unimaginable at SXSW, which is credited with launching Twitter, Foursquare, and Meerkat to prominence. Social apps took off in Austin first because the festival was once one of the few places with a strong concentration of smartphone owners. And while smartphones have now swept the globe, SXSW continues to attract thousands of early adopters seeking new experiences.
But this year, the festival’s focus turned to what social apps had wrought — to what Chris Zappone, who covers Russian influence campaigns for the Australian newspaper The Age, described at his panel as “essentially a national emergency.” Zappone argued that Russian influence campaigns had rarely been as effective at undermining Americans’ trust in their institutions. Social networks, which often mask users’ true identities, offer bad actors a cheap and effective way of promoting their messages, Zappone said. Meanwhile, American norms promote both free speech and anonymity online, which can discourage companies like Facebook and Twitter from taking action against bad actors.
Steve Huffman, the CEO of Reddit, gave voice to that tension during his own panel. Reddit has historically embraced anonymity in the belief that it would encourage more robust public debate, even at the cost of hosting subreddits that promote racism and targeted harassment. “Reddit’s role is to be a platform for debate,” he told interviewer Christine Lagorio-Chafkin. “To let the ideas, and let these conversations, emerge and play out. That’s a really important part of the process in any political conversation.”
It’s a noble sentiment that came without any clear ideas about how Reddit can protect its own platform from propaganda. (Earlier this month, Huffman acknowledged for the first time that Russia used Reddit as part of its influence campaign leading up to the 2016 election.) Huffman discouraged strong intervention from the government. “The foundation of the United States and the First Amendment is really solid,” Huffman said. “We’re going through a very difficult time. And as I mentioned before, our values are being tested. But that’s how you know they’re values. It’s very important that we stand by our values and don’t try to overcorrect.”
Sen. Mark Warner (D-VA), vice chairman of the Senate Select Committee on Intelligence, echoed that sentiment. “We’re going to need their cooperation, because if not, and you simply leave this to Washington, we’ll probably mess it up,” he said at a panel that, he noted with great disappointment, took place in a room that was more than half empty. “It needs to be more of a collaborative process. But the notion that this is going to go away just isn’t accurate.” Warner called for news literacy campaigns and told me afterward that he still hoped his Honest Ads Act, which creates new disclosure requirements for platforms like Facebook, would get a hearing.
Nearly everyone I heard speak on the subject of propaganda this week said something like “there are no easy answers” to the information crisis. But Apple, which has never been much for free speech, seemed much more relaxed about it. The company this week bought Texture, a “Netflix for magazines” service that offers subscribers access to 200 magazines. Eddy Cue, the company’s senior vice president for internet software and services, offered one alternative to the current mess: Apple News, an information platform with no real social features and a narrow whitelist of trusted outlets allowed to publish on it.
“People draw lines, and you’ve got to decide where you draw the line,” Cue said. “We do think free speech is important, but we don’t think white supremacist or hate speech is important speech that ought to be out there. Free speech is important, but that doesn’t mean it’s everything.”
Susan Wojcicki, CEO of YouTube, arrived at the event with some practical ideas for addressing misuse of the platform. During an interview on Tuesday, she said the company would begin adding reality-based information from Wikipedia underneath conspiracy videos on YouTube. She also said the company would limit the amount of disturbing content that moderators have to view to four hours a day.
Even these moves seemed to draw mostly criticism, though. Craig Silverman, BuzzFeed’s media editor, called YouTube’s approach a “nesting doll” strategy: “Whereby platforms offload editorial responsibility in layers until you get to the final doll and it’s either volunteer community editors working for free or low-paid content moderators employed by contractors.”
Few people have watched misinformation change as quickly as David Mikkelson, the proprietor of the internet’s first major fact-checking site, Snopes.com. After his panel, I asked Mikkelson how the rise of social platforms had changed the nature of his work. The main change, he said, has been how much faster propaganda spreads than it did when he started, when hoaxes circulated primarily via email.
“It kind of took weeks for things to go viral — gave us plenty of time to look into them, write them up,” he said. “Now somebody posts something outrageous on Facebook, and 20 minutes later it’s a headline on the New York Post. So obviously we have to be faster at what we do.” But fact-checking requires research, and lying does not, which keeps fact-checkers and the social networks that rely on them at a constant disadvantage. “It’s hard to get a shortcut for that,” Mikkelson told me.
Margaret Gould Stewart, a vice president of product design at Facebook, said during her talk that events of the past two years had caused the company to think more about how its tools can be hijacked by bad actors. “We’ve learned that we need to spend a lot of time thinking about what I call misuse cases — when people take tools meant to be used for good and do bad things with them,” she said. “We need to spend more time thinking about what we should build, even though we might not be required to, and what we shouldn’t build, even if we’re technically allowed to.”
But at a time when some former Facebook employees have expressed remorse for their role in building the social network, Stewart was defiant. “I don’t regret my role in creating these products,” she said. “I spent my whole career building tools [to] access information, to express ourselves creatively, to get and stay connected to the people we care about the most. These are enduring human needs that tech can and should address… I wouldn’t want to go back to a world without these inventions.”
And if there is one thing that hasn’t changed about SXSW, it’s this: a sense that tech will prevail in the end. “It would also be naive to say we can’t do anything about it,” Ev Williams said. “We’re just in the early days of trying to do something about it.”
Correction March 14th, 11:34AM ET: This article originally said Williams had last appeared at SXSW more than a decade ago. In fact, he was last there in 2010.