
The creators of the buzzy audio app Clubhouse made a depressingly common mistake


The $100 million startup is learning the hard way that content moderation needs to come first



Yesterday we talked about how a recent social media conflict between journalists and the tech world might be better framed as a conflict between managers and their employees. Today I want to look at that conflict from another angle — how it played out on the buzzy, audio-only, invite-only social network Clubhouse. Like many social startups before it, the company neglected to develop or enforce strong community guidelines before launch — and the oversight could derail a company valued at $100 million while still in private beta.

Let me acknowledge up front that Clubhouse is barely a few months old, and currently has just two full-time employees — its founders, Paul Davison and Rohan Seth. I’ve known Davison for about seven years, and have always found him fun to talk to. He’s charming, he’s had multiple wild visions about what the future can look like, and he has repeatedly convinced venture capitalists to part with millions of dollars so that he can build it.

But one of the core principles of The Interface says this: “Most tech CEOs are intelligent, kind, hard-working people who want to make the world a better place, and this is largely beside the point.” And so this is not a column about the co-founder’s intentions, which I assume to be good. Instead, it’s about the way Davison has built products to date — and the gap between that style and the way I think modern social networks ought to be built.

Let’s start at Pinterest. In the summer of 2016, that company had hired the team behind Highlight, a boldly invasive app that broadcast your name, photo and other information to other users in hopes of introducing you to strangers. Highlight was led by Davison, a former Googler who had an expansive vision for transforming what previous generations would have regarded as privacy invasions into products. “If you don’t push things a little bit, you miss opportunities,” Davison told me in 2013. “Fifteen years ago, it would be crazy to post your resume online. This is new territory we’re figuring out.”

Highlight never got traction. Neither did the company’s next boundary-pushing effort, Shorts, which invited you to share your camera roll with friends and friends of friends. “If you look at the most interesting and loved and useful social products over the last 20 years, you’ll find that lots of them have pushed us to share a little more openly than perhaps we felt comfortable doing,” Davison told me about that one.

Once he was acqui-hired by Pinterest, Davison took on a refreshingly mundane challenge: taking over the development of “tried it” pins, a feature that lets users post photos of the activities they’ve completed related to Pinterest posts. If you find a recipe for a cake on Pinterest and make it, for example, the feature lets you post your version in a thread attached to the original pin. The feature was in testing the summer that Davison joined, and he oversaw its development until it was released in November.

When it was released, though, there was a problem: the feature was not connected to the systems that screen content for pornography, harassment, and other violations of Pinterest’s content policies. As a result, Pinterest saw a surge in pornographic content uploaded to the service, two former employees told me. “About one out of every dozen photos uploaded was a penis for a good while,” one told me.

Pinterest told me the problem was fixed shortly after launch. Through a spokesperson, Davison declined to comment.

But to one former employee I spoke with, the lapse was emblematic of an overly laissez-faire attitude to content moderation on Davison’s part. “His entire perspective was always to push for, how do we get users to expose more data in the product?” the former employee said. “User trust and safety was completely an afterthought.”


All of that feels like necessary context for understanding how Clubhouse found itself at the center of a now much-discussed conflict between New York Times reporter (and friend of The Interface) Taylor Lorenz and the investor Balaji Srinivasan. When Lorenz joined a conversation about herself in the app — one in which she would eventually be accused of playing “the woman card” in complaining about harassment she was receiving on Twitter and elsewhere — she could not have reported it even if she wanted to.

The reason is that Clubhouse does not allow users to report harassment or other violations of its terms of service through the app. And Lorenz, who wrote an enthusiastic early profile of the app in May, told me she has been besieged by Clubhouse trolls. The app offers no ability to block users, and so some users are changing their profile pictures to Lorenz’s antagonists to taunt her while she uses the app. Screenshots of beta tester forums that I obtained show users begging Clubhouse’s founders to, among other things, write comprehensive community guidelines. (Its published terms of service are largely just legal boilerplate.)

“Writing up community rules to include expected behaviors, actions and giving people a place to appeal is super important,” one woman wrote in the private user forums. “It’s just as important to enforce these actions including timeouts / re-education and suspension when warranted. I don’t think Taylor’s incident is going to be the last, unfortunately.”

Davison called Lorenz to discuss the harassment she had faced, she told me, and asked her for suggestions about what Clubhouse could do. She offered several, including banning users who harass others; none have so far been implemented. Lorenz told me she felt disappointed when Davison went on to like a tweet that read, “Honestly in this whole Taylor vs Balaji S., Clubhouse won.”

During my reporting, I’ve also heard from Clubhouse users who have reminded me, in exasperated fashion, that the app is currently in a closed beta. Traditionally, the invitation-only stage of a social app has been used to build the exact systems these users are now clamoring for. A two-person startup that goes from idea to a $100 million valuation within a few weeks has countless problems to worry about, Clubhouse supporters tell me. The founders also give out their email addresses to users and respond to many complaints personally.

At the same time, we’ve seen enough social networks come and go that we now understand the consequences of making content moderation an afterthought. Ask Reddit, which just a few weeks ago thought to explicitly ban hate speech — years after nurturing communities of racists, nonconsensual porn distributors, and other blights on the internet.

And for Clubhouse, moderation issues promise to be particularly difficult — if the app is ever to escape closed beta successfully, they will require sustained attention and likely some product innovation. Tatiana Estévez, who worked on moderation efforts at the question-and-answer site Quora, outlined Clubhouse’s challenges in a Twitter thread.

Audio is fast and fluid; will Clubhouse record it so that moderators can review bad interactions later? In an ephemeral medium, how will Clubhouse determine whether users have a bad pattern of behavior? And can Clubhouse do anything to bring balance to the age-old problem of men interrupting women?

“Is this impossible? Probably not,” Estévez wrote. “But in my experience, moderation and culture have to be a huge priority for both the founding team as well as for the community as a whole.”

Moderation does not appear to have been a huge priority at Highlight, at Shorts, or on the team that built the “tried it” feature at Pinterest. If Clubhouse is to live up to the potential its investors clearly see in it, its builders should consider making it one, and soon.


The civil rights leaders behind the major advertising boycott of Facebook say they are still not convinced that the company is taking enough action against hate speech and disinformation. The news comes after a meeting between the leaders, Mark Zuckerberg and Sheryl Sandberg on Tuesday. Russell Brandom has the story at The Verge:

After months of escalating pressure, leaders from the #StopHateForProfit boycott campaign met with Facebook CEO Mark Zuckerberg and other Facebook executives today. But in a call with reporters after the meeting, organizers from Color of Change, Free Press, the NAACP, and the Anti-Defamation League described the meeting as frustrating.

“The meeting that we just left was a disappointment,” said Color of Change president Rashad Robinson. “At this point, we were expecting a very clear answer to the demands we are making, and we did not get that.”

Scientists and other employees of the Chan Zuckerberg Initiative are pushing Mark Zuckerberg to create stronger policies around misinformation on Facebook. Their concerns echo employee discontent at the social platform over what some see as Zuckerberg’s inaction on hate speech. (Kurt Wagner and Sophie Alexander / Bloomberg)

A loose network of Facebook groups that organized protests over stay-at-home orders in April has pivoted to a variety of new targets. Their latest includes Black Lives Matter and the nationwide protests of racial injustice. (Amanda Seitz / Associated Press)

A climate scientist says Facebook is restricting her ability to share research and fact-check posts containing climate misinformation. The company also recently overruled a fact-check from a group of climate scientists — a move that concerned researchers. (Scott Waldman / E&E News)

Inside the difficulties of policing misinformation on WhatsApp. The attributes that make the app appealing — end-to-end encryption and private group chats — also make it hard to moderate. (William Davies / The Guardian)


Meet Mmhmm, a virtual camera that turns your boring Zoom call into a Weekend Update-style TV show. It can be used with Zoom, Google Meet, YouTube, and other video streaming services. This is the first piece of social tech I’ve seen built since the COVID-19 pandemic began that feels made for our times. I wrote about it at The Verge:

Mmhmm — “it’s important to have a name you can say while eating,” Libin jokes — is a virtual camera that can be used with Zoom, Google Meet, YouTube, and other video streaming services. Turn it on, and the app transforms your room into a virtual stage. Like other videoconferencing tools, Mmhmm offers a variety of still and animated virtual backgrounds to enliven your conversations.

But that’s just the start: the real power of Mmhmm comes in the way it lets you easily manipulate slides, backgrounds, and your own image — either for fun or for business reasons. With a simple gesture on a trackpad, you can move your face around the screen, shrink or enlarge your image, or disappear completely. (You can also turn a grainy, opaque blue in a touch modeled after Jedi holograms.) You can post slides that appear over your shoulder and advance them with a tap. And you can team up with another Mmhmm user to create a collaborative presentation, with each of you able to manipulate images on the screen and advance the show.

Here are some features Twitter could design to foster healthier conversations on the platform, and deescalate conflict when it arises. They’re less complicated than an edit button, and probably more effective. (Nick Punt)

Major brands, including Amazon and Nvidia, are getting caught up in the #MeToo movement against sexual harassment and assault that’s sweeping through video-game streaming. The brands have sponsorships with high-profile players and personalities. (Olga Kharif and Ian King / Bloomberg)

Voice, a crypto-based social media platform, launched over the weekend. The platform rewards users with Voice tokens for posting quality content, to incentivize good contributions. (Greg Thomson / Decrypt)

Inside The Last Light, an ambitious project from a now-defunct division inside Magic Leap Studios. After SXSW was canceled due to the coronavirus pandemic, its future is uncertain. (Adi Robertson / The Verge)

Magic Leap does have a new CEO, though. It’s Peggy Johnson, who comes to the company from Microsoft. (Adi Robertson / The Verge)

The information security community reacted angrily to calls to abandon the use of the terms “black hat” and “white hat,” which are used to differentiate between criminal and non-criminal hackers. Many said the terms have nothing to do with racial stereotyping. (Catalin Cimpanu / ZDNet)

And finally...

Talk to us

Send us tips, comments, questions, and Clubhouse community guidelines.