The battle inside Signal

The fast-growing encrypted messaging app is making itself increasingly vulnerable to abuse. Current and former employees are sounding the alarm.

Illustration by Alex Castro / The Verge

On January 6th, WhatsApp users around the world began seeing a pop-up message notifying them of upcoming changes to the service’s privacy policy. The changes were designed to let businesses send messages to WhatsApp’s 2 billion-plus users and store those conversations, but they came with an ultimatum: agree by February 8th, or you can no longer use the app.

The resulting furor sparked a backlash that led Facebook-owned WhatsApp to delay the policy from taking effect until May. In the meantime, though, tens of millions of users began seeking alternatives to Facebook’s suite of products. Among the biggest beneficiaries has been Signal, the encrypted messaging app whose development is funded by a nonprofit organization. Last month, according to one research firm, the six-year-old app had about 20 million users worldwide. But in a 12-hour period the Sunday after WhatsApp’s privacy policy update began, Signal added another 2 million users, an employee familiar with the matter told me. Days of temporary outages followed.

The pace has hardly relented since. Signal leapt to No. 1 in the app stores of 70 countries, and it continues to rank near the top of most of them, including the United States. While the company won’t confirm the size of its user base, a second employee told me the app has now surpassed 40 million users globally. And while Signal still has a small fraction of the market for mobile messaging — Telegram, another upstart messenger, says it added 90 million active users in January alone — the rapid growth has been a cause for excitement inside the small distributed team that makes the app.

Adding millions of users has served as a vindication for a company that has sought to build a healthier internet by adopting different incentives than most Silicon Valley companies.

“We’re organized as a nonprofit because we feel like the way the internet currently works is insane,” CEO Moxie Marlinspike told me. “And a lot of that insanity, to us, is the result of bad business models that produce bad technology. And they have bad societal outcomes.” Signal’s mission, by contrast, is to promote privacy through end-to-end encryption, without any commercial motive.

But Signal’s rapid growth has also been a cause for concern. In the months leading up to and following the 2020 US presidential election, Signal employees raised questions about the development and addition of new features that they fear will lead the platform to be used in harmful and even dangerous ways. But those warnings have largely gone unheeded, they told me, as the company has pursued a goal to hit 100 million active users and generate enough donations to secure Signal’s long-term future.

Employees worry that, should Signal fail to build policies and enforcement mechanisms to identify and remove bad actors, the fallout could bring more negative attention to encryption technologies from regulators at a time when their existence is threatened around the world.

“The world needs products like Signal — but they also need Signal to be thoughtful,” said Gregg Bernstein, a former user researcher who left the organization this month over his concerns. “It’s not only that Signal doesn’t have these policies in place. But they’ve been resistant to even considering what a policy might look like.”

Interviews with current and former employees, plus leaked screenshots of internal deliberations, paint a portrait of a company that is justly proud of its role in promoting privacy while also willfully dismissing concerns over the potential misuses of its service. Their comments raise the question of whether a company conceived as a rebuke to data-hungry, ad-funded communication tools like Facebook and WhatsApp will really be so different after all.


Like a lot of problems, this one started with an imperative familiar to most businesses: growth.

Encrypted messaging has been a boon to activists, dissidents, journalists, and marginalized groups around the world. Not even Signal itself can see their messages — much less law enforcement or national security agencies. The app saw a surge in usage during last year’s protests for racial justice, even adding a tool to automatically blur faces in photos to help activists more safely share images of the demonstrations. This kind of growth, one that supported progressive causes, was exciting to Signal’s roughly 30-member team.

“That’s the kind of use case that we really want to support,” Marlinspike told me. “People who want more control over their data and how it’s used — and who want to exist outside the gaze of tech companies.”

On October 28th, Signal added group links, a feature that has become increasingly common in messaging apps. With a couple of taps, users can create links that let anyone join a group chat of up to 1,000 people. And because the app uses end-to-end encryption, Signal itself has no record of the group’s title, its members, or the image the group chooses as its avatar. At the same time, the links make it easy for activists to recruit large numbers of people onto Signal at once.

But as the US presidential election grew closer, some Signal employees began raising concerns that group links could be abused. On September 29th, during a debate, President Trump had told the far-right extremist group the Proud Boys to “stand back and stand by.” During an all-hands meeting, an employee asked Marlinspike how the company would respond if a member of the Proud Boys or another extremist organization posted a Signal group chat link publicly in an effort to recruit members and coordinate violence.

“The response was: if and when people start abusing Signal or doing things that we think are terrible, we’ll say something,” said Bernstein, who was in the meeting, conducted over video chat. “But until something is a reality, Moxie’s position is he’s not going to deal with it.”

Bernstein (disclosure: a former colleague of mine at Vox Media) added, “You could see a lot of jaws dropping. That’s not a strategy — that’s just hoping things don’t go bad.”

Marlinspike’s response, he told me in a conversation last week, was rooted in the idea that because Signal employees cannot see the content on their network, the app does not need a robust content policy. As with almost all apps, Signal’s terms of service state that the product cannot be used to violate the law. Beyond that, though, the company has sought to take a hands-off approach to moderation.

“We think a lot on the product side about what it is that we’re building, how it’s used, and the kind of behaviors that we’re trying to incentivize,” Marlinspike told me. “The overriding theme there is that we don’t want to be a media company. We’re not algorithmically amplifying content. We don’t have access to the content. And even within the app, there are not a lot of opportunities for amplification.”

At the same time, employees said, Signal is developing several tools at once that could be ripe for abuse. For years, the company has faced complaints that its requirement that people use real phone numbers to create accounts raises privacy and security concerns. And so Signal has begun working on an alternative: letting people create unique usernames. But usernames (and display names, should the company add those, too) could enable people to impersonate others — a scenario the company has not developed a plan to address, despite completing much of the engineering work necessary for the project to launch.

Signal has also been actively exploring the addition of payments into the app. Internally, this has been presented as a way to help people in developing nations transfer funds more easily. But other messaging apps, including Facebook Messenger and China’s WeChat, have pursued payments as a growth strategy.

An effort from Facebook to develop a cryptocurrency, now known as Diem, has been repeatedly derailed by skeptical regulators.

In the past, Marlinspike has advised MobileCoin, a cryptocurrency built on the Stellar blockchain designed to make payments simple and secure — and, potentially, impossible to trace. “The idea of MobileCoin is to build a system that hides everything from everyone,” Wired wrote of the project in 2017. “These components make MobileCoin more resistant to surveillance, whether it’s coming from a government or a criminal.”

People I spoke with told me they regard the company’s exploration of cryptocurrency as risky since it could invite more bad actors onto the platform and attract regulatory scrutiny from world leaders. 

Marlinspike played down the potential of crypto payments in Signal, saying only that the company had done some “design explorations” around the idea. But significant engineering resources have been devoted to developing MobileCoin integrations in recent quarters, former employees said. 

“If we did decide we wanted to put payments into Signal, we would try to think really carefully about how we did that,” Marlinspike said. “It’s hard to be totally hypothetical.” 


Signal’s growth imperatives are driven in part by its unusual corporate structure. The app is funded by the Signal Foundation, which was created in 2018 with a $50 million loan from WhatsApp co-founder Brian Acton. Signal’s development is supported by that loan, which filings show has grown to more than $100 million, and by donations from its users.

Employees have been told that for Signal to become self-sustaining, it will need to reach 100 million users. At that level, executives expect that donations will cover its costs and support the development of additional products that the company has considered, such as email or file storage.

But messaging is a crowded field, with products from Apple, Facebook, Google, and, more recently, Telegram. Signal’s initial user base of activists and journalists will only get it so far. And so despite its anti-corporate ethos, Signal has set about acquiring users like any other Silicon Valley app: by adding new features over time, starting with those that have proven successful in rival apps.

Those efforts have been led by two people in particular: Marlinspike, a former head of product security at Twitter whose long career in hacking and cryptography was recently profiled in The New Yorker, and Acton, whose title as executive chairman of the Signal Foundation dramatically understates his involvement in the project’s day-to-day operations.

In 2014, Acton and co-founder Jan Koum sold WhatsApp to Facebook for $22 billion, making them both billionaires. Acton left the company in 2017, later telling Forbes that his departure was prompted by Facebook’s plans to introduce targeted advertising and commercial messaging into WhatsApp. “I sold my users’ privacy to a larger benefit,” Acton told Forbes. “I made a choice and a compromise. And I live with that every day.”

A few months later, at the height of the Cambridge Analytica data privacy scandal, Acton caused a stir when he tweeted: “It is time. #deletefacebook.” 

Since then, he has increasingly devoted his time to building Signal. He participates in all-hands meetings and helps to set the overall direction of the company, employees said. He interviews engineers, screening them for their ideological commitment to encryption technology. He writes code and helps to solve engineering challenges.

While working at Facebook, Acton could be dismissive of the idea that technology companies should intervene to prevent all forms of abuse. “There is no morality attached to technology, it’s people that attach morality to technology,” Acton told Steven Levy for his book Facebook: The Inside Story. Acton continued:

“It’s not up to technologists to be the ones to render judgment. I don’t like being a nanny company. Insofar as people use a product in India or Myanmar or anywhere for hate crimes or terrorism or anything else, let’s stop looking at the technology and start asking questions about the people.”

Asked about those comments, Signal told me that Acton does not have any role in setting policy for the company.

In recent interviews, Acton has been magnanimous toward his former colleagues, telling TechCrunch that he expects most people will continue to use WhatsApp in addition to Signal. But it’s hard not to see in Acton’s recent work the outlines of a redemption narrative — a founder who regrets selling his old company deciding to try again, but with a twist. Or maybe it’s a revenge narrative: I detected more than a little disdain in Acton’s voice when he told TechCrunch, “I have no desire to do all the things that WhatsApp does.”

Marlinspike told me that Acton’s increasingly heavy involvement in day-to-day development was a necessity given a series of recent departures at Signal, suggesting the WhatsApp co-founder might pull back once it was more fully staffed.

“Recently this has been an all-hands-on-deck kind of thing,” Marlinspike said. “He’s been great jumping in and helping where we need help, and helping us scale.” 

Still, Acton’s growing involvement could help explain the company’s general reticence toward implementing content policies. WhatsApp was not a “nanny company,” and it appears that Signal won’t be one, either.

Whatever the case, Acton is clearly proud of Signal’s recent growth. “It was a slow burn for three years and then a huge explosion,” he told TechCrunch this month. “Now the rocket is going.”


Some rockets make it into orbit. Others disintegrate in the atmosphere. Signal employees I spoke to worry that the app’s appetite for growth, coupled with inattention to potential misuses of the product, threatens its long-term future. (Of course, not growing would threaten its long-term future in other ways.)

It’s often said that social networks’ more disturbing consequences are a result of their business model. First, they take venture capital, pushing them to quickly grow as big as possible. Then, they adopt ad-based business models that reward users who spread misinformation, harass others, and otherwise sow chaos.

Signal’s story illustrates how simply changing an organization’s business model does not eliminate the potential for platform abuse. Wherever there are incentives to grow, and grow quickly, dangers will accumulate, no matter who is paying the engineers’ salaries.

Signal employees I spoke to said they are confident that the app has not become a primary organizing tool for extremists — though, given its encrypted nature, it’s difficult to know for sure. So far, there are no known cases of dangerous organizations posting Signal group links on Twitter or other public spaces. (One employee pointed out that fascists are often quite public about their activities, as the recent insurrection in broad daylight at the Capitol showed.) Usernames and cryptocurrency payments are unlikely to cause major problems for the organization unless and until they launch.

At the same time, my sources expressed concern that despite the clear potential for abuse, Signal seemed content to make few efforts to mitigate any harms before they materialize. 

“The thing about software is that you never can fully anticipate everything,” Marlinspike told me. “We just have to be willing to iterate.”

On one hand, all software requires iteration. On the other hand, a failure to plan for abuse scenarios has been linked to calamities around the world. (Facebook’s links to genocide in Myanmar, a country in which it originally had no moderators who understood the language, are the canonical example.) And that failure makes Signal’s potential path look more similar to Facebook’s than its creators are perhaps prepared to admit.

In our conversation, Marlinspike committed to hiring an employee to work on issues related to policy and trust and safety. And he said Signal would change or even eliminate group links from the product if they were abused on a wide scale.

Still, Marlinspike said, it was important to him that Signal not become neutered in the pursuit of a false neutrality between good and bad actors. Marginalized groups depend on secure private messaging to safely conduct everything from basic day-to-day communication to organized activism, he told me. Signal exists to improve that experience and make it accessible to more people, even if bad actors might also find it useful. 

“I want us as an organization to be really careful about doing things that make Signal less effective for those sort of bad actors if it would also make Signal less effective for the types of actors that we want to support and encourage,” he said. “Because I think that the latter have an outsized risk profile. There’s an asymmetry there, where it could end up affecting them more dramatically.”

Bernstein, though, saw it differently.

“I think that’s a copout,” he said. “Nobody is saying to change Signal fundamentally. There are little things he could do to stop Signal from becoming a tool for tragic events, while still protecting the integrity of the product for the people who need it the most.”

Correction: This article originally stated Marlinspike is on the board of MobileCoin. While he has advised MobileCoin, he is not on the board.


This column was co-published with Platformer, a daily newsletter about Big Tech and democracy.