
What Facebook doesn’t understand about the Facebook walkout


The company is treating a moral issue as if it’s a legal one


Illustration by Alex Castro / The Verge

I. The CEO

On Friday afternoon, Facebook made one of the most controversial content moderation decisions in its history. After President Trump posted to Facebook the same remarks that Twitter had placed behind a warning for “glorifying violence,” Mark Zuckerberg said that the company would allow them to stand.

“I know many people are upset that we’ve left the President’s posts up,” Zuckerberg said in a Facebook post, “but our position is that we should enable as much expression as possible unless it will cause imminent risk of specific harms or dangers spelled out in clear policies.”

“When the looting starts, the shooting starts,” Trump had tweeted — quoting a former Miami police chief who, in 1967, called for a violent crackdown on the city’s black community. And just as the president suggested, a long weekend of violence followed in the United States, with police assaulting protesters and bystanders across the country.

Zuckerberg said that Facebook left the post up for two reasons: one, that “people need to know if the government is planning to deploy force.” And two, that Trump had sort of (maybe?) walked back his original post in a later one, “saying that the original post was warning about the possibility that looting could lead to violence.” So whether or not Trump meant to incite violence with his words, Zuckerberg argued, Facebook had good reason to let the post stand.

Shortly after sharing the post with the world, Zuckerberg held a meeting with employees to elaborate on his point of view. In audio of the meeting that I obtained, Zuckerberg said that he had agonized over the decision. “How to handle this post from the president has been very tough,” said Zuckerberg, who was joined in the discussion by his head of policy management, Monika Bickert. “It’s been something that I’ve been struggling with basically all day, ever since I woke up. ... This has been personally pretty wrenching for me.”

Zuckerberg reiterated his unhappiness with Trump’s remarks. “My first reaction ... was just disgust,” he said. “This is not how I think we want our leaders to show up during this time. This is a moment that calls for unity and calmness and empathy for people who are struggling.”

Ultimately, he and Bickert said, executives concluded that Trump’s remarks didn’t violate their existing policies. But he said they would re-examine their policies around politicians discussing the use of state force on Facebook, a process he said would likely take several weeks.

“There is a real question coming out of this, which is whether we want to evolve our policy around the discussion of state use of force,” he told employees Friday. “Over the coming days, as the National Guard is now deployed, probably the largest one that I would worry about would be excessive use of police or military force. I think there’s a good argument that there should be more bounds around the discussion around that.” Zuckerberg did not elaborate on what more “bounds” would mean in this case, or whether he thought the policy should change to disallow posts like Trump’s.

In response to an employee question, Zuckerberg also said he disagreed with Twitter’s approach of placing violating tweets behind a warning. “If you really believe that a post is going to cause people to go do real-world violence, then that’s not the type of thing that I think we should have up even behind a warning,” he said. “Some people might be comforted that Twitter took a step, even if it didn’t go all the way. But I don’t personally agree with that step.”

Zuckerberg’s message to employees Friday was that even if Facebook hadn’t removed this Trump post, it was prepared to do so in the future if the president violated a company policy. That satisfied some employees, but to others it smacked of appeasement. On Thursday, their anger bubbled up in a series of internal threads, as I reported at The Verge.

And then, over the weekend, the long-standing norm that Facebook employees never criticize their employer in public seemed to shatter, tweet by tweet.

On Monday, they mounted the most significant collective worker action in the company’s history. While it’s difficult to measure the number of people who participated in a virtual walkout, an internal group devoted to the effort had about 400 members, sources said.

II. The walkout

“I’m a FB employee that completely disagrees with Mark’s decision to do nothing about Trump’s recent posts, which clearly incite violence,” tweeted Jason Stirman, who works on research and development, on Saturday. “I’m not alone inside of FB. There isn’t a neutral position on racism.”

Jason Toff, a former employee of Twitter and Google who now works on experimental apps at Facebook, echoed those sentiments on Sunday. “I work at Facebook and I am not proud of how we’re showing up,” Toff tweeted. “The majority of coworkers I’ve spoken to feel the same way. We are making our voice heard.”

Within hours, there were more than a dozen such tweets from employees working across the company, all expressing disappointment with their employer’s decision. And on Monday, dozens of employees staged a virtual walkout, making themselves unavailable for the day and joining in protests. The New York Times reported that employees are working on a list of demands, and that some senior employees have threatened to resign if Zuckerberg doesn’t reverse his decision.

“As allies we must stand in the way of danger, not behind,” tweeted Sara Zhang, a product designer at Facebook. “I will be participating in today’s virtual walkout in solidarity with the black community inside and outside FB. #BlackLivesMatter”

By this point we have seen our share of worker actions at big tech companies. The Google walkout over sexual harassment kicked open the doors in 2018, and has been followed by high-profile protests at Amazon, Microsoft, and Salesforce, among others. And hundreds of Facebook employees signed an open letter to Zuckerberg in October about his decision to exempt political ads from fact-checking.

But notable as that letter was, it still adopted the form that dissent has almost always taken at Facebook: vigorous internal debate. (One source told me the internal furor over Joel Kaplan’s public support of then-Supreme Court nominee Brett Kavanaugh during his confirmation hearings had been markedly more intense.) What’s different about Monday’s walkout is that the protests were public first — and aired on a rival social network, to boot. For Facebook workers, the choice to discuss their concerns on Twitter was remarkably effective, for two reasons. One, Twitter is where journalists live, so the posts were guaranteed to generate coverage. Two, sentiment about Facebook on Twitter is generally hostile, so current employees’ criticisms of the company got massive distribution through retweets.

The workers’ comments were less sweeping than the criticism that some former employees, co-founders, and top executives of Facebook have leveled over the years. None of these employees has yet quit, nor have they suggested, as WhatsApp co-founder Brian Acton once did, that people “delete Facebook.” But they shared a sense of shame about their employer that remains extraordinary among tech workers, even at a time when worker actions are becoming more common.

“Facebook’s inaction in taking down Trump’s post inciting violence makes me ashamed to work here,” tweeted Lauren Tan, an engineer. “I absolutely disagree with it. I enjoy the technical parts of my job and working alongside smart/kind people, but this isn’t right. Silence is complicity.”

Another inspired aspect of the workers’ protest was that executives had to sit back and accept it, at least in their public statements. You can’t bend over backwards to allow the president’s posts about shooting up crowds and then tell employees they can’t discuss their feelings about it. And so the official word from Facebook on all the controversy was that they should go for it. “We recognize the pain many of our people are feeling right now, especially our Black community,” the company told Bloomberg. “We encourage employees to speak openly when they disagree with leadership.”

Much of employees’ frustration appears to be rooted in the fear that there is no line Trump could cross that would lead Facebook to enforce its policies. Zuckerberg and Bickert spent much of the all-hands meeting on Friday pushing back on that idea — fairly, I think. It was barely two months ago that the company removed a post by the president of Brazil, Jair Bolsonaro, for promoting a phony coronavirus cure. You can argue that the company is more sensitive to pressure from conservatives in the United States, and a lot of good reporting has borne that out in the past few years. But the idea that there is no line Trump could cross on Facebook doesn’t strike me as plausible.

Of course, we won’t know for sure until Facebook actually does take action against Trump. And in the meantime, a large number of employees have signaled that for them, that red line has already been crossed. For Zuckerberg and his policy team, Trump is a legalistic problem — a question of how certain words and phrases do or do not comport with the standards they have written. But for the workers speaking out, Trump is a moral problem — a danger to their friends, their families, their communities, and themselves.

Facebook’s scale depends on courting Republicans and Democrats equally — making regular concessions to both to ensure that the platform is as large as it can be. Zuckerberg has sought to draw a distinction between his role as CEO and his own feelings — arguing for Trump’s right to free expression at work while donating $10 million to groups working on racial justice in his personal time.

But since its founding, Facebook has been dedicated to the idea that in this life, you only get to have one real identity. When employees logged off Monday, the company began to see the limits of having it both ways.

What’s next

Senior black executives were meeting with Zuckerberg on Monday to discuss their concerns. Facebook will hold an all-hands meeting with employees on Tuesday, where Zuckerberg is expected to take questions. Walkout organizers are developing a list of demands for the company.

The Ratio

Today in news that could affect public perception of the big tech platforms.

🔼 Trending up: YouTube is donating $1 million to the Center for Policing Equity, to demonstrate “solidarity against racism and violence.” The company tweeted that it was pledging the funds “in support of efforts to address social injustice.” (Kim Lyons / The Verge)

🔽 Trending down: A technical glitch on TikTok made it look like posts using the hashtags #BlackLivesMatter and #GeorgeFloyd were receiving zero views. TikTok apologized for the error, saying “we understand that many assumed this bug to be an intentional act to suppress the experiences and invalidate the emotions felt by the Black community.” (See also the Digital Forensic Research Lab on this.) TikTok says it will do more in the future to promote black creators.

Virus tracker

Total cases in the US: More than 1,807,200

Total deaths in the US: At least 104,700

Reported cases in California: 113,852

Total test results (positive and negative) in California: 1,944,848

Reported cases in New York: 376,520

Total test results (positive and negative) in New York: 2,063,825

Reported cases in New Jersey: 160,445

Total test results (positive and negative) in New Jersey: 746,145

Reported cases in Illinois: 120,588

Total test results (positive and negative) in Illinois: 898,259

Data from The New York Times. Test data from The COVID Tracking Project.

Governing

Misinformation about the protests in Washington, DC related to the police killing of George Floyd is surging across Twitter. One rumor claimed authorities had somehow blocked protesters from communicating with their smartphones in order to crack down on the unrest. Here are Craig Timberg, Elizabeth Dwoskin and Fenit Nirappil from The Washington Post:

Protests in downtown Washington and near the White House were widespread Sunday night and into Monday morning. What started as largely peaceful protests over last week’s police killing of George Floyd in Minneapolis deteriorated after dark in Washington, with rioters smashing windows, starting fires and overturning vehicles — despite an 11 p.m. curfew.

But the degree of mayhem described by tweets using #DCblackout went far beyond reality. Alarming text was interspersed with shaky videos of confrontations between police and protesters, though it wasn’t clear how many of the images were from Washington, as opposed to other U.S. cities facing unrest.

Social media is being used to galvanize protesters. But misinformation about the unrest is also going viral on the platforms — and not just in DC. (Sarah E. Needleman and Sebastian Herrera / The Wall Street Journal)

Authorities and law enforcement officials are helping to push a narrative that outside groups were responsible for inciting violent confrontations at the protests over the weekend. But there’s little evidence for those claims. (Brandy Zadrozny and Ben Collins / NBC)

Over the weekend, people across the US captured what may be the most comprehensive live picture of police brutality ever. This post collects images and videos of those scenes, which were shared widely across social sites. (T.C. Sottek / The Verge)

Police violence will make it harder to fight COVID-19. In the aftermath of the police response to protests, vulnerable communities may be even less likely to trust and cooperate with health officials. That could make it harder to control another wave of illness. (Nicole Wetsman / The Verge)

The Dallas Police Department asked people to send in “video of illegal activity” from the Black Lives Matter protests in the city. Instead, it received a flood of pictures and videos of K-pop artists. (Caroline Haskins / BuzzFeed)

Apple CEO Tim Cook addressed the killing of George Floyd in a letter to employees, saying the “painful past is still present today.” He also said the company would be donating to the Equal Justice Initiative, a non-profit focusing on racial injustice. (Mark Gurman / Bloomberg)

Google CEO Sundar Pichai emailed employees on Friday about the growing protests and unrest in the US. The tech exec did not commit to supporting any specific government policy changes in response to the protests, unlike some other technology industry leaders. (Rob Price / Business Insider)

Snap CEO Evan Spiegel called for the creation of an American commission to address racial injustice, as well as for comprehensive tax reform. In a lengthy note to employees, he said he was “heartbroken and enraged” by racism in America. Give this one a read — it’s by far the most thoughtful of the corporate responses to this weekend’s violence that I’ve seen.

Trump’s executive order on social media companies could backfire. Without certain liability protections, companies like Twitter would have to be more aggressive about policing messages that press the boundaries — including the president’s. (Peter Baker and Daisuke Wakabayashi / The New York Times)

Also: Trump’s order could hurt online speech by pressuring social media platforms to give his content preferential treatment. (Charles Duan and Jeffrey Westling / Lawfare)

President Trump’s decision to go after Twitter for fact-checking his tweets is part of a long tradition upheld by aggrieved internet trolls: a power user with a passionate following lashing out against the moderators of his favorite internet services. The best description of the dynamic between Trump, Twitter, and Facebook that I’ve read to date. (Kevin Roose / The New York Times)

Senate lawmakers are unveiling a bipartisan bill to regulate contact-tracing and exposure-notification apps. Their goal is to ensure that apps meant to combat the coronavirus don’t come at the expense of users’ privacy. (Tony Romm / The Washington Post)

State-based contact tracing apps could be a disaster. With no national plan for these apps, security and interoperability issues loom large. (Andy Greenberg / Wired)

YouTube said it made a mistake in deleting videos about the controversial drug hydroxychloroquine posted by a popular doctor. The content appears to have been collateral damage in the company’s fight against COVID-19 misinformation. (Mark Bergen / Bloomberg)

Industry

Google rescinded offers to more than 2,000 contractors globally due to an advertising slump created by the coronavirus crisis. Those affected had already signed offers to work as contract or temp workers. Daisuke Wakabayashi at The New York Times has the story:

Many of the contract and temp candidates who had agreed to work at Google before the pandemic took hold in the United States were let go without any severance or financial compensation. This came after weeks of uncertainty, as Google repeatedly postponed their start dates, during which time they were not paid by Google or the staffing agencies.

Some of the would-be contractors left stable, full-time jobs once they received an employment offer at Google and are now searching for work in a difficult labor market. Some, who are Americans, said the rescinded offers have complicated and, in some cases, delayed their ability to receive unemployment benefits because they left their last jobs voluntarily, according to several of the workers facing this quandary.

Coronavirus mutual aid groups are organizing on Slack. They’re distributing groceries and spare air conditioners using tools designed to turn labor into money as efficiently as possible. (Kaitlyn Tiffany / The Atlantic)

Inside Amazon’s hierarchical company culture. This investigation is part of a series called Tech Nations, which examines the world’s largest technology companies as if they were countries, not corporations. Featuring one of the first interviews with recent Amazon apostate Tim Bray. (Alexi Mostrous and James Ball / Tortoise Media)

Zoom plans to roll out strong encryption for its paying customers — but not for those with free accounts. (Joseph Menn / Reuters)

Things to do

Stuff to occupy you online during the quarantine.

Donate to a bail fund. This comprehensive guide from Rolling Stone covers how to support state and national efforts, including the George Floyd Memorial Fund in Minnesota. Today and every day, black lives matter.

And there are some more good ideas on how to help from The Verge here.

And finally...

Talk to us

Send us tips, comments, questions, and audio of the Facebook all-hands on Tuesday: casey@theverge.com and zoe@theverge.com.