For years, Mark Zuckerberg has faced criticism that Facebook is bad for democracy.

Employees want him to take a harder line against the Trump Administration. His user base wants him to do the opposite.

In a summer’s worth of leaked audio recordings obtained by The Verge, you can hear Facebook's CEO trying to hold the center.

MARK IN THE MIDDLE

By Casey Newton | Sep 23, 2020, 10:00am EDT

Graphic by William Joel / Photo by Chip Somodevilla/Getty Images

In 2020, Facebook would be roiled by a global pandemic, internal protests over racial injustice, a deeply polarizing election, and the ongoing threat of multiple state and federal investigations into antitrust and privacy. But on the morning of July 16th, Mark Zuckerberg found his workforce asking for something else: their missing office snacks.

A major sell to candidates is our office perks include free food, read the question, which had ranked near the top of questions asked that week in an internal poll. And now, with work from home, we’ve lost a huge financial part of our package. What is the plan on this?

There was not a plan. After the pandemic led the company to shutter its offices, Facebook had given its employees $1,000 bonuses and said it would give them all top marks on their first-half performance evaluations, no matter how they had actually performed. It also seized a rare opportunity to reverse declining public opinion about the company, rapidly spinning up ways to help with COVID-19 relief efforts: a $100 million grant program for small businesses and an initiative to help researchers track the spread of symptoms, among other efforts.

But three months in, Facebook had not yet explained how it might re-create for homebound workers the sight of refrigerated cases stocked with free Hint water, and cans of La Croix or baskets overflowing with energy bars and fruit.

“I’m not sure if I’m missing something from this question,” Zuckerberg responded, in polite disbelief, “but I certainly haven’t seen any data that suggests that free food is anywhere near the list of primary reasons that people come to work at this company. I hope it’s not. I hope if you’re watching this, and I’d imagine that you’re here for some combination of reasons around the mission of the company, the impact that we can have in the world, trying to make sure that that’s as positive as possible. ... Those are typically the things that come up, not free food.”

To the outside world, depending on your point of view, Facebook is a hugely popular social network, a dangerous incubator of right-wing conspiracy theories and hoaxes, or a censorious liberal arm of the Democratic Party. But as that July meeting revealed, in some ways, Facebook is a tech company like any other. Its more than 50,000 employees care about fighting misinformation and protecting against election interference, sure. But also — what’s going on with the snacks?

At another company, the CEO might have ignored such a question from his workforce or declined to take it at all. But almost since the founding of Facebook, Zuckerberg has invited employees to submit questions each week and live-streamed answers to the 10 or so that get the most votes as part of a weekly all-hands meeting. The practice was borrowed from Google, where co-founders Larry Page and Sergey Brin instituted the tradition at the Friday afternoon town hall meetings known as TGIF.

Between May and August, The Verge obtained from Facebook employees 16 audio recordings, along with dozens of internal posts and screenshots from meetings and groups at the company. The recordings include the company’s weekly Q&As, “FYI Live” sessions in which top executives discussed a civil rights audit and previewed the summer’s congressional antitrust hearing, and talks in which executives highlighted the work their teams are doing.

“As we approach one of the most consequential elections in recent history, our commitment to free expression means people on all sides of the political spectrum have strong views about our content decisions — including employees,” Liz Bourgeois, a Facebook spokeswoman, said in a statement. “We’re proud of the open culture that we’ve built where our teams can express themselves openly and hear from Mark at Q&A each week.”

From Zuckerberg’s weekly addresses to Sheryl Sandberg’s annual Q&A session with interns, the recordings capture a company trying to wrap its mind around itself. Shaken by internal criticism, challenged by the largest advertiser boycott in company history, and threatened by elected officials around the world, Facebook left the summer with its image bruised, a fact Zuckerberg acknowledged in a July 31st Q&A.

“We are certainly exiting the first half of the year with our brand in a tougher place than when we started the crisis,” he said.

At companies where total founder control of the board makes the business essentially a monarchy, a weekly Q&A can give the workplace a democratic sheen. Zuckerberg grants workers a regular opportunity to raise concerns with him directly and, in return, gets a chance to sell employees on the idea that, despite the heavy criticism they receive externally, their work is making a positive impact on the world. Increasingly, he also has to remind them that the views of the company’s liberal-leaning workforce can be out of step with the more conservative population that Facebook serves.

With tens of thousands of employees and more than 3 billion constituents worldwide, Zuckerberg now finds himself pulled in all directions. Inside the company, his weekly Q&As now regularly spill over with outrage and dissent. Throughout the summer, Zuckerberg faced increasingly pointed questions about the company’s friendly relationship with President Donald Trump; the influence of its conservative head of policy, Joel Kaplan; and the rise of white supremacist organizations on the platform.

Google ended its weekly town halls in 2019 after a series of leaks, but at Facebook they have continued, even as the pace of leaks has accelerated, revealing deep fault lines throughout the company. Facebook was once renowned for the loyalty employees showed its executive team, but leaks from internal meetings began appearing with some regularity in The New York Times, BuzzFeed, The Information, and elsewhere.

"Can we build a quarantined Facebook city?"

The murders by police of George Floyd, Breonna Taylor, and other Black Americans earlier this year, along with a resurgent Black Lives Matter movement that swept the country in June, galvanized Facebook employees who had begun to question how their own work might contribute to racial inequality. After Zuckerberg decided to let stand a controversial post in which Trump threatened protesters with shooting, they led the first virtual walkout in the company’s history.

For the rank-and-file employees who participated, the walkout marked an unusually public reckoning with Facebook’s power and responsibility. But while the comments of Facebook’s angriest employees sometimes made their way to the press, the issues raised internally were a jumble: criticism of the company’s content policies and how they are enforced, concerns about rising competitors, and a flood of questions about the company’s shift to remote work and how it would affect workers’ careers.

Still, other employees responded to months of conflict in isolation by asking whether they could all run away — from the pandemic, from everything — together.

Can we build a quarantined Facebook city? an employee had asked ahead of the July 31st Q&A. Like, buying an island and all of us working there?

Zuckerberg read the question out loud, laughing gently.

“Gosh, I don’t think that that would be good long-term,” he responded. “I think it’s good to maintain connection to the rest of society.”

June

The post that turned Facebook upside-down in 2020 began on Twitter.

It was May 29th, and as protests over George Floyd’s murder swept Minneapolis, Trump posted a tweet that was cross-posted to his Facebook page. “Just spoke to Governor Tim Walz and told him that the Military is with him all the way,” Trump wrote. “Any difficulty and we will assume control but, when the looting starts, the shooting starts. Thank you!”

After a day of agonized internal debate, Zuckerberg made the final call to let the looting post stand, arguing in a public statement that citizens had a right to know if their own country planned to take military action against them. Trump cheered the decision.

For employees who had believed in Facebook’s democratizing potential and its mission to make the world “more open and connected,” the move felt like a betrayal. It’s honestly really hard for me to take seriously the words of support from our leadership this morning if we allow content like this on our platform, one wrote in an internal post obtained by The Verge. Whatever we are getting from not acting on this, is it worth allowing clear, violent threats against Black protesters?

Protesters march on Hiawatha Avenue while decrying the killing of George Floyd on May 26, 2020 in Minneapolis, Minnesota.
Photo by Stephen Maturen/Getty Images

Zuckerberg addressed employees on the Friday that he announced his decision, telling them he had been disgusted by the president’s remarks. But his role as CEO of Facebook requires him to remain nonpartisan in the way he enforces the company’s policies, he said. And it wasn’t as if the company would let Trump get away with everything — three weeks later, Facebook acted quickly to remove 88 Trump campaign ads for using Nazi imagery. But in the moment, some Facebook employees saw Zuckerberg’s decision as a failure to consider the moral dimension of Trump’s saber-rattling, and the issue has been raised in internal forums and Q&As with Zuckerberg ever since.

In its adherence to a strict if ever-changing set of nonpartisan community standards, Facebook had adopted what New York University professor and press critic Jay Rosen has called, in the journalistic context, “the view from nowhere” — a commitment to the belief that all politics are being practiced in good faith and are thus deserving of equal treatment. Sensitive to criticism that its policies are perceived to favor one political party over the other — and fearful of legislation that might arise from it doing so — the company strives to project an air of total impartiality.

“What we do is really try to not take a point of view,” the company’s chief operating officer, Sheryl Sandberg, told interns during a Q&A on July 7th. “I have a very strong point of view on this president. It’s a personal point of view. It’s one I hold deeply. It’s not one that should enter into my judgments when I’m doing policy changes. … We have to be a neutral platform, and make those decisions coming from a place of rules and principles.”

But that same day, during an all-hands meeting on the release of a multiyear civil rights audit of Facebook, the company’s director of public policy for trust and safety, Neil Potts, told employees just the opposite.

“I don’t think we’re necessarily neutral,” Potts said, in response to a question about whether Facebook’s posture of neutrality was “incompatible” with “racial progress,” the most upvoted question in an employee poll during the event. Facebook’s commitment to removing hate speech, incitements to violence, and other problematic posts, he argued, showed that its policies are principled. He noted that the community standards were crafted in consultation with activists and civil society groups around the world.

“We want to produce a product that is good for our community, and I don’t think that is incompatible with civil rights,” he said.

However it thought of itself, Facebook’s legalistic approach to policy enforcement offended some employees — and its decision on the Trump post shattered the long-standing norm against criticizing the company publicly while still employed there. More than a dozen employees shared their dissent publicly, largely on Twitter. And hundreds of them, working remotely, staged a virtual walkout on June 1st — the most high-profile collective action among Facebook’s workers in the company’s 16-year history.

"Whatever we are getting from not acting on this, is it worth allowing clear, violent threats against Black protesters?"

In its aftermath, Zuckerberg held another virtual meeting with employees, describing a seven-point plan to address their concerns. Facebook would reexamine its content policies, consider adding new labels to problematic posts, and introduce a new information hub encouraging people to vote.

The CEO acknowledged that, as a result of his decision, Facebook’s public image had taken another hit. “Likely this decision has incurred a massive practical cost for the company to do what we think is the right step,” he said.

Meanwhile, a growing number of employees were questioning the influence of the company’s most prominent conservative executives over how it developed and enforced policies in cases like these. A series of reports in The Wall Street Journal, BuzzFeed, and elsewhere had drawn attention to the role played by Joel Kaplan, Facebook’s vice president of global public policy. A former policy adviser and deputy chief of staff to George W. Bush, Kaplan often appeared as a kind of bogeyman in these stories, intervening to stop Facebook from taking actions that would be perceived as hostile to conservatives.

Mark Zuckerberg (L) and Facebook’s Vice President of Global Public Policy, Joel Kaplan (R) chat after leaving a meeting with Senator John Cornyn (R-TX) in his office on Capitol Hill on September 19, 2019 in Washington, DC.
Photo by Samuel Corum/Getty Images

He had been a controversial figure in the company ever since he attended the 2018 Supreme Court confirmation hearing in support of Brett Kavanaugh, a close personal friend. Kaplan later apologized to his colleagues.

On June 18th, Facebook employees asked Zuckerberg if they could hear from Kaplan directly:

Many people feel that Joel Kaplan has too much power over our decisions. Can we get him on a Q&A to learn more about his role, influence, and beliefs?

Zuckerberg said the company would work to provide more information about the operations of its policy team. But he dismissed the idea that Kaplan has undue influence at the company, saying that Monika Bickert, the company’s head of policy management, plays a stronger day-to-day role in policy development. And Zuckerberg bristled at the implication that Kaplan’s party affiliation should disqualify him from the job.

“I’ve seen a bunch of comments internally that — that I have to say bothered me a bit,” Zuckerberg said. “That basically asked whether Joel can be in this role, or can be doing this role, on the basis of the fact that he is a Republican, or has beliefs that are more conservative than the average employee at the company. And I have to say that I find that line of questioning to be very troubling. In my work with Joel, I’ve found him to be ... very rigorous and principled in his thinking.”

The controversy over Kaplan highlighted a growing and seemingly intractable gap within Facebook — between the values of its more progressive workforce and those of its user base at large.

“One of the things that we talk about a little bit less inside the company is that ... the community we serve tends to be, on average, ideologically a little bit more conservative than our employee base,” Zuckerberg said. “Maybe ‘a little’ is an understatement. … If we want to actually do a good job of serving people, [we have to take] into account that there are different views on different things, and that if someone disagrees with a view, that doesn’t necessarily mean that they’re hateful or have bad intent.”

For Zuckerberg, the conservatism of his American user base had become more than a source of friction with his employees — it had also become a customer service issue. The No. 1 complaint that Facebook receives from its users is that the company removes too many of their posts, he said, for reasons that they often interpret as being politically motivated.

“I want to make sure that people here recognize that the majority of the negative sentiment that we have faced, measured by write-ins from our community, is actually generally coming from more conservative-leaning folks who are concerned about censorship,” Zuckerberg would tell employees later in the summer.

There was no obvious way to placate liberal employees and conservative users at the same time. And even as the company discussed the issue, a new threat stemming from Trump’s looting post emerged. As the month came to a close, a coalition of civil rights groups announced that it would stage an advertiser boycott of Facebook beginning on July 1st. Coca-Cola, Unilever, Verizon, and Hershey were among the largest to sign on.

The goal, according to organizers, was to force Facebook to build “permanent civil rights infrastructure” as well as to improve efforts to identify and remove hate speech from the platform. The participation of such prominent brands raised hopes among activists that they could force changes at the company by starving it of revenue while damaging Facebook’s public image along the way.

But while the company held virtual meetings with organizers to hear their concerns, executives faced little financial pressure to cede much ground. Data wins arguments, to use a phrase popular inside the company, and the boycott data told a clear story: Facebook simply was not suffering from the advertisers’ departure.

Employees could observe the effects, or lack thereof, in real time. A team spun up a database called the Advertiser Boycott Revenue Trends Explorer, which charted the decline in revenue from the start of the campaign. Within weeks, McDonald’s cut $1.02 million in spending, Volkswagen cut $1.64 million, and Verizon pulled $2.14 million. The database, which ran to dozens of entries, showed that many big brands were siding with Facebook’s more liberal employee base against the company’s content policy team.

But Facebook has more than 9 million advertisers, and the vast majority did not participate. The lost revenue amounted to a drop in the bucket. Zuckerberg predicted as much during a June 26th Q&A session.

“The bottom line is, we’re not gonna change our policies or approach on anything because of a threat to a small percent of our revenue, or to any percent of our revenue,” Zuckerberg said. (These remarks were first reported by The Information.) “We’re gonna do what we think are the right things that we think are gonna serve the community best over time, including the policy changes that we announced this morning, and we’ll continue going on that, and my guess is all of these advertisers will be back on the platform soon enough.”

The data said Facebook could withstand the assault, and the data won the argument. Left open for the moment was the question of whether the boycott might have had a larger cost — and, if so, where in the data such a cost might appear.

July

What were the “right things” to serve the community, as Zuckerberg put it, when the community had grown to more than 3 billion people? For years, activists, lawmakers, academics, and journalists had pressured Facebook to take more responsibility for its outsized influence on public affairs. Through a quirk of timing, one of the company’s biggest efforts to hold itself accountable — a two-year audit of Facebook’s effects on civil rights — came to a head just as the advertiser boycott was accelerating.

When the final report from auditors arrived, its findings were consistent with some of the complaints that had led employees to walk out in June: the company’s strained efforts at neutrality had enabled a torrent of discrimination across its services. While auditors acknowledged dozens of changes the company had made to improve its practices since 2018, their final judgment was unsparing.

Unfortunately, in our view Facebook’s approach to civil rights remains too reactive and piecemeal. Many in the civil rights community have become disheartened, frustrated and angry after years of engagement where they implored the company to do more to advance equality and fight discrimination, while also safeguarding free expression. As the final report is being issued, the frustration directed at Facebook from some quarters is at the highest level seen since the company was founded, and certainly since the Civil Rights Audit started in 2018.

“The community we serve tends to be, on average, ideologically a little bit more conservative than our employee base.”

On July 7th, the day before the auditors’ report was made public, Facebook held a company-wide meeting to discuss their findings and take employee questions. Neither Zuckerberg nor Sandberg, who had organized the civil rights task force, spoke on the call. Instead, it was led by Monique Dorsainvil, who leads strategic partnerships on Facebook’s policy team.

“While today’s audit report is not in response to the current moment, it does help create a roadmap for some of the unique work that we can look at and our role in tackling many of these issues head on,” Dorsainvil said in her opening remarks. She said the audit had been useful in helping Facebook rethink some of its policies, but she acknowledged that Facebook and the auditors had ultimately come to an impasse.

“There’s lots of areas where they disagree with our policy enforcement decisions like voter suppression and incitement to violence,” Dorsainvil said. “No amount of engagement is going to change their valid views on these positions.”

But after a half-hour of discussion, some employees were growing restless. I’m hearing a lot of defending the actions we’ve taken rather than discussing the outcomes of the report and what changes we’ll be making to overcome them, one employee had posted during the talk, in a comment that an executive on the call, Ashley Finch, noted was “getting a lot of traction.”

Finch, a director of strategic initiatives at Facebook who helped to oversee the audit, called it “a really fair question.” She said the audit itself covered many actions Facebook had already taken and urged employees to read it and take action.

“If everyone could please, please, please engage with the audit report themselves ... let’s really figure out where that daylight is between what the auditors are asking and what we’re doing,” she said. “Let’s integrate these recommendations into our roadmaps … you know, as long as they fit strategically with what we’re trying to do.”

Another employee asked a question about Kaplan’s dual role as one of Facebook’s top lobbyists and as the man overseeing its policy teams.

Is it a conflict of interest to have one policy executive influencing both staying on good terms with Trump and civil rights-related moderation?

Potts told employees that Facebook enforces its policies without regard to who breaks them.

“I don’t think that’s the case that we are ever co-opted or corrupted in any way,” he said. “I have never been in a situation where someone’s asked me to actually kind of change a policy in the moment specifically to address a quote-unquote political concern.”

In her Q&A with interns the same day, Sandberg noted that Facebook had been used to raise billions of dollars for charity and had also been an important tool for activists around the world.

“We don’t get credit for any of those movements,” she said. “We don’t get credit for any of it. But the brave women who spoke out on Me Too, the brave people who spoke out on Black Lives Matter, the brave people who organized the Women’s March — they needed the tool. There’s a reason this is happening now and it didn’t happen before.”

Still, she told interns that her top priority overall was recruiting.

“Everyone knows today’s a big day,” she said. “We have an advertiser boycott. We have a very important civil rights meeting. We are putting out our civil rights audit tomorrow. It’s the end of a quarter. There’s a lot going on — and in the middle of all of that, the most important thing we do is grow the company with the right people.”

Employees did not seem entirely convinced by this logic. The audit had raised the question of whether Facebook could advance civil rights in the United States more rapidly than Trump could undermine them, given that the president consistently used the platform to threaten protesters and cast doubt on the legitimacy of the election. And so executives regularly took pointed questions from employees about Zuckerberg’s relationship with the president, and whether it had become too cozy.

Zuckerberg sought to quell that criticism in his opening remarks at his July 16th Q&A.

“I think I’ve probably been the most outspoken CEO in the country against — on the many things that I disagree with this president on,” he said. “Whether it’s the immigration policies, which I think have not only been unfair but I think put the country at a huge disadvantage going forward compared to the opportunities that we should be going after. On climate change, where I think moves like pulling out of the Paris Agreement were a huge setback for the world. On things like his divisive and inflammatory rhetoric, that I’ve called disgusting — which I think was farther than pretty much any other corporate CEO who I’ve seen out there.”

Indeed, Facebook’s announcement that it would seek to register 4 million new voters this year was arguably an act of defiance against Trump. (His campaign called it “nothing less than an attempt to ultimately benefit Biden and the Democrat Party.”) In any case, Zuckerberg warned, criticism of the company was likely to get worse as the campaign went on.

“I just think we should all steel ourselves,” he said. “The next few months are going to be a very tense period as we move towards this incredibly high stakes election.”

Zuckerberg then moved on to employees’ top question for the week: whether they could have more flexibility to work from home once Facebook’s offices reopened.

Neither the boycott nor the campaign receded fully into the background by the end of July. But as August approached, Facebook prepared for a new test of its mettle: an antitrust hearing long in the making, which ultimately took place on July 29th.

“We don’t get credit for any of those movements.”

Though it was rarely, if ever, brought up in weekly Q&As, Facebook faced separate antitrust investigations conducted by the Federal Trade Commission, the Justice Department, and a coalition of state attorneys general. The House Judiciary Committee’s antitrust subcommittee had conducted its own year-long investigation into the competition practices of the tech giants, and it called their CEOs to testify and offer evidence.

During a Q&A on the 23rd, Zuckerberg struck a reassuring tone for employees, reminding them that he had testified before Congress in the past and emerged mostly unscathed. Facebook’s rivals faced more obvious antitrust questions than it did, he said. He predicted Congress would mostly focus on other issues, primarily around content moderation and whether the company is effective at it — or biased against conservatives.

Zuckerberg was right. While the committee did release documents underscoring the degree to which Facebook saw Instagram as a competitor before buying it, many of the questions Zuckerberg took hardly touched on antitrust at all. The CEOs of Amazon and Google took the brunt of the antitrust heat, just as he had said. And while Apple received the fewest questions, Tim Cook was made to answer for the App Store’s effects on competition.

Zuckerberg testifies before the House Judiciary Subcommittee.
Photo by Mandel Ngan-Pool/Getty Images

The next day, Facebook reported its quarterly earnings, and the results were stellar. Aided by the COVID-19 pandemic and continuing stay-at-home orders around the world, usage of Facebook’s products grew 14 percent to 3.14 billion people. Revenue rose 11 percent over the previous year to $18.69 billion. Despite the gloom on display in the “revenue trends explorer,” the advertiser boycott hadn’t hurt Facebook’s business at all.

The steady hand of the founder, coupled with strong business results, helped explain why many Facebook employees retain a strong affection for Zuckerberg. For one of the world’s most powerful people, and also their boss, he could be quite self-deprecating. During a mid-July Q&A, employees who by then had become accustomed to seeing Zuckerberg only on video calls noted that his skin always seemed to look good, and peppered him with what he described as “a lot of questions on” his skincare routine.

Zuckerberg attributed his success to good lighting in his videoconferencing setup and to making a point of getting rest and exercise. He encouraged employees to take vacation, even if it seemed inconvenient to their teams. “And I guess sunblock is probably important, too,” he added.

The next week, a paparazzi photo of Zuckerberg surfing in Hawaii while caked in sunblock went viral, and the internet — and Facebook — was flooded with memes.

Zuckerberg took it in stride.

“I’m not a person who’s under the illusion that I look particularly cool at any point with what I’m doing,” he said at the next week’s Q&A. “But when you’re eFoiling down the coast of Hawaii, and it’s beautiful and it just feels like it’s awesome — and then you come back online and you see that’s the photo, that’s what you look like — it’s like, OK. Alright. That’s maybe quite a bit more sunscreen than I thought I was wearing.”

“I’m not going to apologize for wearing too much sunscreen,” he added. “I think that sunscreen is good, and I stand behind that.”

August

Zuckerberg kicked off the August 6th Q&A by congratulating his Reels team for launching the product globally. Outside the company, discussion of Facebook continued to be dominated by questions about its policies and how they are enforced. But inside, despite the strange working conditions created by the pandemic, the company was operating more or less as it always had: poring over analytics about how its products are used, using the information to boost engagement, and hunting for the next big thing. Zuckerberg himself set the example: he could rattle off stats for both Facebook and its top competitors, and often did during weekly Q&As.

By August, the data suggested that Reels could be one way forward for the company. The short-form videos, which live inside Instagram, were designed to blunt the momentum of TikTok, which children were now using for an average of 80 minutes per day. Those 80 minutes on TikTok were minutes that people were not spending on Facebook’s products, and in the zero-sum calculus of social apps, that meant TikTok could someday evolve into an existential threat. During weekly Q&As, employees regularly asked Zuckerberg what he planned to do about it.

Zuckerberg had watched the rise of ByteDance’s social entertainment app closely from the beginning. Early on, TikTok struggled to retain its users, he told employees in June. But its retention rate improved as the months went on, he said, and by this summer, the impact of TikTok on Facebook’s own fortunes was becoming more clear.

On June 29th, India banned TikTok on national security grounds. A week later, in an internal post, a Facebook data scientist reported that usage of the company’s products was surging across India. With TikTok gone, Instagram’s daily users increased 9 percent and were spending 19 percent more time within the app; they posted 5 million more stories and sent 214 million more messages on Instagram alone.

In his remarks to employees, Zuckerberg made it clear that he did not want to see TikTok banned. A world in which the ruling party of any country can ban a social app for any reason poses an existential threat to Facebook’s ambitions to build a global network. And yet, the message from Facebook’s own data was undeniable: banning TikTok had been good for Facebook’s short-term business.

“And I guess sunblock is probably important, too.”

An August 7th memo to staff from longtime executive Fidji Simo captured that data obsession in action. Simo had taken over leadership of the core Facebook app in 2019, after eight years at the company, telling Marie Claire upon her promotion, “My number-one priority is to keep people and their information safe.”

The structure of her memo, “Facebook App — August Update,” tells a somewhat different story. It leads with a discussion of “engagement and sentiment trends,” covering how often people use Facebook and which facets of the service are growing. Simo reported that the average Facebook user visited the app 18.4 times a day the previous month, spending a cumulative 59.6 minutes there. Sharing was up 15.6 percent year over year. Groups, a major focus for the company, had been a big success: posts were up 31.9 percent.

A section on product milestones followed; July saw the 10 billionth live video broadcast on Facebook. Next came the month’s product launches, including employee testing of a new product called Neighborhoods, meant to ape the rival social network Nextdoor. Only after that did a one-paragraph section on “integrity” outline the month’s efforts around protecting users from abuse and fake accounts.

“What they care about is metrics, mostly,” one employee told me this summer. Occasionally, ethical concerns would override product decisions, the employee said. Instagram might test hiding the likes on a post to see if it made people feel better, for example, or a hub of voter information might be inserted into the News Feed, reducing the available space for ad inventory.

“Everything else is usually just, how can we maximize this by 0.1 percent?” the employee said. “When we make a change, now that we’ve run the experiment — how did this one metric improve? If it didn’t improve, or regress, how do we move it back as fast as possible?”

Employees’ raises and promotions are closely tied to their ability to boost engagement within Facebook’s products. In their careers, as in Simo’s monthly memo, the engagement metrics come first.

“Everything else is usually just, how can we maximize this by 0.1 percent?”

Just as Zuckerberg had said, August proved to be tense even by the heightened standards of 2020. For the first half of the month, Facebook muddled through. But the aftermath of a police shooting in Kenosha, Wisconsin, on August 23rd once again threw the company into turmoil.

After police shot Jacob Blake seven times in the back, protests swept Kenosha, and a 17-year-old was charged with murder after allegedly shooting three protesters, two of whom died. The afternoon before the shootings, a 3,000-member Facebook group calling itself the Kenosha Guard had advertised an event on Facebook encouraging an armed response to the unrest.

Facebook said the alleged shooter had not followed the page or been invited to the event. But The Verge reported that moderators had rejected reports from Facebook users that the event violated its community standards in the hours before the shooting. Later, BuzzFeed News revealed that Facebook had received 455 separate reports of the event.

In response, Zuckerberg posted a portion of that week’s Q&A publicly, as he sometimes did when his remarks had already leaked or when he had news to announce. In the public video, he described the failure to remove the event as an “operational mistake” by its outsourced moderation teams, one that owed partially to the fact that militias had not been officially banned on Facebook until a couple of weeks prior. (The company removed them as part of a larger effort to root out the right-wing conspiracy movement QAnon, whose popularity had exploded on Facebook in 2020.)

Screenshot of the video Zuckerberg shared on his profile on Facebook.

Since spring 2019, when Zuckerberg said Facebook would shift to emphasize private messaging and groups over public posts, academics and journalists had warned of the dangers of funneling more conversations into private online spaces. Even before that shift, researchers had found that Facebook unwittingly helped the anti-vaccine movement grow by recommending groups about the subject to new mothers.

An engineer who worked on groups told me they found the group recommendation algorithm to be the single scariest feature of the platform — the darkest manifestation, they said, of data winning arguments.

“They try to give you a mix of things you’re part of and not part of, but the aspect of the algorithm that recommends groups that are similar to the one that you’re part of are really dangerous,” the engineer said. “That’s where the bubble generation begins. A user enters one group, and Facebook essentially pigeonholes them into a lifestyle that they can never really get out of.”

In the Kenosha shooting and its aftermath, many of the fears about Facebook’s role in public life crystallized into one. A product built to promote the most engaging posts above all, with their contents often reviewed only after other users report them. An outsourced content moderation team that leaves vital questions of speech and safety in the hands of low-paid workers. A frequently changing set of content policies that often leave moderators confused and error-prone. A worldview that aspires to be neutral, but could really only be described in terms of data: of what went viral, of what was removed, of who was banned.

And all of it ultimately depended on the judgment of one person: a CEO trying to hold the center in a republic of 3.14 billion people and in a country that seemed to grow more polarized by the day.

To some employees, the events of this summer were worth quitting over — sometimes prominently, giving anguished interviews on their way out. Others redoubled their focus on the urgent tasks at hand: preventing election interference, rooting out dangerous organizations, or simply improving their product’s metrics for the next half.

The Facebook employees I’ve spoken to don’t feel a particular way about the company so much as they feel every way about it. They see the good and the bad, the reassuring and the terrifying. Most sought to pull Zuckerberg toward more decisive action, and a more progressive politics — while Facebook’s user base pulled him in the opposite direction. The resulting stalemate seemed to satisfy no one.

At the end of July, employees asked Zuckerberg how prepared he felt for the 2020 election. He said he was worried: about COVID-19 affecting turnout, about how long the country would have to wait after Election Day to learn who had won, and about what might happen in the meantime.

“There’s potentially some chance that a lot of people take to the streets and then that ends up being a violent period,” he said, “or at least that there is some violence.” Like many of Zuckerberg’s predictions to his staff this summer, it was both plausible and unsettling — a dark warning about the outside world, softened by the reassurance that Facebook itself would endure.

In any case, by the end of August, the violence Zuckerberg predicted had already arrived — first advertised on Facebook and then documented there, in a stream of highly engaging videos, text posts, and memes. America’s democracy was fraying, but despite a bruised public image, Facebook itself had never been stronger. The products were launched, and the metrics were up. Whatever was happening in the wider world, from the viewpoint of the data — the view from nowhere — it had all been a grand success. ■