Photo by Tom Williams/CQ-Roll Call, Inc via Getty Images

Everything you need to know from the Facebook whistleblower hearing

Congress will hear new concerns about child safety on the world’s largest social network

On Tuesday, Facebook whistleblower Frances Haugen is appearing before a Senate Commerce subcommittee in what promises to be one of Facebook’s toughest congressional hearings in years. After serving as a source for a string of bombshell reports from The Wall Street Journal, Haugen went public on Sunday with concerns about Instagram’s mental health impacts on its youngest users, drawn from internal Facebook reports.

The result has been a new focus on child safety — a particularly sore point for Facebook. The company has disputed claims that Instagram exacerbates body image issues in teenage girls, but the broader concerns about algorithmic amplification of harmful content have been harder to dismiss. In a hearing last week, Sen. Richard Blumenthal (D-CT) shared the results of a test in which his own staff was bombarded with Instagram posts related to eating disorders and self-harm after creating a dummy account posing as a teenage girl. Today’s hearing is expected to focus on the same topics, with direct testimony from Haugen on the company’s decisions.

We’ll be updating this post with everything that happens during the hearing — every question from lawmakers, every important quote, and every piece of new information from Haugen. Stay tuned.


In his opening remarks, Sen. Richard Blumenthal (D-CT) described how Facebook “has put profits ahead of people.” He noted how the platform’s algorithmic feeds can amplify insecurities in its younger users.

“I hope we will discuss whether there is such a thing as a safe algorithm,” Blumenthal said.

Blumenthal also called on Facebook CEO Mark Zuckerberg to return to Congress to testify about the Wall Street Journal’s recent revelations on child safety. Facebook has vigorously contested many of Haugen’s claims, but has done so through surrogates like global head of safety Antigone Davis or public relations lead Nick Clegg.

“Rather than taking responsibility and showing leadership, Mr. Zuckerberg is going sailing,” Blumenthal said.


Sen. Marsha Blackburn (R-TN) tackled Facebook’s business model in her first remarks during the hearing. “Facebook is not interested in making significant changes to improve kids’ safety on their platforms, at least not when that would result in losing eyeballs on posts or decreasing their ad revenues,” she said. “Follow the money.”

Following up on last week’s “finsta” discussion, Blackburn said that Facebook turns a blind eye to these secondary, private accounts as a means of boosting its active user numbers. Children can use these accounts to interact with other people on the platform without their parents’ approval, she said.


Whistleblower Frances Haugen said Facebook has “repeatedly” misled the public about “what its own research reveals about the safety of children and the role of its artificial intelligence systems in spreading divisive and extreme messages.”

Haugen also called on Congress to take regulatory action to change the business incentives that lead Facebook to amplify harmful content to its users. She also encouraged lawmakers to push for further transparency into the company to counter its “closed design.”

“It is unaccountable until the incentives change,” Haugen said. “Facebook will not change.”

But she also showed optimism that the problem can be solved if the government intervenes. “These problems are solvable. A safer, free-speech-respecting social media is possible,” Haugen said. “Facebook can change, but it’s clearly not going to do so on its own.”


Outside of Facebook’s business model, Haugen identified several structural issues that make it more difficult for the company to react to scandals. “Facebook is stuck in a cycle where it struggles to hire. That causes it to understaff projects, which causes scandals, which then makes it harder to hire,” Haugen said.

“A pattern of behavior I saw at Facebook was that, often problems were so understaffed that there was an implicit discouragement from having better detection systems. So for example, I worked on the counterespionage team, and at any given time, our team could only handle a third of the cases we knew about. We knew that if we built even a basic detector, we would likely have even more cases.”

Haugen also pointed out how Facebook’s engagement numbers are often the deciding factor in developing its services. “Mark has built an organization that is very metrics-driven. It is intended to be flat. There is no unilateral responsibility,” she said. “The metrics make the decision.”


In her 60 Minutes interview Sunday, Haugen described how Facebook’s Civic Integrity group was dissolved after the 2020 presidential election. Some of its safeguards were later reinstated as an emergency measure during the January 6th riots at the US Capitol. Sen. Amy Klobuchar (D-MN) asked Haugen why Facebook decided to disband the integrity group.

“Facebook has been emphasizing a false choice,” Haugen told Klobuchar. “They’ve said the safeguards that were in place before the election implicated free speech.”


Sen. John Thune (R-SD) has previously introduced legislation to amend Section 230 of the Communications Decency Act. On Tuesday, Thune asked Haugen whether a change to that law could encourage Facebook to change its algorithms in a way that decreases user harm.

“I think if we had appropriate oversight, or if we reformed Section 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking because it is causing teenagers to be exposed to more anorexia content. It is pulling families apart, and in places like Ethiopia, it is literally fanning ethnic violence,” Haugen said.


In discussion with Sen. Jerry Moran (R-KS), Haugen came out in support of Section 230 reform that relates to algorithms even though these reforms can be “very complicated.”

“Companies have 100 percent control over their algorithms, and Facebook should not get a free pass on choices it makes to prioritize growth and virality and reactiveness over public safety,” Haugen said. “They’re paying for their profits right now with our safety.”


One of the big themes of the hearing has been that committee members want to respond to Haugen’s revelations by passing actual legislation — something that didn’t happen after the Cambridge Analytica hearings.

After questioning Haugen, Sen. Ed Markey (D-MA) put that sentiment in stark confrontational terms:

Here’s my message for Mark Zuckerberg: Your time of invading our privacy, promoting toxic content, and preying on children and teens is over. Congress will be taking action. You can work with us or not work with us, but we will not allow your company to harm our children and our families and our democracies any longer. Thank you, Ms. Haugen. We will act.


Haugen’s most consistent point is that optimizing Facebook for engagement leads to user harm in unpredictable ways — so Sen. John Hickenlooper (D-CO) responded with a tricky question: would Facebook still be profitable without that optimization?

It was an easy one to answer. Haugen said Facebook is currently making roughly $40 billion a year in profit. (It’s actually even higher if you go by the latest earnings.)

“The changes I’m talking about today wouldn’t make Facebook an unprofitable company,” she told the committee. “It just wouldn’t be a ludicrously profitable company ... People would consume less content on Facebook, but Facebook would still be profitable.”


Sen. Ted Cruz (R-TX) led Haugen toward a recurring theme in his hearing questions: alleged political censorship on Facebook. Haugen gently redirected the question. “A lot of the things that I advocate for are around changing the mechanisms of amplification, not around picking winners and losers in the marketplace of ideas,” she said, referring to features like asking someone to read an article before they share it.

“Small actions like that friction don’t require picking good ideas and bad ideas; they just make the platform less twitchy, less reactive. And Facebook’s internal research says that each one of those small actions dramatically reduces misinformation, hate speech, and violence-inciting content on the platform.”


In questioning from Sen. Cynthia Lummis (R-WY), Haugen called for more transparency. “Just disclosing the most popular content on the platform, including statistics around what factors went into the promotion of that content, would cause radically more transparency than we have today on how Facebook chooses what we focus on, how they shape our reality,” said Haugen, calling out “a pattern of behavior of Facebook hiding behind walls and operating in the shadows.”


Sen. Rick Scott (R-FL) referenced his DATA Act, which would require more transparency from social networks. Haugen seemed skeptical that transparency alone would solve the problems with Facebook’s algorithmic ordering of content — saying that if people were offered the option to pick a feed that was chronologically ordered or engagement-driven, they might still “choose the more addictive option.”


Haugen told Sen. Todd Young (R-IN) she was against breaking up Facebook from her perspective as an algorithmic specialist. “Even looking inside of Facebook itself ... you see the problems with engagement-based ranking repeat themselves. Our problems here are about the design of algorithms, of AI, and the idea that AI is not intelligent,” she said. “If you split Facebook and Instagram apart, most advertising dollars will go to Instagram, and Facebook will continue to be this Frankenstein that is endangering lives around the world; only now, there won’t be money to fund it.”


Sen. Amy Klobuchar (D-MN) asked about vaccine misinformation. Haugen was skeptical. “I do not believe Facebook as currently structured has the capability to stop vaccine misinformation because they’re overly reliant on artificial intelligence systems that they themselves say will never get more than 10 to 20 percent of content,” she said.


Sen. Ed Markey (D-MA) asked whether Congress should ban quantitative feedback that encourages children to compare themselves against each other on sites like Instagram. Haugen said that simply removing Likes or other quantitative measures isn’t very effective without also addressing qualitative signals, because kids can use comments to gauge popularity. “Teenage girls are smart,” she said. But she did “strongly encourage” banning ads targeted at children.


The hearing finished with some words from Haugen on the “false choices” on issues like censorship versus safety or privacy versus oversight. “The fact that we’re being asked [about] these false choices is just an illustration of what happens when the real solutions are hidden inside of companies,” she said. “We need more tech employees to come forward through legitimate channels like the SEC or Congress to make sure that the public has the information they need in order to have technologies be human-centric, not computer-centric.”

