
Facebook’s big QAnon crackdown might have come too late


After festering on the platform for years, the movement is now bigger than any one social network


Illustration by Alex Castro / The Verge

Before the Tide Pods challenge was a public health crisis, it was a joke. The laundry detergent capsules, which were originally released in 2012, evolved over time to look strangely delicious: lush green and blue gels, swirled around one another attractively, all but daring you to eat them. This led to many jokes about Tide Pods maybe secretly being candy, and it might have stopped there — but then people actually started eating them. The “Tide Pods challenge” surged on social networks in 2018, and ultimately more than 10,000 children were reported to have been exposed to whatever extremely inedible substance is actually inside Tide Pods. Among teenagers who were affected, more than a quarter of exposures were intentional, The Washington Post reported at the time.

Eventually platforms banned the Tide Pods challenge, and the mania around their consumption subsided. But the story posed questions to platforms like Facebook, YouTube, and Twitter that they have struggled to answer ever since. When do you start to take a joke seriously? When does dumb talk cross the line into something dangerous? When does a lunatic conspiracy theory cross the line from a shitpost into a potential incitement to violence?

When, in other words, was the right time to ban any talk of the Tide Pods challenge?

Looking back, it seems clear that the answer is “sooner.” But “sooner” is still not an answer to when.

I thought about all this today reading about Facebook’s latest purge of accounts related to QAnon, the fringe theory that Donald Trump is secretly working to purge the country of Satanist pedophiles, using lieutenants to send coded messages to 4chan users. (Ben Collins and Brandy Zadrozny wrote a definitive piece on QAnon’s origins as a grift for NBC News in 2018.) On its face, this theory seems no less a joke than the idea that Tide Pods secretly taste delicious. But as with the laundry detergent, thousands of Americans have now been poisoned by QAnon, and the consequences seem likely to be far more dire and long-lasting.

First, though, the purge. Here are Collins and Zadrozny on Wednesday for NBC News:

Facebook on Wednesday banned about 900 pages and groups and 1,500 ads tied to the pro-Trump conspiracy theory QAnon, part of a sweeping action that also restricted the reach of over 10,000 Instagram pages and almost 2,000 Facebook groups pushing the baseless conspiracy theory that has spawned real-world violence.

Facebook also took down thousands of accounts, pages and groups as part of what they called a “policy expansion,” seeking to limit violent rhetoric tied to QAnon, political militias and protest groups like antifa.

Twitter made a similar move last month, banning 7,000 accounts and putting restrictions on 150,000 more. Both moves fell short of a full ban on discussing QAnon, though Facebook’s move to prevent QAnon groups from being recommended to users could cut off a key avenue for the recruitment of new adherents. “While we will allow people to post content that supports these movements and groups, so long as they do not otherwise violate our content policies, we will restrict their ability to organize on our platform,” Facebook said in a blog post.

In The New York Times, Sheera Frenkel reported that Facebook began scrutinizing the movement more closely in May, when QAnon groups played an essential role in promoting the “Plandemic” hoax. But by that time, Q adherents were already running for Congress, and several had been linked to real-world violence. Frenkel writes:

In New York, a woman who had cited QAnon theories as a reason she wanted to “take out” the Democratic presidential nominee Joseph R. Biden Jr., was arrested on May 1 with dozens of knives in her car. The group has been linked to more than a dozen violent incidents over the last year, including a train hijacking; last month, a QAnon supporter rammed a car into a government residence in Canada.

The spiking activity on its network, combined with real-world incidents, pushed Facebook to discuss policy changes to limit QAnon’s spread, the two employees said. But the conversations stalled because taking down QAnon-related groups, pages and accounts could feed into the movement’s conspiracy theory that social media companies are trying to silence them, the people said.

There are likely other reasons it took Facebook a couple of months to roll out these bans, and the company’s recent action against Boogaloo groups offers some clues. In that case, as with QAnon, the group’s boundaries are ambiguous and ever shifting. Part of the appeal of QAnon is that many of its core messages are written in code, giving it the feel of an augmented reality game. But coded messages are harder to discern, particularly by policy teams that have not invested in unscrambling them. It’s easier to maintain the ironic detachment that defined our early reactions to the Tide Pods challenge — no one actually believes this stuff, right? — than it is to take any kind of preemptive action.

But in 2020 we know what happens when you let a movement fester. We have seen Reddit ignore its most racist forums until they spun out into thriving standalone communities. And we have now seen QAnon, which Facebook’s group recommendation algorithms were still promoting until Tuesday, evolve into something like a new religion. Notably, Reddit banned QAnon forums starting in 2018 — long before it even banned hate speech — after the forums were found to be inciting violence.

Someone asked the president about QAnon on Wednesday, and he replied that he didn’t know all that much about the movement, but he appreciated that its followers seem to like him, and also that he is “saving the world.” Trump has fanned the flames of QAnon for years, retweeting dozens of accounts linked to the theory, and demonstrating that the group’s rise is much more than a problem of content moderation. When the president is praising a group that his own FBI has identified as a potential domestic terrorism threat, the solutions are not at the level of platform policy.

Still, Facebook made it clear today that, like other networks before it, the company does consider QAnon a problem. As with Tide Pods, it seems likely that we’ll all look back and wish the company had taken Q seriously sooner. And if we want platforms to do better at managing whatever the next threat is that bubbles up, we would do well to reflect on when exactly that should have been.

Pushback

Yesterday I included a link to a story in Time about a request from the Gambian government to help investigate the genocide in Myanmar, where the United Nations has said Facebook contributed to the incitement of violence. Facebook wrote me last night to say that complying with the request would violate users’ privacy:

One of the things that Matthew doesn’t mention is the fact that we are actively working with the UN’s Independent Investigative Mechanism for Myanmar who are collecting evidence for any future proceedings. We recognize the extraordinary gravity of the atrocities in Myanmar, which is why we’re making a voluntary disclosure to the IIMM, which is required by its mandate to help all courts and tribunals seeking accountability for Myanmar.

And a little more background on why we opposed The Gambia’s request: we have an obligation under the SCA not to release certain data to third parties, including the US and foreign governments, unless the user consents, there’s an emergency, the requesting party has a court order or the foreign government has a CLOUD Act agreement with the US. This means that for US companies to respond to a government’s request for most user data, those governments need to have a Mutual Legal Assistance Treaty with the US and use the MLAT process to make the request.

The Gambia doesn’t have either kind of agreement with the US, and the SCA doesn’t include exceptions for international justice efforts.

It’s true that there are lots of cases where we wouldn’t want to see Facebook indiscriminately handing over user data to governments, no matter how serious-seeming the request. So this feels like useful context to share.

The Ratio

Today in news that could affect public perception of the big tech platforms.

🔼 Trending up: Facebook is supporting Black-owned businesses in the US with $40 million in grants. It’s also allowing them to identify their page as a Black-owned business so people can find and support them more easily. (Facebook)

Governing

As Facebook executives promised to crack down on health misinformation, its algorithm appears to have fueled traffic to a network of sites sharing dangerously inaccurate news. A report by the nonprofit Avaaz found that pages from the top 10 sites peddling pandemic conspiracy theories received almost four times as many views on Facebook as the top 10 reputable sites for health information. (This is the unfortunate counterpoint to yesterday’s column here about the Plandemic sequel flop.) Emma Graham-Harrison and Alex Hern at The Guardian share some top findings:

It found that global networks of 82 sites spreading health misinformation across at least five countries had generated an estimated 3.8bn views on Facebook over the last year. Their audience peaked in April, with 460m views in a single month.

“This suggests that just when citizens needed credible health information the most, and while Facebook was trying to proactively raise the profile of authoritative health institutions on the platform, its algorithm was potentially undermining these efforts,” the report said.

A relatively small but influential network is responsible for driving huge amounts of traffic to health misinformation sites. Avaaz identified 42 “super-spreader” sites that had 28m followers generating an estimated 800m views.

A top executive at Facebook in India is asking police to investigate death threats she received after the Wall Street Journal published a story saying she intervened to keep anti-Muslim hate speech by politicians of India’s ruling Bharatiya Janata Party on the platform. How on earth does Facebook square its mission of “giving people a voice” with a top policy executive making a criminal complaint over a journalist criticizing them in a Facebook post? This seems ludicrous, and no one I talked to at Facebook on Wednesday could explain it to me. (Pranav Dixit and Ryan Mac / BuzzFeed)

Related: Facebook employees are now raising questions about whether content moderation practices are being followed by the India team in light of the allegations. A small group of employees penned an open letter demanding company leaders denounce “anti-Muslim bigotry” and ensure more policy consistency. (Aditya Kalra and Munsif Vengattil / Reuters)

India’s antitrust watchdog dismissed a case against WhatsApp, saying the company did not abuse its dominant position to expand in the country’s digital payments market. The case, filed in March, alleged that WhatsApp was bundling its digital payment service in its messaging app, which already had a large user base. (Reuters)

Inside the Boogaloo, America’s extremely online extremists. To understand the movement, this piece says, you first need to understand the militia movement that took root in the 1990s. But while militias are waiting for an imminent war, Boogaloo adherents seem intent on making the war happen. (Leah Sottile / The New York Times Magazine)

Silicon Valley executives are rallying behind Kamala Harris as Joe Biden’s VP pick. While many tech workers supported more progressive candidates like Bernie Sanders and Elizabeth Warren, their bosses are relieved to have someone with closer ties to the valley. (Eric Newcomer / Bloomberg)

President Trump said he supports Oracle buying TikTok. Oracle has closer ties to the White House than most other parties involved in the bidding, including Microsoft. Corruption in plain sight. (Aaron Tilley and Georgia Wells / The Wall Street Journal)

A TikTok ban is overdue, this opinion writer argues. The privilege of accessing the open internet should be extended only to companies from countries that respect that openness themselves. (Tim Wu / The New York Times)

Apple pulled more than 47,000 apps from the Chinese App Store earlier this month as tensions continue to rise between the US and China. The company recently eliminated a loophole that previously allowed paid games and games with in-app purchases to be sold even though they were still awaiting approval from Chinese regulators. (Jay Peters / The Verge)

WeChat has helped Tibetan refugees keep in touch with their families. At the same time, its potential to be used as a surveillance tool has been a cause of concern, particularly among Tibetan activists. (Tsering D. Gurung / Rest of World)

Taiwan is planning to ban the mainland Chinese streaming services operated by iQiyi and Tencent Holdings from the island. The move follows the US and India placing restrictions on Chinese tech companies amid heightened political tensions. (Iris Deng, Yujie Xue, and Josh Ye / South China Morning Post)

Taiwan accused Chinese hackers of infiltrating government agencies to try to access sensitive data on citizens. The revelation comes as Taiwan has been caught up in the escalating struggle for global influence between the US and China. (Debby Wu / Bloomberg)

Data gleaned by two Twitter employees who allegedly spied on behalf of the Saudi government was later used to harass or arrest Saudi dissidents. Human rights organizations say they have identified six Saudi citizens who ran anonymous or pseudonymous Twitter accounts critical of the government and were later arrested. Chilling. (Ryan Gallagher / Bloomberg)

Herman Cain’s Twitter account is bringing up uncomfortable questions around what should happen to a public figure’s social media profiles after they die. Should the account remain verified, or should it lose that status to better reflect the memorialized state of the account? Personally I hope to keep tweeting long after I’m dead, but just super generic stuff like “Thread” and “That’s it, that’s the tweet.” (Tamara Kneese / Slate)

Industry

The stocks of Apple, Amazon, Alphabet, Microsoft and Facebook now constitute 20 percent of the stock market’s total worth, a level not seen from a single industry in at least 70 years. This dominance is propelled by the companies’ unprecedented reach into our lives. Here are Peter Eavis and Steve Lohr at The New York Times:

Amazon’s business, already towering over competitors in e-commerce and cloud computing, has become even more important to businesses and households. Its stock is up over 50 percent from its pre-pandemic high, underscoring just how much investors think it has benefited from the disruption.

Critics say the companies have grown in part because of a range of anticompetitive practices. European regulators are investigating whether Apple’s App Store breaks competition rules. American regulators are looking at whether large tech firms committed antitrust abuses when acquiring other companies. Some antitrust scholars believe the rise of industry-dominating companies has led to stagnant wages and increased inequality. Last month, tech chief executives were grilled by members of the House Judiciary antitrust subcommittee.

Zoom is coming to the Amazon Echo Show, Facebook Portal, and Google Nest Hub Max later this year. It’s a big expansion for the video conferencing app, and a shift for the tech giants that have previously stuck to their own, in-house video chatting solutions on their smart displays. (Chaim Gartenberg / The Verge)

Instagram is bringing QR codes to the app. The idea is that businesses can print their QR code and have customers scan it to easily open the business’s Instagram account. (Ashley Carman / The Verge)

Instagram is now placing ads at the end of the feed, where the “You’re All Caught Up” notice sits. It’ll also suggest new organic posts for users to view. (Sarah Perez / TechCrunch)

QAnon is spreading among Instagram influencers, some of whom have latched on to the theory about child trafficking. The conspiracy theory is sprinkled in beside regular lifestyle content. It will be interesting to see how Facebook’s QAnon purge affects Instagram. (Kaitlyn Tiffany / The Atlantic)

New networking apps are capitalizing on the remote work trend and trying to speed up networking. Personally I have decided to let my professional networks wither and die during this time! (Ann-Marie Alcántara / The Wall Street Journal)

Black founders and CEOs say they faced biased assumptions, racism and harassment as they’ve tried to pitch their companies to investors. One founder says he was asked, “Were your grandparents slaves?” during an initial meeting. (Emily Birnbaum / Protocol)


Talk to us

Send us tips, comments, questions, and conspiracy theories: casey@theverge.com and zoe@theverge.com.