A few years ago, Facebook became aware that Russia kept posting misinformation all over the network. The misinformation was designed to rile people up and make them share with their friends, and because people are generally pretty easy to rile up, Russia’s strategy was very successful. Some prominent political scientists believe that the country’s election interference, both on and off Facebook, ushered Donald Trump into office. And we’ve spent a good portion of the past three and a half years arguing about it.
About a year after the election, Facebook introduced a tool to let people know if they had unwittingly interacted with the Russian troll army. If you liked the page of a troll in disguise, you could visit an obscure part of Facebook and it would tell you. The tool would not tell you if you had viewed any of the page’s posts, or even if you had shared them. Alex Hern wrote about this flaw at the time in The Guardian:
Facebook will not tell those users about their exposure to misinformation, although the company has not said whether it is unable, or simply unwilling, to provide that information. A source close to the company described it as “challenging” to reliably identify and notify everyone who had been incidentally exposed to foreign propaganda.
Fast-forward to today, when the misinformation we’re worried about primarily has to do with COVID-19. Over the past few weeks, we’ve talked about hoaxes attempting to link the coronavirus to new 5G networks, dangerous fake “cures” based on drinking bleach, and so on. Reporting has consistently found these sorts of articles racking up thousands of shares on Facebook. Even more than Russian misinformation, the COVID-19 hoaxes pose clear public health risks. So what should Facebook do about it?
On Thursday, the company said it would invite people who had shared a hoax to visit a page created by the World Health Organization debunking popular COVID-19 myths. Here’s Guy Rosen, Facebook’s vice president of integrity, in a blog post:
We’re going to start showing messages in News Feed to people who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed. These messages will connect people to COVID-19 myths debunked by the WHO including ones we’ve removed from our platform for leading to imminent physical harm. We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook. People will start seeing these messages in the coming weeks.
If you didn’t read the above paragraph closely, you might assume Facebook’s system would work something like this: You share an article that says something like, “huffing macaroni and cheese fumes cures coronavirus,” that article gets debunked by an independent fact checker, and then Facebook links you to the WHO’s page about macaroni and cheese myths. Maybe there would even be a message that said something like, “Just so you know, huffing macaroni and cheese does not cure coronavirus. Click here for more.”
But we have learned that people are hard-headed and do not enjoy being told that they have been duped. There was a famous moment after the 2016 election when Facebook began labeling false posts as “disputed” and discovered that doing so made people share them more. And so the company has taken a different approach here.
A few weeks from now, people who have shared mac-and-cheese-cured-my-COVID type posts will see a big story in the News Feed. It is not labeled “Hey, you have been duped.” Rather, it says: “Help friends and family avoid false information about COVID-19.” It then invites them to share a link to the WHO’s myth-busting site, and includes a button that takes them to the site directly.
The goal of this type of approach is to make people less defensive about the fact that they may have been wrong, and to smuggle some good information into their brains without making them feel dumb about it. The appeal to helping friends and family is also a nice touch. Who doesn’t want to help their friends and family? And Facebook is putting the information directly into the News Feed — no need to visit some arcane help center buried beneath layers of taps. (If you share a post that contains misinformation and is removed, you’ll also get a separate notification about that.)
But this approach also has downsides. If you do want to know if you’ve accidentally shared a lie to all your friends, this tool won’t help you. And the WHO myth-busting page currently debunks 19 different hoaxes — what are the odds you’re going to scroll all the way down to the one you accidentally shared and read it? What about next month, when that list has grown to 40?
This is not a small problem. Avaaz, a human rights group that tracks misinformation closely, published an in-depth report this week that examined 100 pieces of misinformation, written in six languages, that were shared on Facebook. It found that those posts were shared more than 1.7 million times and seen an estimated 117 million times. (Vice talks to the authors.)
The authors of the Avaaz report argue that Facebook should inform each person who has viewed coronavirus misinformation about exactly what they got wrong. The group even tested such a system, and says the results show it can work:
In order to test the effectiveness of corrections, a hyper-realistic visual model of Facebook was designed to mimic the user experience on the platform. Then a representative sample of the American population, consisting of 2,000 anonymous participants, chosen and surveyed independently by YouGov’s Academic, Political, & Public Affairs Research branch, were randomly shown up to 5 pieces of false news that were based on real, independently fact-checked examples of false or misleading content being shared on Facebook.
Through a randomized model, some of the users, after seeing the false news, were shown corrections. Some users saw only the false or misleading content, and some saw neither. Then the surveyed participants answered questions designed to test whether they believed the false news.
Avaaz said its study showed that belief in misinformation declined by at least 50 percent among study participants.
Rosen told me that calling out these hoaxes with a special message might give them more visibility than they originally had, amplifying the misinformation. Maybe you scrolled by a piece of misinformation without internalizing its contents; if Facebook puts a big red box in the News Feed that says “by the way, this is false,” the effects could be counterproductive.
Still, he said, Facebook is testing the use of language that more explicitly says that a person is seeing the WHO messages because they saw misinformation. The goal is to provide the most effective messaging possible, he said.
One possibility I see is to offer different interventions based on whether someone simply saw a hoax, or actively commented on or shared it. People who share hoaxes arguably deserve a stronger response than someone who simply saw something — or maybe even just thumbed past it — in their feed.
Compared to its early work on the Russian troll problem, Facebook has taken a refreshingly interventionist approach to stopping the spread of COVID-19 misinformation. But it also remains unclear which of those interventions actually work. Given the risks to public health, here’s hoping that Facebook learns quickly.
The Interface Live!
On Tuesday, we announced the next edition of our Interface Live series, featuring me in (live-streamed) conversation with Sarah Frier, author of No Filter: The Inside Story of Instagram. The event takes place April 21st at 5:30 p.m. PT, and you can register here. It’s free, but you do have to RSVP — and we’re now almost at capacity. If you’d like to join, please RSVP today!
Today in news that could affect public perception of the big tech platforms.
⬆️ Trending up: Facebook rolled out a new alert on Facebook and Instagram to tell people who’ve lost their job and health insurance due to COVID-19 to check out HealthCare.gov to see if they’re eligible for coverage through the Affordable Care Act.
⬇️ Trending down: A YouTube video falsely accusing Dr. Anthony Fauci of being part of the deep state has gone viral, with more than 6 million views in a week. The video also advises people to treat COVID-19 with vitamin C — a claim that isn’t backed up by science. Not good, YouTube.
⬇️ Trending down: An outbreak of coronavirus infections at an upscale Whole Foods in Washington, DC has highlighted how dangerous the pandemic is for grocery store workers. At least six employees have the virus, but the company won’t close the store. Workers are instead free to take leave without penalty through the end of April — but it’s unpaid.
⬇️ Trending down: Construction on Amazon’s HQ2 in Pentagon City, Virginia is moving forward during the coronavirus pandemic. Some neighbors said they’ve been surprised to see the job site remain busy. What’s the argument that this is essential business?
President Trump released guidelines for easing social distancing. (Washington Post)
Getting a coronavirus test still depends heavily on what state you live in. (Colin Lecher, Maddy Varner, and Emmanuel Martinez / The Markup)
Facebook has canceled all in-person conferences through June 2021. Many others will follow — including most of the entertainment and professional sports industries, I imagine. (Queenie Wong / CNET)
Jeff Bezos said mass testing around the world for the coronavirus is needed to “get the economy back up and running.” In a letter to Amazon shareholders, he also announced plans to test all Amazon workers, even those not showing symptoms. You can read the full letter here. (Annie Palmer / CNBC)
Amazon is redesigning its website to encourage shoppers to buy less, in an effort to keep up with surging demand. It removed most of its popular recommendation widgets and canceled Mother’s and Father’s Day promotions. (Dana Mattioli / The Wall Street Journal)
Now that presidential campaigns are mostly happening remotely, reporters don’t have access to face-to-face interviews with swing voters or political operatives. “There is no more campaign trail,” says this author. (Michael M. Grynbaum / The New York Times)
The Anti-Defamation League warns white supremacists are targeting Jewish groups on Zoom. While the company has rolled out new security measures, the ADL says it still must do more to protect people. (Zoe Schiffer / The Verge)
Alphabet CEO Sundar Pichai talks to Time about the role he thinks Big Tech should play in the coronavirus pandemic. Another journalist sits down with Google’s CEO for an interview hoping for some news and comes away empty-handed. What’s the point? (See also.) (Nancy Gibbs / Time)
Across Asia, hackers, web developers, and students are collaborating to track COVID-19 data. A site in South Korea has become one of the country’s leading sources of accurate, up-to-the-minute tallies of confirmed infections and places where infected people have traveled. (Sheridan Prasso and Sohee Kim / Bloomberg)
In the quarantine age, Cameo has become the gig economy for niche celebs. Some are using it to compensate for a loss of income due to the pandemic. Others say they’re just bored. (Zach Schonfeld / Vice)
Across the games industry, developers are adjusting to a new work-from-home mandate. Some struggle to stay motivated as they grapple with isolation, others say their routines remain largely unchanged. (Megan Farokhmanesh / The Verge)
Total cases in the US: At least 662,441
Total deaths in the US: More than 30,000
Reported cases in California: 27,250
Reported cases in New York: 222,284
Reported cases in New Jersey: 75,317
Reported cases in Massachusetts: 32,181
Reported cases in Michigan: 29,119
⭐ How India, the world’s largest democracy, became the world’s largest experiment in social media and WhatsApp-fueled terror. The country has seen a sharp rise in right-wing Hindu vigilantism and violence against Muslims over the past few years. Here’s Mohammad Ali at Wired:
But what seemed very real was that even if social media platforms hadn’t created the mass delusions of Hindu extremism, they had provided a shockingly efficient infrastructure for their spread. India has 400 million WhatsApp users and 260 million users of Facebook, and it is the largest global market for both platforms. Facebook has come under heavy fire in India for uneven enforcement of its community standards against hate speech and misinformation. A 2019 report by the NGO Equality Labs found that Islamophobic posts often stayed up on the platform. In a particularly chilling example, Equality Labs found a huge number of Indian Facebook posts targeting Muslim Rohingya refugees from Myanmar, who had already been the victims of one social-media-fueled ethnic cleansing in their home country. The Indian pages called the Rohingya “cockroaches” and posted fake videos that purported to show them cannibalizing Hindus—clear violations of Facebook’s standards.
Joe Biden isn’t very good at the internet. That’s a liability heading into a general election with Donald Trump. (Kevin Roose / New York Times)
⭐ Libra is pulling back from its ambitions to create a global digital currency in a bid to appease global regulators. The Libra Association, set up last year by Facebook, now plans to develop a handful of stablecoins each representing a different fiat currency. I would be lying if I told you I understood what that means, or what Libra is, or why it is moving forward. Brady Dale at CoinDesk reports:
The pivot, announced Thursday, represents a major concession to governments and central bankers around the world who balked at Libra’s original plan, partly out of concern it could undermine their monetary sovereignty.
“The journey since the original white paper was released has really provoked an important conversation around the world about, ‘How do we appropriately regulate digital payments and digital currencies?’” Libra Association vice-chairman Dante Disparte said in an interview.
YouTube ad prices have dropped 20 percent during the pandemic. There’s simply more inventory than there are advertisers. (Max Willens / Digiday)
Tim Cook talked up Apple’s strengths during an all-hands meeting about the pandemic with employees. (Mark Gurman / Bloomberg)
Amazon might offer health insurance to its sellers as part of a gradual push into healthcare. (Jason Del Rey / Recode)
The United Nations backtracked on a deal made with Chinese tech giant Tencent to provide videoconferencing services for the organization’s 75th anniversary. The move follows backlash from US officials and critics who claimed the arrangement rewarded a company that has enabled Beijing’s digital surveillance efforts and stifled free speech. Tencent is a big investor in Snap, among other US companies. (Colum Lynch and Robbie Gramer / Foreign Policy)
Google is launching a new Kids section for the Google Play Store, which will offer a selection of “Teacher Approved” apps. Each app includes a list of reasons why teachers like it, including age-appropriateness, and what it’s trying to teach children. (Jon Porter / The Verge)
Google integrated its videoconferencing tool Meet with Gmail in order to better compete with Zoom. Now business and education users can take calls directly from the email app. (Paresh Dave / Reuters)
Verizon is buying the video conferencing platform BlueJeans as workers increasingly rely on remote working tools to connect during the coronavirus pandemic. Verizon will pay about $400 million in the deal. BlueJeans is Facebook’s internal video chat app of choice, for what it’s worth. (Lauren Feiner / CNBC)
TikTok amped up its parental controls with a feature that lets parents remotely set restrictions on their kids’ accounts. The new feature, called Family Pairing, allows parents to disable direct messages, turn on restricted content mode, and set screen time limits. (Jacob Kastrenakes / The Verge)
Animal Crossing fans are using the game to take people on Tinder dates. (Patricia Hernandez / Polygon)
Instagram’s favorite psychologist, Dr. Nicole LePera, is part of the #selfhealer movement that teaches people to reject established science and question traditional psychotherapy in order to heal themselves from within. That doesn’t always work! (Katie Way / Vice)
Things to do
Stuff to occupy you online during the quarantine.
Anonymously AirDrop a photo telling people to get further away from you if they’re not social distancing. Hilarious, although perhaps less effective than just telling them.
Listen to a new podcast from New York Times reporter and friend of the newsletter Kevin Roose about the internet. Among other things, Rabbit Hole will delve into how YouTube can radicalize viewers.