A new peer-reviewed study from researchers at New York University and the Université Grenoble Alpes in France finds that misinformation got six times as much engagement on Facebook as factual news, The Washington Post reports.
The study looked at posts from the Facebook pages of more than 2,500 news publishers between August 2020 and January 2021. Researchers found that pages that regularly post misinformation got more likes, shares, and comments. This increased engagement was seen across the political spectrum, but the study found that “publishers on the right have a much higher propensity to share misleading information than publishers in other political categories,” according to The Washington Post.
The researchers will present the study at the 2021 Internet Measurement Conference in November, but it could be released before then, researcher Laura Edelson tells The Verge.
Facebook maintains that only looking at engagement doesn’t tell the whole story
A Facebook spokesperson pointed out to the Post that the study only looks at engagement, and not “reach” — which is the term the company uses to describe how many people see a piece of content on Facebook, regardless of whether they interact with it.
Facebook does not make reach data available to researchers, though. Instead, researchers who want to understand and quantify the social media platform’s misinformation problem, including the authors of this study, have often turned to a tool called CrowdTangle, which is owned by Facebook.
But in August, Facebook cut off this group of researchers’ access to this data (as well as to the library of political ads on the platform). Facebook said that continuing to give third-party researchers access to the data could violate a settlement with the Federal Trade Commission that it entered into following the Cambridge Analytica scandal — a claim the FTC said was “inaccurate.”
CrowdTangle is the tool that New York Times tech columnist Kevin Roose used to make regular lists of posts that got the most engagement on Facebook — a practice that reportedly riled up top employees inside the company, because the lists were regularly dominated by right-wing pages that post a lot of misinformation.
In an effort to bat down claims that misinformation is a problem on Facebook, the company released a “transparency report” in August that laid out the most-viewed posts on the platform during the second quarter of the year, from April to June. Just days later, though, The New York Times revealed that Facebook had scrapped plans to release a report covering the first quarter, because the most-viewed post between January and March was an article that wrongly linked the coronavirus vaccine to a Florida doctor’s death — a post that many right-wing pages used to sow doubt about the efficacy of the vaccines.