This week’s news that a Russian firm linked to the Kremlin bought ads during the US presidential election came as a surprise. Facebook found thousands of ads, placed at a cost estimated at $100,000, generated by accounts tied to a St. Petersburg company called the Internet Research Agency. The impact of Russian political ad spending, which may have violated US law, is devilishly hard to sort out. While $100,000 can buy thousands of ads, the Trump campaign alone spent an estimated $90 million on digital advertising. At the same time, the disclosure appears to represent tangible evidence of Russian interference in the election.
So, how much did Russia’s political ad buys matter? The people in the best position to answer this question work at Facebook, which alone has access to the relevant data. But Facebook’s inconsistent statements, its history of errors in reporting on its own ad platform, and its reluctance to share relevant data about Russian hacking have added to its credibility gap.
Some have called Facebook “brave” for disclosing the ad buys. “They probably could have buried this, and they did the right thing by coming forward,” Clint Watts, a senior fellow at the Foreign Policy Research Institute, told The Washington Post. But beyond a blog post yesterday and some tersely worded statements to reporters, Facebook has done little to put Russia’s purchase of political advertising in perspective. A foreign country’s use of Facebook’s vaunted targeting capabilities in an effort to sway a close election is deeply disturbing. And yet Facebook, in its public statements so far, has been coy about sharing details.
“The vast majority of ads run by these accounts didn’t specifically reference the US presidential election, voting or a particular candidate,” said Alex Stamos, Facebook’s chief security officer, in a blog post. And while that may be true, there’s room to remain skeptical.
Until this week, Facebook’s official position was that the company was unaware of any cases of Russian agents buying ads designed to influence US politics. As recently as July 20th, more than eight months after the election, the company said, “We have seen no evidence that Russian actors bought ads on Facebook in connection with the election.” It was the latest in a long series of statements about its role in the election that the company has later disavowed. In the election’s immediate aftermath, CEO Mark Zuckerberg famously said the idea that misinformation spread on Facebook had influenced the election was “crazy.”
Then on Wednesday, the company told congressional investigators that it had found thousands of suspicious ads. As reported by the Post, Facebook found 470 likely fraudulent profiles and pages, which it linked to the promotion of 3,300 ads designed to sway public opinion in America. The most likely culprit behind the ads was the Internet Research Agency, which Facebook described as a “troll farm” that is known to have promoted pro-Kremlin propaganda.
Facebook would likely say it has made a good-faith effort to tell the truth about its role in the election as it understood it at the time, and that its statements have evolved along with that understanding. (The company didn’t respond to requests for comment.) It has taken several steps to stop the spread of misinformation on the platform, such as bringing on third-party fact-checkers and allowing them to mark hoaxes as “disputed.” But the company’s ongoing naïveté about its influence continues to generate concern — as does the fact that it took a congressional investigation for the truth about Russia’s political advertising to come out.
Independently of the Russia inquiry, Facebook has suffered a series of embarrassing revelations this year about its advertising metrics, requiring the company to apologize at least three times. In one case, it admitted to overestimating the average viewing time for video ads for two years; in another, it admitted to inflating the number of visitors to businesses’ pages. Another glitch resulted in some advertisers being overcharged. On the same day as the Russian ad buy revelation, an analyst said Facebook overstated the number of people its ads reach by at least 10 million, using US Census data as a reference point. (Facebook said the estimates “are not designed to match population or census estimates.”)
On one level, the glitches have little to do with Facebook’s ability to track the purchase of ads by Russian agents. But they suggest a sloppiness with data practices that places anything Facebook says about ad-buying efforts — the number of ads bought by fake pages, for example — under a cloud of suspicion. The fact that Facebook initially found no evidence of Russian involvement in the ad-targeting scheme has lessened the credibility of its subsequent disclosures.
Finally, and perhaps most consequentially, there’s Facebook’s reluctance to share relevant information about political advertising on the platform. The company refused to release any of the Russian ads it found, saying that doing so would violate its data policy and federal law. (It did not specify which law. In fact, political ads that air on television and radio are required to be made available for public inspection.)
The reluctance to share the ads generated criticism from academics, advocates of open government, and Democratic Sen. Mark Warner of Virginia, who said the Russian ad disclosure suggested that new regulations are needed. On Thursday, Federal Election Commissioner Ellen L. Weintraub called for regulations that “ensure that the American people know who is paying for the internet political communications they see.” A hearing is scheduled, and Weintraub is seeking comment from the public and from internet companies including Facebook.
The company has faced sustained criticism of its so-called “dark ads,” which advertisers can serve to individual users without leaving any trace behind. There are no permanent links to the ads, which appear in users’ News Feeds, and they disappear once users thumb past them. As Erika Franklin Fowler, director of the Wesleyan Media Project, told BuzzFeed: “If candidates (and outside groups) can say different things to different voters, it is harder to hold them accountable for campaign promises.”
Facebook is not the only platform that deserves scrutiny for its role in the election. A meticulous investigation by The New York Times found that bots and sock puppet accounts on Twitter played a large role in promoting criticism of Hillary Clinton and spreading pro-Kremlin misinformation. The company is expected to face congressional questioning of its own sometime soon. (Google, for its part, says it has still found no evidence that Russian agents purchased political ads designed to sway the presidential election.)
Now, nearly a year after the election, it seems clear that Facebook played a greater role in the election’s outcome than any other tech platform. At first, attention was focused on how bad actors used Facebook to achieve massive distribution for hoaxes and misinformation. Now, scrutiny is shifting to the company’s advertising platform — the core of Facebook’s business, where it has consistently misreported relevant data.
As the investigations move into their next phase, Facebook owes us a more complete accounting of how its targeting features and dark ads have been used to manipulate public opinion. And to the extent that those features are being used illegally, it owes us at least as much action as it has taken to stop the spread of hoaxes. But first, Facebook has to tell us the whole story. And so far, that’s been all too difficult to come by.