Facebook’s newest measure to combat disinformation and misuse of its services is a push to rid the social network of malicious, potentially state-sponsored “information operations.” The company, which published a report on the subject today, defines these operations as government-led campaigns — or those from organized “non-state actors” — to promote lies, sow confusion and chaos among opposing political groups, and destabilize movements in other countries. The goal of these operations, the report says, is to manipulate public opinion and serve geopolitical ends.
“Our mission is to give people the power to share and make the world more open and connected. Yet it is important that we acknowledge and take steps to guard against the risks that can arise in online communities like ours,” write authors Jen Weedon and William Nuland, members of Facebook’s Threat Intelligence Team, and Alex Stamos, Facebook’s chief security officer. “The reality is that not everyone shares our vision, and some will seek to undermine it — but we are in a position to help constructively shape the emerging information ecosystem by ensuring our platform remains a safe and secure environment for authentic civic engagement.”
The actions go beyond the posting of fake news stories. The 13-page report specifies that fake news can be motivated by a number of incentives, but that it becomes part of a larger information operation when it’s coupled with other tactics and end goals. Facebook says these include friend requests sent under false names to glean more information about the personal networks of spying and hacking targets, the boosting of false or misleading stories through mass “liking” campaigns, and the creation of propaganda groups.
The company defines these actions as “targeted data collection,” “false amplification,” and “content creation.” These tactics line up with the supposed operations of Russia, which is believed to have used fake accounts and groups to spread the WikiLeaks data dumps regarding Hillary Clinton and the Democratic National Committee during the run-up to last year’s US presidential election. Facebook does not mention either Russia or WikiLeaks, but it does say its data “does not contradict” the findings of the Director of National Intelligence and Department of Homeland Security, which in January officially blamed Russia for the hacking of the DNC.
Facebook plans to target these accounts by monitoring for suspicious activity, like bursts of automated actions on the site, and banning offending accounts en masse. The company is also increasing its protections against the manual creation of large numbers of false accounts and leaning on machine learning techniques to spot other abusive and telling behavior. For instance, Facebook targeted 30,000 fake French accounts prior to the country’s presidential election earlier this month using similar methods.
The report often attributes this behavior to one side or another in the geopolitical landscape, like one candidate or political party versus another, or one country versus a known enemy state. However, Facebook is also seeing activity on both sides, suggesting that independent groups of bad actors are interested simply in causing chaos using social media. “The directors of networks of fake accounts may have a longer-term objective of purposefully muddying civic discourse and pitting rival factions against one another,” the authors write.
The report is far from a refutation or reversal of CEO Mark Zuckerberg’s claim in November that his company did not sway the US presidential election. It is, however, part of an ongoing and tacit admission from the chief executive that Facebook is now so powerful and influential that it can be used in ways outside his control. Since the beginning of the year, Zuckerberg has been softening his stance on the political influence of Facebook, and he’s gone to great lengths to try to reconcile his company’s global vision with the reality of how Facebook is being used around the world.
This culminated in Zuckerberg’s 5,800-word manifesto posted in February that outlined his aim to use Facebook as a way to build the “social infrastructure for community.” Whether that mission can be achieved — and whether Facebook can truly be a productive force for civic engagement — with the current iteration of Facebook will be the company’s most important test moving forward.
Still, it appears Facebook is coming around to realizing that it’s going to take a significant effort from all sectors of the company to combat both fake news and the broader “information operations” of bad actors around the globe. The company has already kicked off a mission to curtail the spread of disinformation that exists purely to churn up clicks or drive up revenue. It’s doing so through a mix of fact-checking partnerships and tools to better educate users on what’s real and what’s deliberately false or misleading. Now, though, it seems Facebook is taking it one step further and getting into the realm of identifying and fighting government campaigns to misuse its services.