Earlier this week, The Daily Beast published a story with a counterintuitive finding. As best as researchers could tell, there was no evidence that Russia is trying to interfere in the US midterm elections:
Russian social media trolls are, of course, still promulgating fake news and slapping frantically at America’s hot buttons—tweeting wildly in favor of Brett Kavanaugh’s confirmation, according to researchers, and pushing a counter-protest against last summer’s white supremacist Unite the Right 2 rally. The GRU is still hacking into computers in the U.S. and everywhere else. But so far, Russia-watchers say the trolls haven’t delved into the nitty gritty of 35 Senate campaigns and 435 House races. Nor has the GRU engineered the type of damaging email dumps that tent-posted the 2016 election circus.
There’s still plenty of disinformation to be found online, of course. And according to Facebook, much of it is homegrown:
Today, we’re removing 559 Pages and 251 accounts that have consistently broken our rules against spam and coordinated inauthentic behavior. Given the activity we’ve seen — and its timing ahead of the US midterm elections — we wanted to give some details about the types of behavior that led to this action. Many were using fake accounts or multiple accounts with the same names and posted massive amounts of content across a network of Groups and Pages to drive traffic to their websites. Many used the same techniques to make their content appear more popular on Facebook than it really was. Others were ad farms using Facebook to mislead people into thinking that they were forums for legitimate political debate.
As Sheera Frenkel reported in the New York Times, this was the largest purge of domestic bad actors on Facebook to date. But while we often focus on foreign efforts to disrupt our elections, the bulk of the influence campaigns, as you would expect, originate here. “If you look at volume, the majority of the information operations we see are domestic actors,” Nathaniel Gleicher, Facebook’s head of security, told Frenkel.
What kind of information operations were these people running? Frenkel traces the rise of Right Wing News, which had 3.1 million followers, and was recently posting false stories stating that lawyers for Christine Blasey Ford had been bribed in connection with her testimony about Supreme Court nominee Brett Kavanaugh.
Often, though, the discourse on these pages was within the mainstream of political discussion, even if hyper-partisan in nature. Lizza Dwoskin and Tony Romm have more examples in the Washington Post.
The domestic pages and accounts Facebook removed Thursday had a strong political bent. Nation In Distress, which claimed to be an early Trump supporter, recently shared a link to a story that had called Rep. Maxine Waters “demented.” Founded in 2012, it had amassed more than 3.2 million likes and over 3 million followers, as of Thursday morning, before it was taken down. The page linked in its “about” section to a website called “America’s Freedom Fighters,” which posted content and duplicated press releases that appeared to be written by others about violent crimes and gun rights, all alongside a sidebar of ads. An administrator for the site declined to comment.
On the opposite end of the political spectrum, Reverb Press posted attacks on President Trump and referred to Republicans as “cheating scumbags” to its over 700,000 followers. A second left-leaning page, Reasonable People Unite, posted a screenshot from a Twitter user who said, “Somewhere in America, a teenage girl is listening to her parents defend Brett Kavanaugh and she is thinking to herself, if something like that happens to me, I have nowhere to go.”
Administrators for these pages, as you might expect, have complained that they did nothing wrong. But their pages weren’t purged for posting misinformation: they were purged for violating Facebook’s policies around “inauthentic behavior,” which can include using fake accounts to administer pages. If there’s less misinformation on Facebook tomorrow than there was yesterday, it’s essentially due to a technicality. If the administrators of these hyper-partisan pages would simply stop creating fake accounts, misinformation might thrive on Facebook as never before.
Meanwhile, by the end of the day, Russia was again making headlines for its abuses of Facebook. The company said it had disabled “dozens” of accounts and profiles belonging to a Russian company known as SocialDataHub. And what did SocialDataHub do? According to Reuters, it scraped public and private data to provide “state services with the means to identify people by analyzing social media users’ photographs.” Perhaps there’s a benign explanation for what Russian state services hope to do with illicitly obtained Facebook data, but it’s hard to imagine what that might be. The company’s CEO, Artur Khachuyan, reportedly compared the firm’s work to that of Cambridge Analytica.
CNET got in touch with Khachuyan, who promised them that everything the company is doing is on the up-and-up. If you can make heads or tails out of any of this, let me know. (I can’t.)
“No one just downloaded Facebook profiles, especially the data of citizens of other countries, except Russia,” Khachuyan said. “In Russia, such work is permitted by federal law No. 152 (this is analogous to the GDPR).”
Khachuyan also said Fubutech, an affiliated business that works with governments, is developing scraping technology for government clients but doesn’t scrape the data itself. Finally, he said that he teaches journalism courses that include using scraped data. Though the classes are focused on using public data from governments, some students scraped social media data, a Social Data Hub representative said.
Just another day on the world’s biggest social network. There are 25 days until the midterm elections.
I like Sen. Mark Warner, but so much of the Virginia senator’s act seems to be loudly harrumphing about matters in which he exerts little influence. But perhaps the Democrats will retake Congress someday — someday soon, even! — and so I will continue reading his interviews. He talked to Franklin Foer in a story published Thursday:
WARNER: I saw from my three girls, from my own family’s basis, the almost addictive tendencies of devices. I saw how an unedited public square often allowed extremist voices to connect with other extremists in a way that multiplied their volume, far beyond what they actually represent. I saw the proliferation of bots manipulating political conversation. And then there was also a point when I got pretty pissed off. I was seeing evidence of foreign intervention on the social-media platforms. I’ve met with Mark Zuckerberg a half-dozen times, decent enough guy, but there was such an arrogant kind of response to complaints: “Anybody who’d say that doesn’t understand.”
Makena Kelly looks at Congress’ mounting interest in the Google+ case:
“Particularly in the wake of the Cambridge Analytica controversy, consumers’ trust in the companies that operate those services to keep their private data secure has been shaken,” today’s letter reads. “At the same time that Facebook was learning the important lesson that tech firms must be forthright with the public about privacy issues, Google apparently elected to withhold information about a relevant vulnerability for fear of public scrutiny.”
ProPublica highlights some instances in which Facebook failed to properly label political ads as such.
Here’s another distressing case of a fake-news law being used to suppress dissent. (Fathy reported being sexually assaulted by police officers.)
Five months after her initial arrest, Fathy has become the latest in a growing list of Egyptian dissidents prosecuted under the government’s “fake news” laws. The actress turned activist faces two years in prison and a fine of $562 for speaking out on sexual harassment and criticizing the government.
“This is an outrageous case of injustice, where the survivor is sentenced while the abuser remains at large,” Najia Bounaim, Amnesty International’s North Africa campaigns director, said in a statement. “It is currently more dangerous to criticise the government in Egypt than at any time in the country’s recent history. Egyptians living under President al-Sisi are treated as criminals simply for peacefully expressing their opinions.”
Here’s an extremely telling media-in-2018 story, courtesy of this month’s NewsWhip rankings of top publishers. The No. 1 publisher is Unilad, with 32 million engagements. And what has it gotten for all those engagements? Well, it has been “put into administration,” a kind of UK version of bankruptcy. But hey, congratulations on all those likes and shares.
Will Oremus recently tried to explain the phenomenon of “lad” content on Facebook:
To American eyes, the dominance of sites such as LADbible and Unilad on Facebook might be as surprising as it is disconcerting—particularly since Facebook has very publicly vowed to improve the quality of stories that perform best in its news feed. These sites’ presence at the top of NewsWhip’s list prompted head-scratching even from seasoned pundits, such as FiveThirtyEight Editor-in-Chief Nate Silver.
But LADbible and Unilad are as familiar to blokes in England as they are foreign to the U.S. media elite. There isn’t a perfect U.S. equivalent, but imagine Upworthy rewritten by the Barstool bros, or a whole website comprised of “Around the Web” headlines, and you’ll get a decent picture. Once known for blatant misogyny, both sites embarked on an effort to tone it down a few years ago, with LADbible earning profiles by BBC News and BuzzFeed in the process. Nowadays they mix celebrity gossip and feel-good stories with hastily rewritten press releases, the occasional outrage-bait, and sundry viral fluff. Also: lots and lots of dogs.
Please — hasn’t Taylor Swift suffered enough already?
Taylor Swift has become the focus of partisan memes and fake news after she endorsed a Democratic candidate in Tennessee in an Instagram post, and encouraged her fans to register to vote.
Swift’s post caused partisan news sites and Facebook pages, as well as at least one hoax creator, to kick into high gear in an attempt to generate engagement and revenue through posts and memes.
You may remember when Facebook introduced Portal earlier this week, and journalists all said a variation of “not now, Facebook!” In response to those concerns, Facebook put its head of health research, Freddy Abnousi, on stage asking “for large-scale access to more granular data on patients’ social and behavioral characteristics,” months after a similar project was shut down as part of the Cambridge Analytica fallout:
Abnousi advocated for large-scale access to more granular data on patients’ social and behavioral characteristics, which he said far outweighed the three other key factors impacting mortality rates: genetics, exposure to risks such as asbestos and access to quality health care. He didn’t specifically call for using Facebook or Instagram user data for these purposes.
“The primary driver of health outcomes in the United States are social and behavioral variables,” he said. “Really understanding what these social determinants of health are should be our primary area of focus.”
Founder Andrey Andreev says Bumble could go public. If so, it could provide a counterweight to the Match Group, which tends to snap up every new dating app as soon as it reaches a critical mass. The most amazing detail in this story is that while Bumble will be the brand post-IPO, it’s technically a subsidiary of something called Rimberg International.
Jessica Toonkel has a fun piece about how former MTV executives have flocked to tech media companies, including Facebook, YouTube, and Spotify. Among others, Facebook’s vice president of global marketing, Carolyn Everson, is an MTV alum.
Investors read Sarah Friar’s departure for Nextdoor as a bad sign for Square, where she is assumed to be doing lots of day-to-day management while her boss, Jack Dorsey, focuses on Twitter. But there was also a little-discussed upside here, which is that one of the big social networks now has a woman as CEO. About time.
Oculus co-founder Palmer Luckey revealed yesterday that he was fired, though he didn’t say why. I think I know why, but I only have one source! So if you’ve heard something, get at me.
I meant to highlight this on Tuesday, but Dani Deahl has a retrospective on the truly staggering number of ways Google tried to enforce Google+ usage:
In 2011, the Google Bar was a black strip that ran across the top of the screen on all of Google’s web properties. That quickly shifted into a more clean integration with a drop-down menu attached to Google’s logo that let you switch between things like Maps, YouTube, and search. No matter how it looked though, one thing remained the same: it prioritized sharing content with Google+ and came with a notifications icon that let you know whenever there was new activity on your Google+ account.
Your Facebook Group can now launch a giant group chat to talk about the rare disease that is slowly killing you. Or something else! You get to choose.
Facebook introduced 3D photos, which it described as “a new way to share your memories and moments in time with a fun, lifelike dimension in both News Feed and VR.”
Peter W. Singer and Emerson Brooking have a new book out, LikeWar, about our troubled information ecosystem. Foreign Policy has an excerpt:
Such online skirmishes may appear insignificant compared with real fights conducted with real weapons, but they have become just as important. As Gen. Stanley McChrystal, the highly decorated former commander of Joint Special Operations Command, stated at a military conference in 2017, for the foreseeable future what happens on social media will be crucial to the outcome of any debate, battle, or war. The reason, he explained, is that battles are now being waged over truth itself. In these fights, “the line between reality and perception will be blurred,” he said. “Separating fact from fiction will be tough for governments but almost impossible for populations.”
Here’s a good little thread from a disinformation researcher about why authoritarian governments have an easier time with the internet than democracies. You can probably intuit most of it, but I found it useful to see Ovadya spell it out:
“I’ve been thinking recently about what I call ‘ungovernable spaces.’ Online environments like fully encrypted chat. Places where everyone has full autonomy — where governance can even be *mathematically* impossible, or at least only possible at the boundaries.” — Aviv Ovadya (@metaviv), October 10, 2018
And finally ...
I mean, on one hand, I don’t believe this. Jack Dorsey is a very rich man, and he runs two companies, and surely he has a computer. Or, like, access to a computer. That said, his official answer to the question “do you use a computer?” is: “no.”
I don’t like to tell people how to live their lives when it comes to computers, but I think it’s something Dorsey should consider. The CFO of Square just quit. Maybe use hers!
Talk to me
Send me tips, comments, questions, and domestic disinformation campaigns: email@example.com.