Facebook's editorial purge has completely backfired

Algorithms are flawed

It’s been a rough few days for Facebook’s Trending Topics. The company announced on Friday that it’s made the news list more automated by removing article descriptions and leaning more heavily on algorithms to produce it. The company also reportedly fired as many as 18 editorial contractors responsible for writing descriptions and ensuring the accuracy of the sources it uses. Now we’re watching the immediate — and disastrous — effects unfold.

First, there were the hilarious mix-ups. Yesterday a video of a dog reacting to seeing its owner for the first time in two years went viral. The top headline, from something called iHeartViral.com, told people they just had to watch it. And yet the Trending Topics module put the video under the headline Watch Dogs 2 — an upcoming video game about hacking and cybersecurity.

Facebook let a fake news story about Megyn Kelly sit in Trending Topics for hours

Yesterday, a more serious error occurred: a fake news story about Fox News anchor Megyn Kelly’s supposed secret affinity for Hillary Clinton blew up on Facebook and landed at the top of the Trending list. Not only did the engineers (or algorithms) responsible for Trending fail to realize the story was false — it came from a partisan libertarian source called End the Fed — but Facebook also left it in the Trending module for hours to collect likes and comments. All the while, the company unwittingly gave enormous exposure to a damaging piece of false information.

So over the course of a single weekend, Facebook’s attempts to reorient one of its most prominent news products backfired in a big way. In its efforts to rely more heavily on algorithms, the company exposed a central vulnerability in its approach to news. All it took was 72 hours, and a false news story in the heat of a polarizing US election season, to show how fallible its algorithms can be.

Facebook says humans are still involved in the story selection process for Trending Topics, but it’s never been more opaque about what role they play. Before, Facebook had a set of guidelines that instructed human curators how to analyze fast-moving news events, and then contextualize the coverage of those events on regional, national, and international scales. This involved closely watching mainstream sources like The New York Times, CNN, and The Washington Post. These contractors were also instructed to analyze headline styles, sourcing of quotes and facts, and to avoid sensationalism.

The changes instituted on Friday didn’t throw all of that away; Facebook has been slowly stripping away the human element of Trending Topics for months now. Rather, it marked the moment Facebook decided its algorithmic approach was more favorable, or perhaps more cost-effective and less damaging. But in shifting the reins to engineers, the company has minimized the kind of news judgment typically exercised by journalists and editors. Now, just a few days later, we’re realizing just how important that human element was.

Facebook is a news source for 44 percent of US adults

This goes beyond embarrassing stumbles. Facebook is a news source for nearly half of the overall adult population of the US, and a primary one for tens of millions of people around the world. Spreading misinformation is a common tactic for partisan political machines and big corporations alike. And it’s becoming increasingly easy to do so on social networks, in part thanks to changes like the ones Facebook is making.

As John Herrman wrote in The New York Times Magazine last week, Facebook has become populated by a "new and distinctive sort of operation that has become hard to miss: political news and advocacy pages made specifically for Facebook, uniquely positioned and cleverly engineered to reach audiences exclusively in the context of the news feed." At stake is more than just the spread of a hoax. It’s the possibility that partisan spin masters will game Facebook to shape political opinion with false or exaggerated claims.

Facebook says it stripped down Trending Topics to help scale it to more users around the world, something it can’t do when it has to pick headlines and write descriptions manually. But it’s hard not to believe that recent accusations of political bias have influenced its decision. In May, Gizmodo reported accusations from former contractors saying the Trending team routinely suppressed news from conservative sources. Those claims ignited a national conversation about Facebook’s role and influence in journalism. The company has spent the months since trying to repair its image.

Facebook is still reacting to allegations of liberal bias

"We looked into these claims and found no evidence of systematic bias," the company said in its statement on Friday. "Still, making these changes to the product allows our team to make fewer individual decisions about topics. Facebook is a platform for all ideas, and we’re committed to maintaining Trending as a way for people to access a breadth of ideas and commentary about a variety of topics."

The company has since come out and explained exactly what happened with the Megyn Kelly story. Apparently, the piece began gaining traction in the more conservative sectors of the site. It met the criteria Facebook uses to flag stories generating a lot of conversation on its platform, at which point a human not well versed in political news chose to pin it to the Trending list. Only upon re-reviewing the source did the Trending team realize the error.

"This was a mistake for which we apologize, and it has been corrected," Justin Osofsky, vice president of global operations at Facebook, said in a statement obtained by Erik Wemple at The Washington Post. "We’re working to make our detection of hoax and satirical stories quicker and more accurate." This statement offers more evidence for the idea that algorithms are far from neutral. They can neither detect satirical tones in headlines nor scan articles to determine where the source material is coming from.

So how should Facebook proceed? It could choose to eliminate the Trending list entirely. The feature has been a reliable source of PR headaches, despite having a relatively small footprint on the desktop version of the site. (It’s even less visible on Facebook’s mobile apps.) In the grand scheme of things, Facebook does not need Trending Topics. If it is concerned only with being a "platform for all ideas," as CEO Mark Zuckerberg says, then it may bear no responsibility for what its users share, like, or comment on.

Facebook could create a more personalized Trending Topics list to avoid future failures

The other, more sensible option is to improve its safeguards and maybe create a more personalized Trending list. Facebook could ask users to select the news sources they wish to read, and only show them those pages’ most popular items. It could also employ a small team of human curators to monitor the news and choose a few stories per day in sections like politics, sports, and entertainment. That’s similar to how teams at Apple and Flipboard maintain curated news apps. Perhaps then you could toggle between different versions of the Trending list, just as you’re able to switch the News Feed from "Top Posts" to "Most Recent."

One day the news might be delivered by algorithms so smart they understand the difference between news and opinion, and can exercise something approaching sound editorial judgment. Should that day arrive, Facebook really will be the king of news.

Until then, we’re left with algorithms performing a job they’re still ill-equipped to handle, with humans perpetually left to clean up the mess after the fact. Meanwhile, thousands of people are now convinced Megyn Kelly is voting for Hillary Clinton in November. If this is progress, it sure doesn’t look like it.
