Facebook admits its trending news algorithm needs a lot of human help

Company denies censoring conservative topics

Sean O'Kane

Facebook is starting to open up about how a story starts trending, following accusations that it's been censoring conservative topics. In a post today, Facebook search VP Tom Stocky explains that human editors actually have a key role in determining what gets placed in the News Feed's Trending Topics box. "Reviewers are required to accept topics that reflect real world events, and are instructed to disregard junk or duplicate topics, hoaxes, or subjects with insufficient sources," Stocky writes.

On one hand, that means Facebook's editors have a blanket mandate to vet any topic its algorithm identifies as trending; this is meant to keep biases from slipping in and to accurately represent visitor interest. On the other, it means those same editors have the discretion to decide whether a topic is worthy or "junk."

"We take these reports extremely seriously"

That sounds like an easy enough job — we've all seen plenty of viral stories on Facebook we'd consider "junk" — but it inherently gives Facebook's editors a meaningful degree of influence over what starts trending. They're also required to confirm whether a story appears sufficiently truthful. At a minimum, this should help prevent the many fake news stories that still plague Facebook from being shown to millions of people.

But taken together, it's an admission that Facebook's algorithms aren't perfect; what we're looking at is a cleaned-up version of the algorithm's output, filtered by humans. That's what opens the door to accusations of censorship, like those Gizmodo reported yesterday. Stocky's explanation today backs up much of what Gizmodo reported last week in a look at the inner workings of Facebook's Trending Topics.

Facebook is now explicitly denying that any of its Trending Topics editors prevented conservative topics from appearing. "We take these reports extremely seriously, and have found no evidence that the anonymous allegations are true," Stocky writes. He says that Facebook does not permit discrimination "against sources of any ideological origin" and that editor actions are reviewed.