Exclusive: Facebook says it can’t serve a billion people if it’s biased about the news

Why editorial efforts at the company aren't likely to expand

This week’s controversy over how Facebook programs its Trending Topics section has highlighted the company’s growing power over the distribution of news — and the fear that it might one day use that power toward partisan ends. Facebook denied a Gizmodo report in which an anonymous former contractor said he was told not to feature conservative-leaning topics inside the widget. But the revelation that Trending Topics were chosen with editorial input raised new questions about Facebook’s relationship with journalism. Is it a neutral platform, as it has long asserted, or is it gradually developing a set of editorial principles?

Thursday marks the first anniversary of Instant Articles, Facebook’s quick-loading news format. Before the controversy over Trending Topics unfolded, I had pitched Facebook on a discussion about both Instant Articles and the company’s thinking around editorial values. In response, the company invited me to its headquarters in Menlo Park to meet with Will Cathcart, director of product management for the News Feed. It is arguably one of the most important jobs at the company: overseeing development of its most used — and most profitable — product. And while the company has not made any data available, it seems clear that the News Feed drives far more traffic to news stories than the lower-profile Trending Topics, which is all but invisible on mobile devices.

"We're not interested in adding our point of view"

One of the more surprising things Cathcart told me is that the News Feed and Trending Topics are run by separate teams. News Feed drives the majority of consumption on Facebook, and while humans write the algorithms that power it, they don’t weigh in on individual topics. Trending Topics, on the other hand, is run by the company’s search team and is meant to take the pulse of current events. But on both fronts, Cathcart said, Facebook strives for neutrality. "We want this to show you what you’re most interested in," said Cathcart, who has worked at Facebook since 2010. "We’re not interested in adding our point of view — we actually don’t think that works for a billion people."

Facebook critics say the company is naïve for asserting that it can deliver the news without bias, since its algorithms are written by people. Cathcart told me that at a high level, the product does have a point of view: it privileges posts that readers are likely to find "meaningful" over ones that are merely recent, as Twitter (mostly) does. It highlights posts that it believes to be "informative" in addition to those that are merely "entertaining," because the data suggests that users prefer it that way.

But Facebook’s reach extends to so many people, countries, and languages that building human-curated editorial products would not be feasible, Cathcart says. Facebook wants the News Feed to feel like it has been tailored to your unique set of interests. "We think about what we’re trying to do in a really personalized way," he says. "And so I think if you’re trying to build a product for over 1 billion people to be informed about the news that they care about, you can’t really be building a product that has judgments about particular issues, or particular publishers. Because that doesn’t match what 1 billion people around the world want." Trending Topics are less personalized by design — they’re meant to reflect activity across the platform — but Facebook sees that as a math problem rather than one to be solved with editorial judgment.

"You can't be building a product that has judgments about particular issues."

I asked Cathcart whether, five years from now, Facebook is still likely to have editorial teams making the sorts of judgments currently made in Trending Topics. "We care about creating the product that people want," he says. "Whether or not we can do that entirely with automated systems, or it’s helpful to have people help, is actually just a detail." Cathcart played down the editorial influence of the current team, suggesting its primary role was bug-fixing — ensuring that related topics appear together as a single item, for example, or preventing the keyword "lunch" from trending every day around noon.

But what about cases where people are obviously wrong? If 100 million people are posting, falsely, that Barack Obama was born in Kenya, does Facebook have a role to play in stopping that? "I think you already see that happen on the platform today," Cathcart says. "It doesn’t have anything to do with us — people post a lot of this stuff and talk about it, and other people post different points of view. And the nitty-gritty of the details of how we should be involved I actually think is less important than building a platform where if people want to talk about that, it’s really easy to talk about that and find different points of view."

In the wake of Gizmodo’s story, the US Senate called on Facebook to provide more transparency into how it selects stories to be featured. I asked Cathcart how Facebook views those calls. "The general thing we’ve done over the past couple of years, in part in response to these questions and criticisms, is to try to be more transparent about how the News Feed works in general," he says. "We talk about the algorithms we use and why we use them, and when we change them we talk about why. [We do] a bunch of whiteboard sessions where we go through, here are the top things that the algorithm looks at, and [we do] a blog post every time we make a substantive change to the News Feed. I think we’re trying to be a lot more transparent about it, without being able to point to, for every specific person’s News Feed, what were all the decisions that went into it."

To read the rest of my discussion with Cathcart, click here.