
Will Google’s homepage news feed repeat Facebook’s mistakes?



This is exhausting; I’m exhausted



I woke up this morning to two pieces of news that I couldn’t help but see as two sides of the same coin: Brazil elected a man who looks very much like a demagogue and who weaponized social media to win, and Google finally placed a news feed on its mobile homepage.

These two things are, of course, not related. But after a weekend where we learned about two domestic terrorists who were radicalized by social media, all I could think about regarding Google’s new effort to push an algorithmic news feed was: “Hey Google, read the freaking room.”

We live in an era where we see real-world tragedies inspired by some form of awfulness on social media every day. We sometimes struggle to clearly define the online causal connections to these tragedies, but over the past few weeks, it doesn’t seem all that hard. So it seems like a pretty inopportune time for Google to decide to put yet another news feed in front of millions (or billions) of people. There has probably not been a time in 2018 when Google could have chosen to launch a new news feed that wouldn’t have made me feel this way, but this week seems particularly bad.

Apologies for using a question for the headline of this story, but I think it’s fitting because there are way more questions than answers right now. The “why now?” question is probably best answered in the most typical way of all tech companies: because even though Google’s news feed was announced in September, it’s ready to ship now. But the more fundamental “why” is harder. Google, like Apple and Facebook and Twitter and everybody else, apparently really wants to be in the news aggregation game. That is probably because it’s a pretty good way to get people to look at a webpage, and getting you to look at webpages is fundamentally how Google makes money.

But the more urgent question is “what does this mean?” Because, again, the world’s most trafficked webpage is getting a news feed, so I feel like it’s going to mean something. And so far as I can tell, trying to figure out “what does this mean?” just leads to more questions.

Before I enumerate them, here’s some context. Facebook has faded as a traffic driver to news sites, but The Verge and others have seen that traffic replaced with Google AMP, the company’s new standard for fast-loading webpages. With AMP, Google has become an even bigger traffic driver for news sites than it was before since websites that participate in AMP can appear in the carousel of featured stories that appear at the top of search results. Now, those stories will also appear in the Google news feed on the mobile search page. I suspect this will also drive an enormous amount of traffic.

This feed has already been available for quite a while: in addition to the newly relaunched Google News app, it lives inside Google’s mobile app, and it appears when you swipe left from the home screen on many Android phones. Those screens also drive traffic for news sites, but nowhere near as much as the AMP search carousels, from what we can tell.

The point of all this context is that putting the news feed on Google’s homepage is a Big Deal because it, I suspect, will supercharge an already existing trend: there is going to be a lot more traffic coming from Google’s algorithmic feed, which means there’s a lot more incentive to game the algorithm.

Google, to its credit, has comported itself fairly well lately when it comes to presenting the news in a responsible fashion: the Google News app is curated, has lots of visual indicators that explain why you’re seeing something, and little buttons to help you see less of it. But make no mistake, Google is taking on a huge responsibility by allowing machines to start displaying news stories on what might be the most popular website on the planet.

So: will Google continue to be responsible in its efforts to distribute news? Will it deepen our filter bubbles? Will it contribute to the radicalization of politics? Answering those questions is going to be devilishly hard because although the Google news feed is driven by algorithms that could make the same mistakes Facebook’s did, it is not fundamentally social.

The news feed is, as the company likes to sometimes say, “your own personal Google.” When Google put false information in its carousel, we were able to replicate the searches and discover the problem, but if Google’s news feed presents false information on someone’s search page, the only ones who will know it happened are the person reading it and Google. Facebook’s news feed is difficult to replicate, but individual stories on Facebook have Likes and share metrics displayed on them. There’s really no way to measure what’s going on with the Google news feed.

We may never know if Google’s news feed is causing problems because it’s so personalized and asocial

The mechanics of who sees what, when, and how matter a great deal. Just look at one of Google’s other algorithmically driven content feeds, YouTube. It is a haven for extremism and conspiracy, and YouTube makes it worse with next-video suggestions that can rapidly send you down a vile, dark hole. I haven’t seen many reports that Google News has had similar issues, but it’s worth noting that Google doesn’t have a sterling track record.

As we conduct literal postmortems on acts of terrorism and elections that went awry, we’re able to see many of the messages that people post on various social networks and track their spread and popularity. You can count the likes and the retweets, and you can graph the spread of information across platforms. Google’s news offering doesn’t have the risk of false information virally spreading, but it’s also a black box. If it causes problems, there’s a very real chance we’ll never know.

Truthfully, another question is whether Google will ever know. That’s the thing with advanced machine learning and AI: why the computer does what it does is not always understandable. So even if you imagine a scenario where Google is made to tell the world that it showed a user false information in the news feed, it’s possible that Google will not have a clear idea of why it did so.

I have other questions. Being Very Online lately has been exhausting. Trying to curate the information you’re taking in is hard and getting harder, and now, we will all have another feed of information to curate.

Here’s one very minor example: I’m trying to pay less attention to American football. My reasons are personal (concussions and NFL owners attempting to suppress free speech), but it’s taking a while for Google to get the message. It knows I was a huge Vikings fan, so it’s showing me scores and recaps. I can click the various buttons to tell Google I’m not interested, but I have to admit I still look sometimes. Now I worry that every tap, every scroll, every time I linger on a news story as I flip through my phone is sending the wrong signal to Google. It literally makes turning on my phone something I feel guilty about.

Now we all have yet another feed of information to curate

That’s just football, and it’s also the personal experience of a person who thinks a lot about these news feeds and is more informed than the average person about how and why they work. I have no idea what most people will think and experience and feel when they encounter yet another news feed. Probably nothing quite so fraught, but then again, who knows?

Maybe this is all fine. Maybe Google pushing harder on news is going to be, on the whole, a good thing. Maybe the company is heartfelt and serious about improving the quality and accuracy of the information we see online, and putting a news feed on its homepage is a powerful counterbalance to the trough of slop we’re getting from other sources. (By the way, that is what Apple is saying it’s doing with Apple News.) Or maybe I’m overthinking all of this, and nobody really goes to Google dot com anymore, and this isn’t that big of a deal.

That’s a lot of maybes, though. The worst maybe of all is that we will maybe never know the answers to any of these questions.

Disclosure: my wife works for Oculus, a division of Facebook. You can see my ethics statement here.