
Facebook just gave us one more reason never to trust it

User testing gone too far, again


The past several years have done wonders for Facebook’s once-murky public reputation. After a series of privacy-related controversies and a rocky debut on the stock market, the company embraced mobile and quickly came to dominate the time we spend on our devices. More than a billion of us use its apps every day, and they have come to serve as a vital connection between family and friends. As a result, Facebook’s stock increased by a third last year, and revenue was up 40 percent in the last quarter.

Flush with cash, the company has invested heavily in philanthropic efforts — both through internet.org, the company’s controversial effort to connect the globe, and through CEO Mark Zuckerberg’s personal charities, which are set to give away billions of dollars. All of which has given the company a newly articulated sense of mission, which it promotes aggressively in all its public communications: to make the world more open and connected.

Selectively disconnecting the world for hours at a time

Fascinating to learn, then, that the company has been selectively disconnecting the world for hours at a time. In a report for The Information, reporter Amir Efrati details the various steps Facebook is taking to prepare for the possibility that Google one day removes Facebook’s apps from the Play Store for competitive reasons.

Facebook has tested the loyalty and patience of Android users by secretly introducing artificial errors that would automatically crash the app for hours at a time, says one person familiar with the one-time experiment. The purpose of the test, which happened several years ago, was to see at what threshold a person would ditch the Facebook app altogether. The company wasn’t able to reach the threshold. "People never stopped coming back," this person says.

It’s important to highlight that this was apparently a one-time test that happened years ago. Facebook declined multiple requests for comment. But the revelation comes on the heels of the 2014 controversy in which the company altered the content of the News Feed to determine its effect on users’ moods. In part, that meant showing some users a barrage of sad or upsetting posts to see whether it made them less likely to visit Facebook. The study, which "creeped out" even the editor of the journal that published it, cast Facebook in an unusually negative light. Collecting demographic data about us was one thing. But trying to make us feel sad?

Trying to make us feel sad

Now comes the news that Facebook deceived users into thinking its Android app was broken, to see whether they would abandon the service or simply switch to the inferior mobile website. Former Facebook data scientist JJ Maxwell defended the move to me, saying on Twitter that such tests are "hugely valuable" to the company. "Their prerogative," he said. Maxwell likened it to Walmart removing parking from one of its stores to test the effect of varying levels of parking on sales.

But Facebook isn’t a big-box store. It presents itself as a global mission to connect the world — a mission that just happens to operate a lucrative advertising business. "We’re looking for new opportunities to … open up new, different private spaces for people where they can then feel comfortable sharing and having the freedom to express something to people that they otherwise wouldn’t be able to," Zuckerberg told analysts in 2014. But how comfortable can you feel sharing inside a world that its executives treat as a private zoo and research facility? One that they’re willing to shut down on a whim?

Of course, to some extent every tech company is a research facility, and we users are their unwitting subjects. A/B testing offers companies valuable insights into how we use their products, and gives them meaningful data about how to improve them. Most such tests are innocuous — Google famously tested 41 different shades of blue on its home page to see which people preferred. At The Verge, we’ve recently begun testing multiple headlines.

But it’s one thing to see which shade of blue leads to more queries, and another to break your own app for hours at a time. You can’t position your company as the place to declare your safety in the wake of terrorist attacks while, at the same time, selectively disabling access to your own service.

Software operates totally beyond our understanding

Volkswagen’s emissions scandal last year confronted us with the scary fact that most of the software we use daily operates totally beyond our understanding. As customers, we expect the products we use to be made and operated in good faith. Unlike with Volkswagen, there’s nothing illegal about what Facebook reportedly did with its Android app. But users are almost totally unaware of these experiments. And if they do eventually find out about them, they can't really leave — because there's simply no other meaningful Facebook-like service in the market. That gives the company a moral imperative to treat its users honestly.

Perhaps this experiment was controversial inside Facebook itself; that would help explain why it was apparently a one-off test. Maybe the company’s ethical standards have evolved over time. What’s troublesome is that we simply don’t know, because Facebook itself won’t say. If the company wants us to take it seriously when it promotes its mission, it will need to be much more transparent — and trustworthy. It has a long way to go.