
Shoshana Zuboff on surveillance capitalism


‘We are not the users, we are being used’


Illustration by Alex Castro / The Verge

The Age of Surveillance Capitalism author Shoshana Zuboff considers whether “data is the new oil” and explains how data collection has fundamentally changed the economy and the way big companies interact with consumers. She breaks down how to define, understand, and fight surveillance capitalism.

You can listen to the discussion in its entirety on The Vergecast right now. Below is a lightly edited excerpt from this interview between Shoshana Zuboff and Verge editor-in-chief Nilay Patel about how surveillance capitalism works.

Nilay Patel: So an example of industrial capitalism — which I’m a fan of — is a company like Ford buying steel on the market. They turn the steel into a car. They sell the car. We’re taking one thing, we’re applying some labor, and we’re selling it in a market. There’s competition. You’re saying surveillance capitalism upends that relationship: you buy something that’s cheap or free, and you pay into it with your behavior, which then gets turned into behavioral data that is bought and sold on a market that you cannot see or participate in, so that other things are recommended to you or your decisions are somehow influenced.

Shoshana Zuboff: That’s correct. We thought that we were using surveillance capitalism’s free services. In fact, surveillance capitalism looks at us as free sources of raw material for its production processes. They call us users, but, in fact, they are using us as raw material for their production processes.

Because what they produce is recommendations?

What they produce are predictions. I call them prediction products. So what they’re selling into these futures markets are predictions of our future behavior. What is a clickthrough rate? Just zoom out a little bit: a clickthrough rate is nothing but a prediction of a piece of future human behavior. So now we have predictions about not just clickthrough rates, but what we will purchase in the real world and whether we will drive insurance premiums up or down, whether we will have a positive or negative effect on the insurance company’s bottom line.

We have predictions of health behavior we will engage in, predictions of what kind of driving behavior we will engage in, predictions of what kind of purchasing behavior we’ll engage in, predictions of where we will go, what we will do when we get there, who we will meet, what they will do when they get there, and so on and so forth. So all this activity — which started with grabbing our online private experience and turning it into behavioral data for prediction — this has now swept out of the online world into the real world so that it follows us with our phones, it follows us through other devices that increasingly saturate our environment, whether we’re in our car, or walking through our cities, or in our homes.

And this increasingly saturated environment is constantly creating data. There are complex ecosystems of players now, some of which do nothing but capture niches of behavioral data and then shunt them into these supply chains: pipelines that send the data to aggregators, then to machine learning specialists, and so forth. So these are complex ecosystems now with complex supply chains.

You know, The Wall Street Journal, to some fanfare, published a report just a few days ago about its investigation of a whole range of mobile apps that people use and to which they feed very intimate data. Some are health apps, some are fitness apps, some are apps about your menstrual cycle, and on and on. The Wall Street Journal discovered that most of these apps are taking that data and shunting it right into Facebook’s supply chains.

This is something that I write about in detail, of course, and it has been well known to the folks who research this closely for quite a while. We are living in the center of this ecosystem, and once you begin to wrap your mind around this, you understand that we’re not the users, we’re being used. You understand that it’s not free, we are free. Once you make that mental switch, I promise you that your perception changes in a fundamental way.

Yesterday, I got off the plane and I’m walking through LaGuardia Airport, and there is a space where everybody’s sitting at counters, you know, on these little stools. Everyone’s on their laptop waiting for their plane. I’m looking at this and thinking: we just don’t realize that we’re on our laptops feeding these supply chains. And much of the wealth that is amassed here, the surveillance capital that is accumulated, goes into a design effort to make sure that these mechanisms and methods are hidden from us, to make sure that we are kept in ignorance.

How do you mean? Could you give me an example of how it’s bypassing awareness?

An example is the Facebook contagion experiments which made a lot of headlines long before Cambridge Analytica hit the streets.

This is Facebook saying “We can make people feel bad if they look at this”?

Well, first they said, “Can we make people vote?” That research was published in 2012. And the idea was to use subliminal cues in your News Feed and on your Facebook pages, social influence and other kinds of subliminal cues, to see if they could actually produce more people casting real votes in the real world during the 2010 midterm elections. The conclusion was yes, they could. And when they wrote it up in a very reputable scholarly journal, the Facebook data scientists celebrated together with their academic colleagues who were part of this research. They celebrated two facts. One was that we now know we can spread a contagion in the online world that affects real-world behavior. And number two, they celebrated the fact that this can be done while bypassing the user’s awareness.

It’s all subliminal. We don’t know what’s happening to us. While the world was mobilized in outrage at the thought that Facebook unilaterally toyed with us this way — in what they call a massive-scale experiment — while we were in outrage, they were already putting the finishing touches on a second massive-scale experiment. And this one was to see if they could manipulate our emotional state with the same kind of methodologies, bypassing awareness, subliminal cues, and so forth. And of course, they discovered that they could. They could use subliminal cues in the online environment to make us feel either more happy or more sad.
