Google’s Selfish Ledger is an unsettling vision of Silicon Valley social engineering

This internal video from 2016 shows a Google concept for how total data collection could reshape society

Google has built a multibillion-dollar business out of knowing everything about its users. Now, a video produced within Google and obtained by The Verge offers a stunningly ambitious and unsettling look at how some at the company envision using that information in the future.

The video was made in late 2016 by Nick Foster, the head of design at X (formerly Google X) and a co-founder of the Near Future Laboratory. The video, shared internally at Google, imagines a future of total data collection, where Google helps nudge users into alignment with their goals, custom-prints personalized devices to collect more data, and even guides the behavior of entire populations to solve global problems like poverty and disease.

When reached for comment on the video, an X spokesperson provided the following statement to The Verge:

“We understand if this is disturbing -- it is designed to be. This is a thought-experiment by the Design team from years ago that uses a technique known as ‘speculative design’ to explore uncomfortable ideas and concepts in order to provoke discussion and debate. It’s not related to any current or future products.”

All the data collected by your devices, the so-called ledger, is presented as a bundle of information that can be passed on to other users for the betterment of society.

Titled The Selfish Ledger, the 9-minute film starts off with a history of Lamarckian epigenetics, which are broadly concerned with the passing on of traits acquired during an organism’s lifetime. Narrating the video, Foster acknowledges that the theory may have been discredited when it comes to genetics but says it provides a useful metaphor for user data. (The title is an homage to Richard Dawkins’ 1976 book The Selfish Gene.) The way we use our phones creates “a constantly evolving representation of who we are,” which Foster terms a “ledger,” positing that these data profiles could be built up, used to modify behaviors, and transferred from one user to another:

“User-centered design principles have dominated the world of computing for many decades, but what if we looked at things a little differently? What if the ledger could be given a volition or purpose rather than simply acting as a historical reference? What if we focused on creating a richer ledger by introducing more sources of information? What if we thought of ourselves not as the owners of this information, but as custodians, transient carriers, or caretakers?”

The so-called ledger of our device use — the data on our “actions, decisions, preferences, movement, and relationships” — is something that could conceivably be passed on to other users much as genetic information is passed on through the generations, Foster says.

Resolutions by Google, the concept for a system-wide setting that lets users pick a broad goal and then directs their everyday actions toward it.

Building on the ledger idea, the middle section of the video presents a conceptual Resolutions by Google system, in which Google prompts users to select a life goal and then guides them toward it in every interaction they have with their phone. The examples, which would “reflect Google’s values as an organization,” include urging you to try a more environmentally friendly option when hailing an Uber or directing you to buy locally grown produce from Safeway.

An example of a Google Resolution superimposing itself atop a grocery store’s shopping app, suggesting a choice that aligns with the user’s expressed goal.

Of course, the concept is premised on Google having access to a huge amount of user data and decisions. Privacy concerns or potential negative externalities are never mentioned in the video. The ledger’s demand for ever more data might be the most unnerving aspect of the presentation.

Foster envisions a future where “the notion of a goal-driven ledger becomes more palatable” and “suggestions may be converted not by the user but by the ledger itself.” This is where the Black Mirror undertones come to the fore, with the ledger actively seeking to fill gaps in its knowledge and even selecting data-harvesting products to buy that it thinks may appeal to the user. The example given in the video is a bathroom scale because the ledger doesn’t yet know how much its user weighs. The video then takes a further turn toward anxiety-inducing sci-fi, imagining that the ledger may become so astute as to propose and 3D-print its own designs. Welcome home, Dave, I built you a scale.

A conceptual cloud processing node analyzing user information and identifying a missing data point; in this case, the user’s weight.

Foster’s vision of the ledger goes beyond a tool for self-improvement. The system would be able to “plug gaps in its knowledge and refine its model of human behavior” — not just your particular behavior or mine, but that of the entire human species. “By thinking of user data as multigenerational,” explains Foster, “it becomes possible for emerging users to benefit from the preceding generation’s behaviors and decisions.” Foster imagines mining the database of human behavior for patterns, “sequencing” it like the human genome, and making “increasingly accurate predictions about decisions and future behaviours.”

“As cycles of collection and comparison extend,” concludes Foster, “it may be possible to develop a species-level understanding of complex issues such as depression, health, and poverty.”

A central tenet of the ledger is the accumulation of as much data as possible, with the hope that at some point, it will yield insights about major global problems.

Granted, Foster’s job is to lead design at X, Google’s “moonshot factory” with inherently futuristic goals, and the ledger concept borders on science fiction — but it aligns almost perfectly with attitudes expressed in Google’s existing products. Google Photos already presumes to know what you’ll consider life highlights, proposing entire albums on the basis of its AI interpretations. Google Maps and the Google Assistant both make suggestions based on information they have about your usual location and habits. The trend with all of these services has been toward greater inquisitiveness and assertiveness on Google’s part. Even email compositions are being automated in Gmail.

At a time when the ethics of new technology and AI are entering the broader public discourse, Google continues to be caught unawares by the potential ethical implications and downsides of its products, as seen most recently with its demonstration of the Duplex voice-calling AI at I/O. The outcry over Duplex’s potential to deceive prompted Google to add the promise that its AI will always self-identify as such when calling unsuspecting service workers.

The Selfish Ledger positions Google as the solver of the world’s most intractable problems, fueled by a distressingly intimate degree of personal information from every user and an ease with guiding the behavior of entire populations. There’s nothing to suggest that this is anything more than a thought exercise inside Google, initiated by an influential executive. But it does provide an illuminating insight into the types of conversations going on within the company that is already the world’s most prolific personal data collector.

Update: Nick Foster’s title has been updated to include the Near Future Laboratory and X’s response has been moved.