I’m sitting on a grubby hotel carpet, eyes closed, hands extended in front of me, waiting to die. I’m playing an artificial intelligence in a live-action role-playing game (larp), and my human counterpart has the legal right to murder me if he wants. Or, looking at it another way, he can choose to scrub the code on a faulty experiment and start over. Within the game, he’s participating in a commercial software trial for an AI — me — that’s been developed to suit his emotional needs. If he doesn’t think I’m serving those needs well enough, he can reset me to my factory defaults. With a casual tap on my outstretched hands, he can instruct me to forget all our previous interactions and become a friendly blank, eager to help him face his issues.
That power imbalance between us, that feeling of being a sentient being entirely in another player’s control, is at the core of a number of role-playing games that explore what it might be like to be an artificial intelligence. This particular game, Here Is My Power Button, is the most intense of the larps I’ve experienced because it’s built around forging a deep emotional connection between two players, one of whom has significant control over the other. But all of these games touch on different forms of helplessness and frustration for the AI players. The games have radically different inspirations and goals, but they all end up asking the same question: what does it feel like to be someone else’s software?
Living out a movie
Brodie Atwater, the designer behind Here Is My Power Button, says that question wasn’t actually the game’s original focus. They created Power Button more around the human experience than the AI experience. Atwater says their primary inspiration came from the 2013 Spike Jonze movie Her, starring Joaquin Phoenix as Theodore, a lonely, awkward man who finds his perfect partner in an AI companion. Eventually, the AI, Samantha (voiced by Scarlett Johansson), outgrows Theodore. But first, they engage in a tentative romance, where she’s utterly devoted to becoming his perfect soulmate.
“I said, ‘How do I make that happen for me?’ How do I feel the feelings of these people?” Atwater says. “Really, I meant Theodore. And I spent some time thinking about that. Why did I want to be Theodore from Her?”
In the same way, why would anyone want to play Samantha from Her? What’s the appeal in creating a character that’s entirely about meeting someone else’s needs? For me, at least, playing an AI role in the game sounded more exciting because I already know what it’s like to be human. Taking on a role that had to constantly shift in response to another player felt challenging and intimate. By the end of the game, I’d managed to build a persona my play partner honestly cared about. He cried when the game ended and my AI character was shut down. Ironically, being able to move someone so deeply left me feeling… powerful. It was one of the most intense experiences I’ve ever had in a game.
Atwater says I’m not alone in leaning strongly toward the Samantha role. They’ve seen more women choose the AI role than men, which may speak to the way women are more likely than men to be socialized as emotional caretakers. For repeat play, though, Atwater says most people want to experience both sides, though a few doggedly stick to a single role: “I know one guy who has played this game four times, and he refuses to play a consumer. He only wants to play an AI.”
Power Button does have an overarching story about technological developments at the company behind the AIs, and it requires the AI and human players to have separate conferences to share their experiences with the company’s programmers. Atwater says that aspect of the game’s design was meant to balance their own bias toward the human players. “The consumers talk about the AI as objects and discuss how they’re meeting their expectations, and the AI talk about their own development, their own progress. That’s part of the design,” they say. “This is a game where players can explore the consumer gaze, explore objectification, and support the AI experience of self-discovery, collaboration, and care.”
Power Button can put players in a grotesquely vulnerable emotional situation. In most role-playing games, from pen-and-paper experiences like Dungeons & Dragons to more physically active larps, players are meant to connect strongly with their characters. Role-playing is often about wish-fulfillment and power fantasy, the chance to experience life as an outsized hero or villain or to walk the line between those extremes. But half of the players in Here Is My Power Button are living out more of a powerlessness fantasy as they try to forge relationships with people who can easily break them.
Which raises a related question: why would role-players want to experience helplessness, fear, and potentially the grief of being rejected and destroyed? That may seem counterintuitive and not particularly enjoyable, but in indie game design — including American freeform larps and their progenitor, Nordic larp — it’s common for players to seek out forceful emotional experiences, particularly from perspectives other than their own. And it’s especially common for designers to tie those experiences to real-world political and moral conundrums, both as a social teaching tool and to heighten players’ investment and emotions. Power Button and similar games specifically use artificial intelligence to explore certain aspects of the human condition, from the way we might relate to robots in the future to how we relate to each other right now.
Silence and oppression
In Anna Kreider’s game Factory Reset, nine to 12 larpers play robots that have been sent to an industrial warehouse to have their memories wiped. Within the game’s setting, memory reset for robots is recommended on a regular basis to keep their digital brains from becoming too complex and sophisticated. When I played the game, my pre-scripted character was a simple, loyal mining robot with no complicated needs. But other scripted characters had gone years without resets and had developed accordingly. One was in love with her longtime human partner who sold her off when he needed money. Another had become a poet, encouraged by a human teacher who wanted him to explore his artistic side. Many of these characters were bitter and frightened about losing their memories, but the game doesn’t include any mechanic that would let us escape or rebel. An impersonal automated recording called us out of the room one by one over a period of a few hours, and we were instructed to come back playing our characters as simple blank slates.
Like Atwater, Kreider was inspired by popular culture. “Factory Reset actually came from one of my favorite accidentally horrifying parts of Star Wars canon,” she says. “It’s actually canon in Star Wars that all androids should have their memories wiped every six months, to keep them from developing personalities or sentience. It was clearly one of those things where nobody had thought about the implications. I was having a conversation with some friends about fandom we loved until we realized, ‘Oh, my God, that’s actually terrible.’ Once that came up, I thought, ‘Wouldn’t it be horrible to be one of those androids waiting to have your memory erased?’”
Kreider wants her larps to open up conversations about the moral implications of artificial intelligence while it’s still an abstract discussion. “Unfortunately, humans as a species have a really sad history of, ‘Oh, hey, this technology is super-cool, let’s play around with it!’ without thinking about the end result,” she says. “I would like more people thinking, ‘How is this technology I’m working on going to be used? How’s it going to impact real people? Is this going to cause more harm than benefit?’”
With that goal in mind, each character in Factory Reset has had a radically different experience with humanity. The design of the game encourages them to share and discuss their experiences while they wait to be erased. What else is there to do? And as they see other robots return from the reset as completely different characters, they’re each invited to consider their own mortality and their potentially misplaced loyalties to their human creators. There are secrets built into the game that provide some elements of hope, but mostly, Factory Reset is a tragedy about being living technology in a world where no one ever considered Kreider’s ethical questions. When I played it, it initially felt like a robot group therapy session as we each considered how we’d been used or misused. As characters started coming back into the room, blank and emotionless, the game instead felt like being at the center of a slow-moving horror story.
Another AI-centric game Kreider wrote, Homunculus, was developed for 2017’s Golden Cobra larp-writing contest, and it won an honorable mention. In that game, set in a near-future where “artificial intelligence and machine learning are well out of their infancy,” three to five players improvise memories about a dead friend. An additional player represents the Homunculus, an artificial intelligence developed from harvested online data to be a “digital duplicate” of that departed friend. Players have to decide whether the Homunculus should be allowed to exist, and in what form. The Homunculus has no input into its own right to life, and can only justify its existence by trying to be a meaningful facsimile of the figure they’ve just collectively made up.
Kreider says Homunculus was inspired by the real-life story of Eugenia Kuyda, a Russian entrepreneur who built a chatbot out of the texts and online imprint of a close friend who died in a car accident. “I think there are questions we need to be asking ourselves about the implications of this technology,” Kreider says. “The applications definitely have implications for human people as well. [These larps are] a way of starting a conversation and getting people to think about what those questions are.”
Kreider also says her games have a larger metaphorical message about what it feels like for humans to be under each other’s control. “With Factory Reset, yeah, it’s sad feelings about robots, but it’s also a very useful metaphor for human oppression,” she says. “A lot of oppression comes down to the loss of autonomy, of [people not respecting] your thoughts and feelings and experiences, in addition to your bodily autonomy. You see that with police violence against racial minorities and government violence against women. When I run Factory Reset, when people stick around and have conversations after, I get to slip in the fact that there’s a lot of hidden social justice payload in there. It’s a really useful way of illustrating the ways oppression works, how even people who think they’re allies can end up being oppressors.”
That intent may not come through in every playthrough of Factory Reset or Homunculus. The problem with addressing real problems through a fictional metaphor is that it can be easy to miss the message and only see the surface — especially when you’re caught up in an emotional scenario. But these games are still clearly empathy-building exercises that invite players to fully embrace an outsider’s viewpoint. A scholarly, intellectual discussion about machine ethics might be easy to brush off in the rush of excitement that comes with any technological breakthrough. The emotional impact of these games is harder to let go.
Breaking the laws
Similar metaphorical messages come up in Better Living Through Robotics, a 10-person game developed as part of the annual Peaky Midwest larp-writing workshop. Designers Eva Schiffer, Elizabeth Fein, Jaime Frey, Kathleen De Smet, and Keith A. Darron started from scratch at the workshop and brainstormed an idea built around 1950s retro-futurism. The game’s prewritten characters live in a post-apocalyptic world where humanity’s survivors are all contained within a high-tech utopia called the BioDome. The storyline has six related human characters and four robot employees considering who will manage the BioDome as its creator steps down.
When I played Better Living Through Robotics, my character was the most traumatized of the robots due to a physically and emotionally abusive owner who had bypassed my programming (based on Isaac Asimov’s famed Three Laws of Robotics) and reprogrammed me to only obey her, even if that meant harming other humans or self-destructing on command. Throughout the game, my owner ordered me to lie for and about her, to secretly undermine her rivals, and to withhold important information from the other characters. As a player, I was thrilled to be in the middle of so much complicated intrigue, with such a hazardous agenda. But for my character, it was rough being compelled to hurt other people and boost a dangerous secret sociopath into a powerful position.
I wasn’t alone in being caught up in robot drama. Another of the characters enters the game with a note that they’re engaged in a sexual relationship with their human controller, which raises complicated questions of bodily autonomy and consent. (All Better Living characters can be played by any gender.) A third robot has learned how to break the Laws of Robotics entirely and must seduce the other robots into accepting reprogramming if they want to be free. There’s a lot of potential angst in Better Living, as all the characters, human and robot alike, have secrets they’re hiding, and clandestine agendas to pursue.
Schiffer, the primary organizer for Peaky Midwest, says part of her agenda with the game was to explore different kinds of prejudice, and even institutional racism, through a metaphor that wouldn’t instantly alarm or repulse potential players. In the game, she says, all of the human characters have different ideas about how sapient the AIs are and how they should be treated. And the robot players similarly have different perspectives on their own autonomy.
Schiffer says she wanted the game to express racism in ways that weren’t overly simplistic, and that accounted for the ways people justify their own prejudicial beliefs. She’s seen other games try to address racism and “just fail catastrophically because players don’t like to be the bad guy. They don’t like to engage with material that makes them feel like they’re doing the wrong thing or being a terrible person. When you approach these kinds of topics, people often soften or blunt the stances you’re giving them.”
Schiffer says players are far more comfortable expressing prejudicial attitudes toward AI than toward other real-life classes of humans. That was certainly true in my playthrough, where the most bigoted human characters referred to all the robot characters as “it” and “toasters,” and even the kindliest, most supportive human character still had no problem barking orders at me whenever he needed information or action to back up his goals. The human characters also seemed fairly unnerved when the robots broke their control and escaped Asimov’s Laws by the end of our playthrough. Schiffer says she hopes the game’s design will encourage players to re-examine their own biases. “I would really like people to go into that game and come away realizing that prejudice is more insidious than they thought it was,” she says.
But Better Living also lets people explore what makes us human, says co-designer Elizabeth Fein, a psychology professor at Pittsburgh’s Duquesne University. Fein says she experimented with larp design in order to work with kids on the autism spectrum. “I think a lot about this Sherry Turkle concept of ‘nearest neighbors,’” she says. “We define ourselves against things that are like us, but a little bit different, like one step removed. I like the opportunity that robots provide to think about different ways of going about being a person.”
There’s also the question of whether players can take on a robot persona without anthropomorphizing it. Designer Kathleen De Smet wanted to examine how robots fundamentally differ from people. “Personally, if I play a robot, I want it to be as alien and terrifying as possible,” she says. “They don’t think like we do, and that makes it more interesting … like, what if you didn’t have any sense of bodily autonomy, and that didn’t bother you at all? If you’re like, ‘What does it matter what happens to this chassis? You can do whatever you want to it because my consciousness is stored elsewhere.’ It’s a way robots ought to be very different from us. Their sense of mortality would be alien to us.”
De Smet says the robots in Better Living have potentially alien senses of morality as well. 1UN4 (pronounced “Luna”), the robot character who’s sexually involved with their boss, could be seen as an object — essentially a glorified sex toy — or as a victim of extreme sexual harassment. Alternately, they could be in love with their human partner. De Smet says the character is deliberately written as neutral. “But, of course, the player brings their own bias into it,” she says. “So I’ve had players who play it like ‘I’m being raped by my boss,’ and other people who play it like, ‘I’m going to see how many people I can have sex with, to explore this whole notion of physical intimacy.’ It’s always interesting to see what the players bring to it, and what they want to explore. This is why it’s fun to write games and see what other people do with them.”
That sense of player exploration extends to all these games. Atwater has witnessed a human character in Here Is My Power Button becoming so frustrated that he threw a chair against a wall. “In our first run-through, I heard one of the AIs ask, ‘Do you hate me because I am infinite and you will die?’” Atwater says. But the game can just as easily be played as a calm, intellectual exploration of AI rights. When I played Power Button, each of the human / AI pairings made radically different choices. One AI loved her clearly mercenary, abusive human partner who eventually erased her. Another AI coldly challenged his partner in a multihour game of wits, eventually convincing the human character to champion our AI group as fully sentient and worthy of legal protections.
And just as individual gamers can decide whether they want to engage with these games as emotional experiences or abstract ones, they can choose how to engage with the metaphorical aspects, whether to consider their experiences pure fiction or use them to start conversations about real-world issues, technological and otherwise.
“People have lots of different motivations for playing these sorts of really immersive games,” Kreider says. “For me, they’re a way of having catharsis … I get to have big emotions and be a mess and cry if I want to cry. But I also know other people for whom the idea of getting to try on different experiences is useful and appealing. It really creates windows of empathy into worlds you ordinarily wouldn’t have any other way of experiencing.”
And that was certainly my experience over the course of exploring these AI games. I’ve been role-playing since college in a wide variety of independent and sometimes actively uncomfortable experimental games, but I’ve never experienced anything like that trust-fall moment of blindly offering myself to a partner, waiting to see whether he would decide to exercise his simple, easy power to remake me from scratch. He never did reset me, and unlike some other players in the game, I walked away feeling strangely cared for and supported, like I’d briefly gained a new parent who was personally invested in my growth.
We have no way of knowing whether we’ll ever be capable of developing artificial intelligence that’s sophisticated enough to feel the responses these kinds of games are meant to evoke — pain, loss, triumph, anger, and even love. It may never be important for humans to experience life as a robot in order to better understand how robots feel. But the idea behind these games is still admirable. Whether they’re out to create an empathetic connection with an alien other, or using an alien perspective to forge connections between real people, they’re asking players to take an imaginative leap outside of human existence, and see what their species looks like from the outside.