How the minds behind Rez and Death Stranding turned virtual worlds into physical spaces

At Virtual Realms: Videogames Transformed, game designers tackle a new realm

“Together the distance between (us)” by thatgamecompany and FIELD.IO.
Photo: Marina Bay Sands

When you build a procedural tree of life, there are going to be branches. This was a learning process for the Irish artist David OReilly, whose existential simulation games started making their way into galleries around the world. Everything, which was long-listed for an Academy Award, was launched at the San Francisco Museum of Modern Art. The game allows players to control, bond with, and shift between objects at various scales, from subatomic particles to land animals and planets.

“To make Everything, it involved making this incredibly detailed library of objects,” OReilly says. “And so all the interaction was done by having a really big catalog of things, along with information about their relationship with other objects: their color variation, their size, their sound effects, how they move, and so on.”

This came in handy when OReilly was asked to translate his virtual worlds into physical art installations. Eye of the Dream is an adaptation of this remarkable library of metadata: an audio-reactive, kaleidoscopic film for 360-degree domes and planetaria. OReilly calls it an Everything branch because of how it grew out of the game’s development repository.

Initially, Eye of the Dream was engineered to be more interactive, with the simulation parameters controlled by a website on participants’ smartphones. “But once it was put in practice, it was total chaos,” OReilly says. “You have this very harmonious simulation experience with all of this beautiful music, but when you have 100 people with controllers on their phones in the darkness, you get a lot of people going back and forth from zero to 100 over and over again to figure out what their level of control is.”

As a result, Eye of the Dream toured Europe and America as a recording of this algorithm, a “ballet choreographed by math” to which observers could surrender. The interactive infrastructure was left behind, but it wasn’t long before another branch sprouted from the Everything tree. Eye, a collaboration between OReilly and media designers onedotzero, is currently on display at Singapore’s ArtScience Museum as part of the Virtual Realms: Videogames Transformed exhibition.

Virtual Realms is a bold opportunity for video game developers to transpose their virtual worlds into physical space, confronting new challenges and exploring the medium’s boundaries. As well as gazing into OReilly’s Eye, attendees will be able to experience installations from Kojima Productions (Death Stranding), Enhance (Tetris Effect), thatgamecompany (Journey), Media Molecule (LittleBigPlanet), and Tequila Works (Rime). 

Eye by David OReilly and onedotzero.
Photo: Marina Bay Sands

Instead of 100 people fighting the algorithm from their phone screens, OReilly’s Eye offers three oversized discs that participants can spin to scale and warp a mandala of assets like flowers, shells, and cars. Music from the London Symphony Orchestra reacts to the simulation, with non-interactive participants lounging on spongy seats, basking in its infinity.

OReilly has long been inspired by the abstract animator Oskar Fischinger, who synchronized films to jazz and classical music in the early 20th century. “This is very much taking the game engine and the codebase but interacting with it more like you’d interact with a musical instrument,” OReilly says.

“The game is spinning out from the digital world.”

Virtual Realms was curated by Enhance’s Tetsuya Mizuguchi, whom OReilly calls “the master of audiovisual interactive experiences.” Mizuguchi has pushed the medium forward throughout his career by attempting to summon feelings of synesthesia in players. In his 2001 rail shooter Rez, Mizuguchi managed this by combining abstract visuals, eclectic dance music, and strong haptics.

Mizuguchi’s pedigree meant that reaching out to potential collaborators was an easy process. Working with London performing arts center The Barbican, he made a list of designers and artists with ambitions beyond the broad strokes of video games.

The response was immediate; as soon as they heard the concept, developers were attracted to the unique challenge of turning their virtual visions into something that people could experience in meatspace. “The game is spinning out from the digital world,” Mizuguchi says. “This is a very deep architecture, one that is unlimited, and we are growing as technology evolves.”

Rezonance by Enhance and Rhizomatiks.
Photo: Marina Bay Sands

Enhance’s contribution to Virtual Realms is a collaboration with media artists Rhizomatiks called Rezonance, which Mizuguchi describes as “an experiment to crystallize the game mechanism of Rez into a real spatial experience.” Four participants enter a dark room and wield haptic spheres. When they move around the space and interact with others, music and visuals react in real time, building a unique synesthetic performance. “It’s the same mechanism as a game, but with much more freedom,” Mizuguchi says. 

For Enhance, it was about stimulating as many senses as possible. “We are trying to make a new kind of emotional chemistry that is not only emotional but emotional and physical,” Mizuguchi says. “It’s a new kind of storytelling, a synesthetic experiential narrative … where the visuals and sound become borderless.”

Ayahiko Sato of Rhizomatiks says that it took countless adjustments to get the audio and visuals in tune with each other to create “sounds which can be seen and visuals that can be heard and felt.” Mizuguchi hopes that Rezonance will move people in a way that they can’t explain, inviting them into the world of video games by showing them what the medium is truly capable of.

“We are trying to make a new kind of emotional chemistry.”

“Everything is about a 90-minute to three-hour experience in general, or maybe five or six hours if you want to unlock everything,” OReilly says. But when designing physical adaptations, the conventional rules of video games have to change. “In a gallery, you don’t know when people will enter and leave, so you adapt the systems for that and unlock more things, as there’s no start or endpoint. You let go of control,” OReilly says.

Anyone who visits Virtual Realms during its proposed world tour will experience their own version of Eye or Rezonance due to how the installations have been designed. “I think that ephemeral quality makes each moment unique to what you’re able to see of it,” OReilly says. “It’s not something you can do with other forms.”

Without the comfort of conventional controllers, installations present a challenge for exhibitors, but the breathing room and immersion that physical space provides can be liberating. “I had spent, you know, the better part of 15 years, releasing everything online, with things being part of people’s internet experience,” OReilly says. “But I felt like that hit a certain limit, a saturation point, where there’s just too many things going on, and it’s very difficult to create this kind of experience I want when it might be one tab away from social media or the news or whatever, so I really love the idea of doing it in a physical space.”

Wall by Kojima Productions and The Mill.
Photo: Marina Bay Sands

“I completely took the entire roof off, but if it had rained, my house would be ruined.” John Beech is tilting his webcam to show me the makeshift ceiling above his head. A former construction worker turned lead designer at Media Molecule, his past experience was a boon when developing the studio’s Virtual Realms installation, Dream Shaping.

“The knowledge of what I knew as a builder actually really helped, especially with the early development of Dream Shaping when we hooked up with Marshmallow Laser Feast, who are very much in that physical realm,” Beech explains. “I could be that bridge between them and the artists and audio people from Media Molecule, both with my sense of physicality and my experience making stuff in Dreams.” 

Marshmallow Laser Feast is an experimental design collective based in London known for crafting exhibits that blend cutting-edge technology with physical space. Accordingly, Dream Shaping takes place in a dark room and has participants donning helmets and waving giant soft play shapes in front of an interactive screen. 

“Nothing was right, and nothing was wrong.”

Robin McNicholas of Marshmallow Laser Feast called it “exploding the game controller and distributing it in space.” The combined vision for Dream Shaping was one of regression. Beech says Media Molecule wanted to break down the structural barriers that form with age and inhibit the childlike wonder of play. “You would go into a soft play amusement area, and nothing was right, and nothing was wrong,” Beech says. “You picked up some shapes; you fiddled around here; there was no sort of structure to it.”

The Dream Shaping props are embedded with trackers which interface with a virtual environment that Media Molecule built entirely in Dreams, the studio’s latest commercial release, a PlayStation-based game creation engine. “It really does just run off a PS4; there isn’t any sort of cheating in that sense,” Beech says. All of the development took place using DualShock 4s and PlayStation Move controllers. The only external tweak was allowing Dreams to interpret the tracking coordinates on the soft play props, but even then, the data was fed to an in-game gadget.

The development of Dream Shaping raises an interesting discussion about the democratization of interactive art. Beech says that Dreams users and indie game developers could easily make their own art installations out of the box, pointing to how others have interpreted the engine for their own means. “One of our audio coders, Bogdan [Vera], he took Dreams to something called Algorave, which is like a cool generative rave synth party scene,” Beech says. “And he did it with the music tools in Dreams!” 

Media Molecule’s intake of motion capture data for Dream Shaping is something that isn’t possible in the public version of Dreams, but Beech says the studio is always looking for ways to expand the engine’s digital toolset.

“Dreams is definitely not a finished project,” Beech says. “We will keep adding cool features that will hopefully facilitate ways to use Dreams in more and more ways that people don’t expect … we’re hoping people won’t have to necessarily hack it.”

Dream Shaping by Media Molecule and Marshmallow Laser Feast.
Photo: Marina Bay Sands

The COVID-19 pandemic also caused serious problems for the Virtual Realms exhibit. Mizuguchi says that after initial meetings in Tokyo to discuss the idea, he couldn’t regroup with local collaborators like Hideo Kojima due to the impact of the virus.

Testing ambitious installations in quarantine also demanded creative solutions. Enhance and Rhizomatiks simulated the Rezonance installation space in virtual reality, all the way down to the position of its many speakers. The two teams used Oculus Quest 2 headsets to walk through the virtual venue and fine-tune audio spatiality and visuals, though they naturally had to compromise when it came to testing high-end haptics.

“I’m excited to have something that will bring people physically together.”

Media Molecule’s work on Dream Shaping started before the pandemic, but the project faced its own challenges due to remote collaboration. “We were doing it pre-COVID, but it may as well have been [during the pandemic] because we were in two separate offices,” Beech says. “Marshmallow Laser Feast ran around with the shapes doing random stuff, but not knowing what they were interacting with, and they sent back that recorded data which we could then play as like a replay that informed Dreams what to do. So you would see if it was technically working, but you wouldn’t be able to see if, in the game scenario, people would respond to the interaction.”

Unable to attend in person, many developers will have to wait for restrictions to lift or until the exhibition tours their part of the world. Meanwhile, they just have to trust that visitors, game-savvy or not, will respond to their unique installations.

OReilly’s earlier work in glitch art has steeled him against any potential errors. “In general, with my work, you acknowledge that it has a digital origin, so you know parts of these errors are part of the experience. But at the same time, you want the right errors; you want the right kind of accidents,” OReilly says. “We go and see musicians live because on some level, there’s a potential that an accident will happen and that it won’t be exactly like the album you’ve been listening to, and it’s the same here.”

As the world starts to tentatively open up, social experiences like Virtual Realms provide a way for people to reconnect after spending so much time in lockdown, which can be profound for creators, too. “A lot of this comes out of concentrated periods of time between me and the computer,” OReilly says. “And even though this project was set back by all of this, I’m excited to have something that will bring people physically together.”

“This is a young medium that’s totally undeniable and an absolutely natural evolution of all artistic forms that have come before it,” OReilly says. “And it can produce emotional effects that nothing else can do. I don’t particularly care how the intellectuals parse these things, but I know that people will come away from [Virtual Realms] with a feeling that they’ve never had from any other kind of art form before, and that’s its power.”