I was wearing a VR headset, standing in the middle of a dark, gothic take on A Christmas Carol. A grisly Jacob Marley asked me what I missed most about my childhood, and I told him I missed the hope and optimism of youth, when it seemed like anything was possible. When the Spirit of Christmas Past subsequently visited me, he pointed out a pair of ghostly, shadowy children chasing each other a few feet away. The spirit leaned in close. “Look at them,” he whispered, “so full of hope and potential.” It’s an easy trick, drawing a response out of an audience member and using that response to personalize their experience. But it worked — my own words coming back to haunt me drove home the sad longing in that scene and made it specific to my own thoughts.
As the experience moved toward its conclusion, I wound up in a cemetery with the Spirit of Christmas-Yet-To-Come. My brain knew that the thin, slightly pixelated figure in a shadowy cloak was just a digital character in a VR headset. But when he put his hands on my shoulders, his touch was real and weighty. And as the experience ended, he physically turned me toward a gravestone with a name I immediately recognized…
In Los Angeles, a virtual reality experience called Chained: A Victorian Nightmare is combining VR with live actors and motion-capture technology to create a new kind of hybrid show. It pairs some of the same principles used in location-based mixed-reality pieces like The Void’s Star Wars: Secrets of the Empire with the work being done in interactive and immersive theater. In the process, it serves as a strong example of the added depth virtual experiences can have when augmented by the simple power of human interaction.
“This one really was a product of my growing love of immersive theater,” says creator and director Justin Denton. An experienced VR creator, Denton has worked on tie-ins for projects like Guillermo del Toro’s film Crimson Peak and the Syfy series Nightflyers as well as the Microsoft HoloLens augmented reality experience for Legion that ran at last year’s San Diego Comic-Con. That HoloLens project also incorporated live actors who interacted with participants, but with Chained, Denton wanted to blend live performance with VR’s ability to transport audiences to wildly different locations and environments.
Before beginning the 20-minute VR experience, I was shown into a dimly lit gallery with concept art lining the walls. From there, I was ushered into a small antechamber with a large door and a foreboding, old-school knocker. I knocked three times, and a young woman answered (actress Haylee Nichele, formerly of the New York immersive shows Sleep No More and Then She Fell). She whisked me into another small room, and as she hurriedly spoke, it became obvious that while she appeared to be a living, breathing person, she was almost certainly playing a ghost. Sitting me down in front of a mirror, she gave me a fountain pen to sign my name inside a large ledger resting on the desk. She warned me that a journey of self-discovery was coming and handed me the VR headset.
Inside the headset, I saw a digital re-creation of the small room — and then a bony hand emerged from the mirror, like a VR nod to John Carpenter’s Prince of Darkness. I reached out and found a real hand in mine. Then, I was suddenly pulled through the virtual mirror and into a Victorian-era bedroom.
An actor in a motion-capture rig played Jacob Marley
Going through location-based VR experiences like those from Dreamscape Immersive and The Void has trained me over the past couple of years to buy into the physical reality of a seemingly digital space. If you see a wall while inside a Void experience, for example, odds are you can actually reach out and touch it. So being pulled through what had been a physical wall just moments before was unnerving.
The owner of the bony hand was none other than Marley himself. The digital character’s movements and gestures were being supplied in real-time by an actor (Michael Bates, from the LA-based immersive theater troupe The Speakeasy Society) wearing a motion-capture rig. But Bates wasn’t in some separate, standalone capture space. He was right there, playing the scene next to me, keying off my movements and vocal inflections. When Marley touched my arm, Bates actually touched my arm. When he walked up close to me, I could sense his physical presence, even while cocooned inside the headset. When the actor spoke, I heard his voice — not filtered through digital processing and piped into my ears via the headset, but because he was actually standing right next to me.
That seems like a simple enough idea, and projects like Jordan Tannahill’s Draw Me Close have already explored introducing physical contact with actors while in virtual environments. Oculus is working on a project that will let audiences interact with motion-captured actors from the comfort of their own homes. But Chained demonstrates how live performers can make virtual experiences more personalized than they would be if an audience member were just watching an automated digital character moving along programmed rails. The actors can change their performance, cadence, and approach based on participants’ behavior, Denton explains.
“Those magical moments when you’re in any immersive theater experience, when you’ve got a great performer there with you, and you get a bit of that ASMR thing going on where you’re starting to feel the tingles go up the back of your neck,” he says. “Taking the best of immersive theater in those intense one-on-one personal moments, and then also being able to take the best of VR where we can actually take you different places at the same time — combining those two was what I was trying to do.”
For the most part, Chained succeeds in doing exactly that. The show has the lone audience member taking over the Ebenezer Scrooge role in a story roughly following the structure of A Christmas Carol. The participant is invited to feel regret, longing, or fear as the story dictates. Each of the three Christmas spirits presents its own vignette, with Bates playing all four digital characters. They all have radically different appearances and styles of interaction, with their movements driven by a live motion-capture system not dissimilar to the ones used to create digital characters in blockbuster films.
“What we’re seeing in an interesting way in the visual effects space is that a lot of VFX houses have been experimenting with real-time [rendering],” explains Madison Wells Media Immersive executive producer Ethan Stearns, who previously worked on projects like Carne y Arena, the VR collaboration between ILMxLab and Birdman director Alejandro González Iñárritu. “ILM and Digital Domain and all these companies have been trying to work on, ‘How do we work game engines into our VFX pipeline?’”
On the film side, that’s resulted in a lot of tools that have made it easier for filmmakers to realize their respective visions: using VR for set design in Rogue One or letting the camera operator on Thor: Ragnarok see a live version of the Hulk in the camera viewfinder when lining up a shot of actor Mark Ruffalo.
A scripted experience with room for the audience to interact
Chained flips that idea. Behind the scenes, one computer rig is dedicated to handling the motion capture from the live actor and turning it into real-time character animation. Another renders the environment and the rest of the experience. Linked together, the two systems let the mo-cap actor interact with the participant in physical space while his movements and performance choices are simultaneously reflected in the digital character.
Combining a scripted show with on-the-fly moments of improvisation and customization allows the story to remain fixed, while still ensuring each participant’s individual experience will be unique. That approach also extends to the show’s pacing and structure. Rather than having the entire piece run on a timed loop, some individual scenes and transitions are triggered by an on-site stage manager, while others are activated by the way the participant handles certain props. When meeting the Spirit of Christmas Present, for example, I was handed an apple; placing that item on a table in the room triggered the next beat in the scene.
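That mix of fixed script and flexible triggers resembles a simple cue-driven state machine. The sketch below is purely illustrative — none of the beat or cue names come from the actual production, and the show’s real control software is not publicly documented — but it shows how a scripted sequence might hold its place until either a stage manager’s cue or a prop event fires:

```python
from dataclasses import dataclass


@dataclass
class SceneBeat:
    """One beat in the scripted show, advanced by a named cue."""
    name: str
    cue: str  # e.g. a stage-manager button press or a prop sensor event


class ShowController:
    """Steps through scripted beats only when the matching cue fires."""

    def __init__(self, beats):
        self.beats = beats
        self.index = 0  # position in the fixed script

    def fire(self, cue: str) -> bool:
        """Advance to the next beat if `cue` matches the current trigger."""
        if self.index < len(self.beats) and self.beats[self.index].cue == cue:
            self.index += 1
            return True
        return False  # wrong or early cue: the show simply holds its place


# Hypothetical beat list loosely modeled on the scenes described above.
show = ShowController([
    SceneBeat("christmas_present_intro", cue="stage_manager_go"),
    SceneBeat("apple_placed_on_table", cue="prop:apple_placed"),
    SceneBeat("graveyard_transition", cue="stage_manager_go"),
])
```

The point of the design is that the script never branches — it only waits — so every participant sees the same story beats, while the moment-to-moment timing bends around what they actually do in the room.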
But from an audience perspective, innovative approaches and technological solutions don’t matter if the resulting experience isn’t compelling. Chained does a strong job of conveying the potential in combining theater and VR. That said, as with any production that’s experimenting and pushing boundaries, some moments work better than others. At one point, I’m pretty sure I nearly stepped on Bates as he tried to secretly crawl away during a scene transition. At other points, it appeared a character was looking down at my chest rather than meeting my eyes — not the best way to foster a sense of connection. And in terms of visuals, the project didn’t quite reach the same level of polish as titles like the Neil Gaiman and Dave McKean adaptation Wolves in the Walls, or some of the projects from the now-defunct Oculus Story Studio.
But as a proof of concept, Chained nevertheless does what Denton set out to do. It makes a strong case that some of the most interesting virtual reality experiences today are the ones that incorporate elements beyond the headset and what happens inside it. And with plans to take Chained to different cities after it completes its Los Angeles run, Denton and Stearns say the project will only continue to iterate and improve as more audience members get a chance to explore it.
“In a lot of ways, this is sort of like our preview run,” Stearns explains. “We’ve done this in the lab, we’ve done a lot of workshopping, we’ve done a lot to prepare. But when you run lots and lots of audience members through, you really learn a lot, and we’ll make it better and more seamless and more comfortable for people throughout the process.”
Chained is currently scheduled to run in Los Angeles through January 6th, 2019. The next round of tickets will go on sale on Monday, December 10th.