This game is a movie: 'New Cinema' explores the altered future of film

Artists and directors conspire in the space between film and video games

"In Opsis" - Anton Marini, Golan Levin, Brian Chasalow (courtesy of the artists)
"In Opsis" - Anton Marini, Golan Levin, Brian Chasalow (courtesy of the artists)

Peter Jackson's high-frame-rate Hobbit and the second coming of 3D have probably been the most publicized attempts to revive the stagnating movie industry. But elsewhere, the future of cinema seems to be converging on something more closely resembling a video game, as artists, hackers, and filmmakers process the language of film through the lens of code.

Last week, an experimental program in New York City called New Cinema shared a tiny glimpse of what that might look like. The works in progress are the results of a hackathon held last December, organized by The Creators Project and NYC's Eyebeam art and technology lab in collaboration with Framestore, the Oscar-winning visual effects studio behind films like Children of Men and Where the Wild Things Are. And each one, however small, seems to hint at some larger idea about how interactive software could dramatically alter, maybe even outright replace, cinematic storytelling as we currently know it.

It’s true that the idea of interactive cinema has been around for a while: Radúz Činčera’s Kinoautomat famously debuted at the Montreal World’s Fair in 1967, if you want to go all the way back. So why "New Cinema"? Participating artist Brian Chasalow admits that, like "New Aesthetic," it's one of those regrettably pretentious placeholders used for lack of a better term.

"If you paint, you're a painter. If you program and do art, you're a New Media Artist."

"Calling something "new" adds a certain valuation or quality to whatever word comes next," he lamented to a crowd at Eyebeam last week. "It's a difficult time for artists who are in this field because they don’t know what to call themselves. If you paint, you're a painter. If you program and do art, you're a New Media Artist. We put ourselves in the category because it's easier — because we have to — to make other people understand what it is that we do."

Chasalow and collaborator Anton Marini were showing off what you might describe as the software equivalent of a cinematic close-up. Conceived with the help of Another Earth director Mike Cahill, "In Opsis" renders an interactive landscape from footage of a subject's eye, captured at 3072 x 1307 resolution and around 200 to 300 frames per second. The footage is run through software written by media artist Golan Levin, which scans for unique features in each iris (identified by tall white lines); the results are then brought into the Unity game engine to build a terrain map that can be navigated with mouse gestures.
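Levin's feature-detection software isn't detailed in the piece, but the pipeline described (find distinctive points in an iris image, then turn them into terrain data a game engine can use) can be roughly sketched. The Python snippet below, using OpenCV and NumPy, is an illustrative approximation only; the function, parameters, and mapping are assumptions, not the project's actual code.

```python
import cv2
import numpy as np

def iris_to_heightmap(image_path, grid_size=256):
    """Build a grid_size x grid_size heightmap from distinctive iris features."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    if gray is None:
        raise FileNotFoundError(image_path)

    # Detect corner-like features, standing in for the "unique features"
    # the real software identifies in each iris.
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                      qualityLevel=0.01, minDistance=5)
    heightmap = np.zeros((grid_size, grid_size), dtype=np.float32)
    if corners is None:
        return heightmap

    h, w = gray.shape
    for x, y in corners.reshape(-1, 2):
        # Raise the terrain at the grid cell nearest each feature point.
        gx = min(int(x / w * grid_size), grid_size - 1)
        gy = min(int(y / h * grid_size), grid_size - 1)
        heightmap[gy, gx] += 1.0

    # Blur so isolated spikes become navigable, rolling terrain.
    heightmap = cv2.GaussianBlur(heightmap, (15, 15), 0)
    return heightmap / max(float(heightmap.max()), 1e-6)
```

In a Unity-based build like the one described, an array of this kind could feed a terrain's height data; how that last step works in "In Opsis" is, again, assumed rather than documented here.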

"We Make The Weather," a sound-reactive piece by Sofy Yuditsky, Karolina Sobecka, and Greg Borenstein, portrays a scene with a grayscale CG model of a woman walking across a bridge, where the size of the bridge and the movement of the surrounding water are controlled by breathing into a microphone. The group says it was partly inspired by the events of Hurricane Sandy, which nearly destroyed the digital archive at Eyebeam just a few months prior.

A more lighthearted entry, "Face Dance" uses facial mapping to make participants' faces perform Michael Jackson dance moves: a kind of interactive music video headed up by Catfish directors Ariel Schulman and Henry Joost, along with Lauren McCarthy, Aaron Meyers, and James George, one of the artists behind Clouds, the interactive documentary shot on Kinect. The software has to be "trained" on each individual face before it can execute the moves, which the group says were motion-captured from a professional Michael Jackson impersonator.
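The group's "training" step isn't described in detail, but the facial-mapping side of a piece like this typically starts with per-frame face landmarks. As a purely illustrative sketch (not the project's code), here is how landmarks could be pulled from a single webcam frame; OpenCV and MediaPipe are assumed tool choices.

```python
import cv2
import mediapipe as mp

# Illustrative only: detect face landmarks in one captured frame.
# A piece like "Face Dance" would track these every frame and use them to
# drive the motion-captured dance data (that mapping is not shown here).
capture = cv2.VideoCapture(0)
ok, frame = capture.read()
capture.release()

if ok:
    with mp.solutions.face_mesh.FaceMesh(static_image_mode=True) as face_mesh:
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            landmarks = results.multi_face_landmarks[0].landmark
            print(f"Found {len(landmarks)} facial landmarks")
```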

Other projects focused on reworking familiar cinematic techniques, tropes, and ideas, such as exposition. "We struggled a lot with what is a video game and what is interactive cinema," said Ramsey Nasser, an Eyebeam resident who worked on "Before the Flood," a kind of on-rails shooter minus the shooting that treks through a computer-generated cave. As the camera slowly floats through the space, motion-tracking tech provided by Google (running on a "magical Google laptop" Nasser says they weren't allowed to touch) lets participants discover secret cave paintings (made by another Eyebeam resident, Nick Fox-Gieg) by moving their bodies inside a marked-off area to adjust the position of light sources.
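The Google tracking hardware isn't identified beyond Nasser's description, but the interaction itself (a body position inside a marked-off area steering a virtual light) reduces to a simple coordinate mapping. A minimal Python sketch follows; the area dimensions, cave bounds, and function names are assumptions, not the project's code.

```python
def body_to_light(body_x, body_y, area_width, area_depth,
                  cave_bounds=((-5.0, 5.0), (-3.0, 3.0))):
    """Map a body position (meters within the tracked area) to light coordinates."""
    (x_min, x_max), (z_min, z_max) = cave_bounds
    # Normalize the participant's position to the 0..1 range of the floor area.
    u = max(0.0, min(1.0, body_x / area_width))
    v = max(0.0, min(1.0, body_y / area_depth))
    # Interpolate into the virtual cave's coordinate range.
    return (x_min + u * (x_max - x_min),
            z_min + v * (z_max - z_min))

# Each frame, the renderer would reposition its light source, e.g.:
#     light_x, light_z = body_to_light(tracker.x, tracker.y, 3.0, 2.0)
```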

David Miller of Framestore explains: "Often in a film you're given a montage to show you what happened and why things have happened. So we started looking at the idea of what if the viewer was in control of the backstory; what if the viewer could discover things as they went through a narrative." Miller says much of this is new to him, since his work usually exists as a series of pre-rendered sequences; the project is a rare collaboration between traditional filmmaking and software storytelling.

There's been some dialogue between those two disciplines this week. At the DICE media summit, J.J. Abrams and Valve boss Gabe Newell talked about playing in each other's gardens, hinting at an upcoming Half-Life or Portal movie, as well as a game produced by Abrams' Bad Robot studio. Heavy Rain and Indigo Prophecy director David Cage suggested that an even closer collaboration between game designers and veteran filmmakers could give birth to another medium entirely.

"New Cinema" seems to exist somewhere within that grey area — even if no one knows exactly what it is yet.

EDIT: A previous version of this article stated that the environment in "In Opsis" was built using Quartz Composer. While Quartz Composer was used to prototype the project, the game environment was actually built within the Unity game engine.

'New Cinema' hackathon gallery (images courtesy of the artists)