
Electronic music has a performance problem, and this artist is trying to solve it



Photo: Ben Houdijk

While rehearsing at SXSW last month, Chagall van den Berg ran into an unusual issue: her digital knee shot out in the wrong direction. “My friend laughed and said, ‘Wow, that’s a problem no SXSW artist has ever had,’” van den Berg tells The Verge.

van den Berg is a musician who performs wearing motion-tracking gloves and a full-body suit covered in sensors, which, during this SXSW performance, control not only a projection of a digital avatar that appears behind her, but also nearly every instrument and effect in the music, as well as her voice. As she moves across the stage, her avatar, floating in space, moves in sync. When she stretches her arms above her head, grains of audio slow to a grind and stutter. Every hand and body movement has cause and effect, crafting a pop-infused dreamscape that’s mesmerizing to watch.

Wearing these sensors, van den Berg can bring in chords and melodies with a sweep of her hand, or distort video of herself in freakish ways by lifting an arm. Because every movement can create audio or visual changes, her performances are very physical, but in an elegant and deliberate way. “I can do all the movements and look like an air traffic controller,” van den Berg says. “That would work, but that’s not very performative. All the songs I perform have movements that are functional and also meaningful.”

van den Berg performs using motion-tracking gloves in 2017. Performance begins at 3:50.

Born in Amsterdam, Holland, van den Berg was obsessed with both music and computers from an early age, but had always viewed them as separate things. She played instruments in bands, was a singer / songwriter, and traditionally worked with other producers to create beats for her tracks. Then in 2011, she entered a “real” studio for the first time. Watching producers noodle around with her songs to make remixes, it clicked: “I can do this.” She taught herself music production and shortly after, in 2012, released her debut EP. “I immediately noticed how much freedom and independence that gave me,” van den Berg says. “I didn’t need other people producing my songs anymore. My musical expression became so much more direct because I could just make the sounds I had in my head instead of explaining them to another human being.”

“I’m kind of a guinea pig, but that’s fine with me.”

van den Berg had solved one problem for herself, but releasing music created another: how would she perform the songs she’d made? The nature of most electronic tracks meant she had two options — stand behind a table with a bunch of gear and knobs and faders, or play a backing track and sing on top. Neither was acceptable for her. “I had this choice,” says van den Berg. “Either it was going to be real and live, but boring to watch and distant from the audience, or I’d play a recording and be able to dance around onstage. Dancing around and being one with the audience was way more appealing, but the musician in me really didn’t like the idea of singing along to a track. So I had a dilemma.”

This dilemma van den Berg faced is a problem many DIY and electronic artists encounter — how do you incorporate movement and expressiveness when you essentially perform standing at a desk, using an interface the audience will likely never see? And then make it interesting? Acts like The Glitch Mob use electronic drum kits and hacked Microsoft Surfaces tilted toward the audience, while several startups, like Enhancia and Genki Instruments, are banking on MIDI controller rings. van den Berg’s initial solution was Mi.Mu, a pair of movement-tracking gloves created by musician Imogen Heap. Each glove has nine sensors and triggers that are completely customizable. Almost any movement can be assigned any musical parameter, so you could, for example, drop your arm down to add reverb, or pinch in the air to add chorus to your voice. van den Berg was so inspired by the Mi.Mu gloves, she wrote “Sappho Song” the day she received them.
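The mapping idea behind gear like the Mi.Mu gloves can be sketched in a few lines: a raw sensor reading (say, how far an arm has dropped) is normalized, clamped, and scaled into the 0–127 range of a MIDI control-change value that drives an effect. This is an illustrative assumption about how such a mapping might work, not Mi.Mu's actual API; the function name, mapping table, and CC numbers are hypothetical.

```python
# Illustrative sketch of glove-style gesture mapping (not Mi.Mu's actual API):
# a normalized sensor value becomes a MIDI control-change (CC) value, 0-127.

def gesture_to_cc(sensor_value: float, lo: float = 0.0, hi: float = 1.0) -> int:
    """Scale a raw sensor reading in [lo, hi] to a MIDI CC value in [0, 127]."""
    span = hi - lo
    normalized = (sensor_value - lo) / span if span else 0.0
    normalized = min(max(normalized, 0.0), 1.0)  # clamp out-of-range readings
    return round(normalized * 127)

# Hypothetical mapping table: which gesture drives which effect parameter.
MAPPINGS = {
    "drop_arm": ("reverb_send", 91),   # dropping the arm adds reverb
    "pinch":    ("chorus_depth", 93),  # pinching in the air adds chorus
}

# Dropping the arm halfway yields a mid-range reverb amount.
print(gesture_to_cc(0.5))  # 64
```

The clamp matters on stage: a sensor that momentarily reads outside its calibrated range should pin the effect at its limit rather than wrap or crash.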

That was in 2014, and it launched a quest for van den Berg to fluidly bridge the gap between her body movements onstage and control over audio and visual effects. While the gloves allowed van den Berg to control audio and video with her hands, she wanted to do more: how could she use her whole body instead of just her hands? The iteration she’s now working on combines the gloves with a motion-capture suit, the kind of thing generally used to record people’s movements for video games or films. For now, she has one song programmed to use all the tech she’s wearing, but the goal is to eventually expand to a full, hour-long show. That’s not an easy task. Along the way, she’s had to learn C++, find a company to lend her sensors (she doesn’t own the system, estimating the total cost at around $12,000), and continuously experiment with new combinations of existing technologies across platforms to tie it all together.

van den Berg’s setup appears sleek and minimal onstage at SXSW, but there’s a complex web of hardware and software needed to make it run. She wears the Mi.Mu gloves on her hands, and then puts on a custom bodysuit embedded with 15 of Xsens’ 3D motion-tracking sensors spanning her arms, shoulders, head, pelvis, legs, and feet. Off to the side are three computers — a Mac laptop running three apps for controlling audio playback and effects, a Mac mini that uses the “creative coding” tool openFrameworks to manage all the visuals controlled by the Mi.Mu gloves, and a Windows laptop that runs the suit’s sensors and connects them in real time to the game engine Unity using the wireless full-body VR platform VRee. There’s also additional hardware, like microphones and an audio interface. To make sure there aren’t any hiccups, the show runs on its own Wi-Fi network, which requires venues to turn off their house Wi-Fi or move it to a different frequency to prevent interference. It’s a lot. But the result — a realization of five years of work to connect body, voice, music, and video into one reactive thing — is incredibly exciting.
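The reason a dedicated Wi-Fi network matters is that a rig like this streams a steady flow of small, latency-sensitive sensor frames between machines. The article doesn't specify Xsens' or VRee's actual wire format; the packing scheme, field layout, port, and address below are assumptions, sketching how one frame from a 15-sensor suit might be serialized and sent over UDP.

```python
# Hypothetical sketch of streaming one motion-capture frame over UDP.
# The real Xsens/VRee protocol is not described in the article; this binary
# layout, port number, and address are illustrative assumptions only.
import socket
import struct

NUM_SENSORS = 15          # the suit described in the article has 15 sensors
FIELDS_PER_SENSOR = 4     # e.g. one orientation quaternion (w, x, y, z) each

def pack_frame(timestamp_ms: int, readings: list) -> bytes:
    """Pack a timestamp plus 15 x 4 floats into one fixed-size binary frame."""
    assert len(readings) == NUM_SENSORS * FIELDS_PER_SENSOR
    return struct.pack(f"<I{len(readings)}f", timestamp_ms, *readings)

def unpack_frame(frame: bytes):
    """Inverse of pack_frame: recover the timestamp and sensor readings."""
    n = NUM_SENSORS * FIELDS_PER_SENSOR
    values = struct.unpack(f"<I{n}f", frame)
    return values[0], list(values[1:])

# Fire one frame at the machine rendering the visuals (address illustrative).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
frame = pack_frame(1000, [0.0] * (NUM_SENSORS * FIELDS_PER_SENSOR))
sock.sendto(frame, ("127.0.0.1", 9000))
sock.close()
```

UDP is the usual choice for this kind of stream: a dropped frame is replaced milliseconds later by the next one, so retransmission would only add the latency the dedicated network exists to avoid.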

The whole process needs to get a lot simpler

The whole process needs to get a lot simpler if other electronic musicians are going to adopt it, van den Berg says. “[My setup] is not how we’re going to change the world of electronic music performance because no one’s going to spend all of this money and then also be super bothered about IP addresses and networking during soundcheck,” she says. “Who wants to tour with three expensive computers and set all of that up during a 20-minute changeover at a festival? That’s crazy.”

One of van den Berg’s custom suits with Xsens sensors
Image: Jan Mulders

van den Berg isn’t the only electronic artist thinking about the connection between her physical self and technology — musician Laura Escudé also uses movement-based controllers and reactive visuals, for instance — but she’s one of the most ambitious. And although every hack is self-serving in that it brings her closer to the show she has in her head, there’s ultimately a bigger goal: van den Berg hopes her experimentation will spur others to think about how their performances can be more immersive, and that she can make it easier for artists who want to do similar things.

“My dream is that I’m not the only one in the world doing stuff like this,” says van den Berg. “That electronic music performance becomes more human and more expressive.” That means being the one to deal with all the technical headaches, so others don’t have to.

“I’m kind of a guinea pig,” she laughs. “But that’s fine with me.”

van den Berg performs at SXSW. Video credit: New Dutch Wave.