Dolby Labs is using biosensors to learn how we’re reacting to movies and shows

Netflix and not chill

Academics have long studied the myriad ways video and audio affect our brains and bodies. But that research isn't happening only in university labs: Dolby Laboratories has been conducting its own internal research into how media can trigger reactions in human beings.

On a recent tour through Dolby Laboratories in San Francisco, the audio and imaging company spent most of the day showing journalists the latest developments in its proprietary high dynamic range (HDR) technology, which is being licensed to Netflix for the new series Iron Fist. One stop on the tour offered a closer look at how Dolby scientists study the way we react to what we're watching.

Inside a makeshift, soundproofed living room, a woman sat on a leather couch wearing a 64-channel EEG cap, a type of biomedical cap commonly used to measure the electrical activity generated by neurons in the brain. On her wrist was a tracker measuring heart rate and galvanic skin response (essentially, sweat); she also had a pulse oximeter on her fingertip. A thermal imaging camera was pointed at her.

This is how Dolby is trying to “better understand the human experience,” according to Poppy Crum, chief scientist at Dolby Labs. Crum runs the team of data scientists and academic researchers who work on all of this. The company first started looking at physiological responses in 2012, and in 2015 created a full-fledged biophysical lab for it. “We want to know if people are more engaged, or if their reactions are heightened at some points,” Crum said.

What scene in a movie makes your heart beat faster? What makes you sweat? What makes you fall asleep?

There were three screens in the room, along with at least a dozen speakers. One of the displays showed a series of videos to the woman on the couch; the other two showed her reaction data in real time. Her reflection appeared as a glowy, thermal image on the left, while her heart rate appeared in spikes and dips on the right.

All of this was raw data, Crum said; in this case, it was mostly for show. But Crum's team takes that information and processes it, using it to potentially inform decisions about how media is produced. Right now, Dolby has about 40 trained subjects it rotates in and out of its labs (some from within the company, some outside participants), all of whom are willing to inform the algorithms: What scene in a movie makes their hearts beat faster? What makes them sweat, or causes their cheeks to flush? What makes them fall asleep?

Dolby is hardly the first tech company to use advanced sensors to get a sense of how viewers are engaging with something. Netflix, for example, used eye-tracking technology to inform its decisions around an interface redesign in 2014, finding that viewers were constantly being forced to shift their eyes back and forth between a program’s title and its description in a sidebar. Now that virtual reality headsets are becoming more accessible to people, technologists and researchers alike are gathering troves of data around how people react to media in a more immersive environment. (Facebook-owned Oculus just bought an eye-tracking company last December, in what some saw as an effort to understand what people are looking at in VR so Facebook could better serve up ads.)

At this point, Dolby isn’t working with content creators to actually alter any kind of content. “We’re not trying to tell them that what they’re creating is wrong, but these tools and insights can certainly feed ideas and directions in the future,” Crum said. In other words, Dolby isn’t licensing these insights to its partners right now; but eventually, it could. And the next time you find yourself sweating through a movie scene: blame the algorithms.

Photography by Lauren Goode / The Verge