
Netflix Chief Product Officer: expect 4K streaming within a year or two

Neil Hunt talks site outages, streaming 4K, and the company's gold mine of user data


Neil Hunt is likely the most important Netflix executive that nobody's ever heard of. While everyone in tech media knows CEO Reed Hastings and Chief Content Officer Ted Sarandos, Hunt's kept a low profile despite the pivotal position he holds as the company's chief product officer. Hunt looks after the video service's technology, including the streaming platform, as well as the tech behind the new feature announced yesterday, which will enable subscribers to share what they watch with their Facebook friends.

Hunt also oversees the unprecedented amount of user data that Netflix sits on. The data helps his team create the algorithms that support Netflix's recommendation features. At a time when the cost of licensing content is spiking, Hunt said in an exclusive interview with The Verge that these recommendations help Netflix and its subscribers get the most out of Netflix's video library by suggesting titles that customers are more likely to enjoy. Hunt also talked about those nasty holiday site outages, why Hollywood and Web movie distributors need better supply-chain technology, and why he thinks the film industry places too much emphasis on pixel counts when the sweet spot for improving viewing quality is higher frame rates.

What is the biggest technological hurdle for you right now? Where are you most focused?

"We expect to be delivering 4K within a year or two."

A big piece is the actual delivery: making sure that we can continue to deliver 30 percent or more of total internet downstream traffic. That's a big project. Our initiative with Open Connect (Netflix's single-purpose content distribution network, which helps service providers cut down on costs) moves the bits as close to the consumer as possible and provides a direct connection to ISPs. As part of Open Connect, we pay the cost of installing Open Connect servers at common peering points or within an ISP's network at the most effective point. This makes it easier and more efficient for an ISP to deliver the best quality Netflix video, including our Super HD and 3D streams, by caching Netflix videos close to consumers and thus offloading that traffic from a good part of an ISP's network.

How is that going?

We've had a lot of success. In Europe, essentially all of our traffic is delivered through Open Connect. We never set up commercial CDN infrastructure there, and have been successful at deploying Open Connect for most ISPs. We've been using our own technology from the start. The major ISPs in Europe like British Telecom have embraced Open Connect and deployed servers very rapidly. When we went to the Nordics, we had wide engagement with Open Connect. Open Connect is the way forward. It is the new platform for delivering TV shows and movies from Netflix. We engineered Open Connect specifically to deliver streaming video from Netflix, unlike commercial CDN products.

For example, we're able to upload new content to Open Connect caches overnight in off-peak traffic, because we know the content and what will be popular tomorrow; we don't fill on cache-miss, like most CDNs do. That takes a meaningful load off the upstream ISP networks. We bring in hundreds of hours a week of new content, encoded in many formats and different bitrates for all the devices we support. That's a fair amount of traffic pushed out to the servers every night.
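To make the contrast between proactive overnight fill and fill-on-miss concrete, here is a minimal, hypothetical sketch in Python. The `EdgeCache` class, the `origin.fetch` call, and the off-peak window are invented for illustration; this is not Netflix's actual Open Connect software.

```python
# Hypothetical sketch: proactive off-peak prefill vs. fill-on-miss caching.
# All names here are illustrative assumptions, not a real Open Connect API.

from datetime import time

OFF_PEAK_START = time(2, 0)   # assumed off-peak window local to the cache
OFF_PEAK_END = time(6, 0)


class EdgeCache:
    def __init__(self):
        self.store = {}

    # Fill-on-miss (typical commercial CDN behavior): the first request for a
    # title travels upstream at request time, often during peak hours.
    def serve_fill_on_miss(self, title, origin):
        if title not in self.store:
            self.store[title] = origin.fetch(title)   # upstream traffic right now
        return self.store[title]

    # Proactive fill (the approach described above): push tomorrow's
    # predicted-popular titles overnight so peak-hour requests are served locally.
    def prefill(self, predicted_titles, origin, now):
        if OFF_PEAK_START <= now <= OFF_PEAK_END:
            for title in predicted_titles:
                if title not in self.store:
                    self.store[title] = origin.fetch(title)
```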

You don't think this push for the 4K format and ultra high-definition by the film studios and TV manufacturers is an attempt to wall out streaming services like Netflix? That's a lot of info to stream.

On the contrary. Streaming will be the best way to get the 4K picture into people's homes. That's because of the challenges involved in upgrading broadcast technologies, and because 4K isn't anticipated within the Blu-ray disc standard. Clearly we have much work to do on compression and decode capability, but we expect to be delivering 4K within a year or two, with at least some movies, and then over time become an important source of 4K. 4K will likely be streamed first before it goes anywhere else. To that point, our own original House of Cards was shot in 4K. It's being mastered in full HD, but a good chunk of the raw footage exists in 4K, and we hope to have some House of Cards 4K encodes later this year.

Are people really asking for this? Is there a demand for higher quality video?

Our goal is for people to get immersed in the story, whatever that is. And to that end we try to make the technology as seamless and smooth as possible. If people notice the rendering of the picture or the user interface, then that subtracts from the experience we're looking for. The goal is to deliver the best possible picture that your equipment, network, and source material are capable of. That way, we let people connect most closely to what they're watching. But we intend to stay on the leading edge of what I call the "quality of experience," so that poor quality does not become a discussion or competitive point.

If you were talking to the Hollywood film studios and TV networks now, what would you tell them?

"I can't imagine any other industry surviving when they misdeliver 3 out of 10 different assets."

The message is that we need to get better quality sources, closer to the original, but mostly they need to do a better job of managing and tracking the assets. We have a ridiculous 30 percent reject rate of assets delivered to us. I can't imagine any other industry surviving when they misdeliver three out of ten different assets. We get the wrong episode, or we get a soundtrack that doesn't match the content, or it has a giant drop-out, or the ads haven't been stripped out. Lots of these problems come down to tracking and management. We need a digital asset-management system that is shared across the industry, a standard or format. A couple of decades ago Walmart pushed very hard on EDI [electronic data interchange, the systems used by big companies to send and track orders] and on working with their suppliers to get robust, on-time delivery. We need to go through that process with delivery of digital assets.

Then what we need is a high quality digital mezzanine, the industry term for the master files used to transfer assets. We generally now get at least full 1080p high-bitrate video from most suppliers. We're pretty agnostic in terms of format, whether it's MPEG-2 or H.264 at a high bitrate. Some of it comes in a professional encoding standard that's lightly compressed. We get maybe 50 or 100 megabits per second and that's okay, although we'd be happy with formats even closer to the original, if available. We have plenty of storage in the cloud. We can handle that stuff. The format it comes in is less important than making sure it's the right thing, with the right amount of lead-in, that it doesn't truncate at the end, that the audio matches the video, and that the subtitles match the audio. It can be a big pain.
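The failure modes Hunt lists suggest the kind of automated checks an ingest pipeline might run. The sketch below is a hypothetical illustration only; the `AssetMetadata` fields, thresholds, and checks are assumptions, not a description of Netflix's actual asset-QC system.

```python
# Hypothetical sketch of automated ingest checks implied above: is it the
# episode we expected, does the soundtrack match the video, is the file
# truncated? All fields and thresholds are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class AssetMetadata:
    series: str
    episode: int
    video_duration_s: float
    audio_duration_s: float
    expected_duration_s: float


def validate_asset(meta: AssetMetadata, expected_series: str, expected_episode: int):
    problems = []
    if (meta.series, meta.episode) != (expected_series, expected_episode):
        problems.append("wrong episode delivered")
    if abs(meta.video_duration_s - meta.audio_duration_s) > 0.5:
        problems.append("soundtrack does not match the video")
    if meta.video_duration_s < meta.expected_duration_s - 5:
        problems.append("content appears truncated")
    return problems
```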

What do you think is causing the studios to err this way?

Traditionally, content suppliers have delivered their content in a different way. They send it to a theater on a film print, and it goes once to a Blu-ray mastering posthouse. Then they get the packaged Blu-ray and send it out to all distribution channels that way. Until fairly recently, there hasn't been an opportunity to exploit the assets in the digital world in the way that we're trying to do now. It's still a very immature industry, and it hasn't developed the necessary robustness and practices. We've come a long way since we started doing this, though; a lot of our stuff would come in on a terabyte hard drive that we would ship back and forth.

Are they trying to fix this?

It's much better than it was, but I would say that not everyone has seen the light. We have a team working on making that better.

"They've missed a beat in terms of figuring out that frame rate matters more than pixel count."

You've told me that delivering a good quality experience isn't just about concentrating on pixels.

I feel like as we've gone from VHS to standard-def DVD, to high-definition, to now ultra-high-def 4K, what we've done is doubled or quadrupled the spatial resolution, the pixel count, at every step, and yet fundamentally the frame rate is still largely 24p, or sometimes 30p. The pixel density is already higher than your eyes can resolve unless you get very close to the screen, so we definitely have saturated the spatial density.

"The Amazon outage was unfortunate timing. They're more embarrassed about it than we are."

But the industry has been slow to adopt higher frame rates, which I think are now a much more significant way to make a better quality picture for consumers to enjoy. I would love to see the industry get to 60p as a routine standard for shooting material in the first place, instead of the exception. The ultra-HD standard allows for 48p, 60p, and 120p frame rate delivery, but there are a bunch of pieces missing along the way: the encoders don't necessarily support the high frame rates, and the current HDMI connector standard doesn't support the full 120p frame delivery rate. We have a lot of work to do as an industry to make frame rate catch up to the same kind of quality as pixel resolution.

I think that the pixel count is easily identifiable and prints out on a glossy brochure and fits nicely into marketing material, so the industry has pushed hard on upping that number, but they've missed a beat in terms of figuring out that frame rate matters more than pixel count.
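A quick back-of-the-envelope calculation makes Hunt's trade-off concrete: going from 1080p to 4K quadruples the raw pixel throughput, while doubling the frame rate only doubles it. These are uncompressed pixel rates used purely as illustrative ratios, not delivered streaming bitrates.

```python
# Rough arithmetic: pixel throughput of higher resolution vs. higher frame rate.
formats = {
    "1080p @ 24 fps": (1920 * 1080, 24),
    "1080p @ 60 fps": (1920 * 1080, 60),
    "4K (UHD) @ 24 fps": (3840 * 2160, 24),
}

base = 1920 * 1080 * 24  # 1080p24 as the reference point
for name, (pixels, fps) in formats.items():
    rate = pixels * fps
    print(f"{name}: {rate / 1e6:.0f} million pixels/s ({rate / base:.1f}x 1080p24)")
```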

Let's talk about Amazon and this Christmas Eve outage.

That was an unfortunate outage. To be sure, we've embraced Amazon as a robust solution for providing the high availability that we're looking for. We aspire to deliver 99.99 percent availability, measured as the percentage of the time that people click the play button and content plays. Most of the time we're above 99.9 percent. Christmas Eve clearly took us below 99.9 percent. I would say accidents happen, and the Amazon outage was unfortunate timing. They're more embarrassed about it than we are.
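For a sense of what those availability figures mean in practice, here is the simple downtime arithmetic behind them (a 30-day month is assumed for the calculation).

```python
# Downtime implied by the availability targets mentioned above.
minutes_per_month = 30 * 24 * 60  # assume a 30-day month

for availability in (0.999, 0.9999):
    downtime = (1 - availability) * minutes_per_month
    print(f"{availability:.2%} availability ~= {downtime:.0f} minutes of downtime per month")
```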

You're satisfied with Amazon's explanation?

The component that failed, the Elastic Load Balancer, is the very front end, and it's a component that is hard for us to replace with our own. Unless we deploy our own hardware, we're going to use Amazon's load balancers to make this stuff work. We've been aware that the architecture of the ELB has not been where it ought to be in terms of providing a robust solution. For a long time we've been pushing Amazon to improve that. Their commitment was to move their retail operation onto the same Amazon Web Services ELB that they're offering as a product to everyone else. The Christmas Eve accident, I think, revealed that it's not as robust as it should be, and Amazon has committed to us and to others to rebuild it with a better architecture and come up with a solid product.

In the meantime, we do have some alternatives: we can figure out how to employ our own load balancing within the AWS infrastructure, and we'll be working hard on that. But I would say that in general AWS has given us higher availability and robustness than we would have had in our own data center throughout the year. Even though that was a significant and unfortunate outage, I still think that in the end the decision to move to AWS was a good one.
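As an aside on what "our own load balancing" can look like in general, the sketch below shows the common client-side pattern of rotating requests across healthy instances instead of relying on a managed load balancer. It is a generic illustration under assumed interfaces (`instances`, `is_healthy`), not Netflix's implementation or any real AWS API.

```python
# Minimal sketch of client-side load balancing: round-robin over instances,
# skipping ones that fail a health check. All names are illustrative.

import itertools


class ClientSideBalancer:
    def __init__(self, instances, is_healthy):
        self.instances = instances
        self.is_healthy = is_healthy          # e.g. a periodic health-check callback
        self._cycle = itertools.cycle(instances)

    def pick(self):
        # Try each instance at most once per call, skipping unhealthy ones.
        for _ in range(len(self.instances)):
            instance = next(self._cycle)
            if self.is_healthy(instance):
                return instance
        raise RuntimeError("no healthy instances available")
```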

Have you looked at any of Amazon's competitors in this area?

"Google and Microsoft both have technology solutions ... I would say that Amazon's technology is a very meaningful step ahead of them."

Google and Microsoft both have technology solutions, so they're chasing after similar customers. I would say that Amazon's technology is a very meaningful step ahead of them, in terms of having a complete solution of low-level components that can be assembled into a package that works.

You may know that we've done a lot of work with open-sourcing. We have what we call the "Netflix Platform," which is essentially how we compose the 20 or 30 services from AWS that we use into a coherent platform that our secret sauce can run on. By open sourcing, we encourage our peer companies working on other applications to use and contribute to that same platform and technology base. The goal is to make sure the community coalesces around one platform solution that's good, and to make it the best one by all contributing to it, rather than diversify into a lot of privately implemented platforms. We think it gives the industry a great way to move forward.

Everybody speculates that your user data is a huge pile of gold, an unprecedented record of what people watch and how they watch. What can you do with this information?

We are programmers assembling interesting TV shows and movies for our members to watch. We don't have everything ever made, but a selection. If we're to make the most value from what we have available to stream, we need to be able to suggest and recommend the right titles to our members, rather than expect them to come to us with an idea of what they want to see.

"If we're to make the most value from what we have available to stream, we need to be able to suggest and recommend the right titles."

People think they want to watch what has been marketed heavily in the past weeks on TV, radio, billboards, and so on. We typically don't have those titles, because they are way too expensive to fit into $8 a month. So we have to change the model and suggest or recommend titles that people find interesting and compelling. That way we build value out of content that we can afford. So how do we do that? Our members watch more than a billion hours on Netflix a month, and the number continues to grow. That represents a huge pool of data about what our members watch. We don't need to make it spooky for people, but it does provide an opportunity to learn what people love to watch, what they watch in what order, where they stop watching, and which titles lead to others: if you watch this, then you might watch that.
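A toy example of the "if you watch this, you might watch that" idea is simple co-occurrence counting: recommend the titles most often watched by the same members. This is a deliberately simplified sketch for illustration, not Netflix's recommendation algorithm, and the sample viewing histories are made up.

```python
# Toy co-occurrence recommender: count title pairs watched by the same member.

from collections import Counter
from itertools import combinations


def co_occurrence(viewing_histories):
    pairs = Counter()
    for titles in viewing_histories:            # one set of titles per member
        for a, b in combinations(sorted(set(titles)), 2):
            pairs[(a, b)] += 1
            pairs[(b, a)] += 1
    return pairs


def recommend(title, pairs, top_n=3):
    scores = {b: n for (a, b), n in pairs.items() if a == title}
    return sorted(scores, key=scores.get, reverse=True)[:top_n]


# Made-up viewing histories purely for illustration.
histories = [
    {"House of Cards", "Lilyhammer", "Breaking Bad"},
    {"House of Cards", "Breaking Bad"},
    {"Lilyhammer", "House of Cards"},
]
pairs = co_occurrence(histories)
print(recommend("House of Cards", pairs))   # e.g. ['Breaking Bad', 'Lilyhammer']
```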

How do you know when what you're doing works?

We continuously test ways to improve our recommendation technology. We use many different algorithms to populate the rows of titles you see on Netflix. Every change we make to our algorithms, or to any part of our service for that matter, gets A/B tested. This means that we give a broad group of our members the updated experience and keep others on the existing experience. We then measure which version generates the most streaming activity and, as a result, is most successful at keeping our members with us longer.
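The mechanics Hunt describes boil down to assigning each member to a variant and comparing a streaming metric between groups. The sketch below shows one common way to do that (deterministic hashing of a member ID); the assignment scheme and metric are assumptions for illustration, not Netflix's actual experimentation system.

```python
# Hedged sketch of A/B test mechanics: stable variant assignment per member,
# then compare average streaming hours between groups. Illustrative only.

import hashlib
from statistics import mean


def assign_variant(member_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    digest = hashlib.sha256(f"{experiment}:{member_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF          # stable value in [0, 1]
    return "updated" if bucket < treatment_share else "existing"


def compare(streaming_hours_by_member, experiment="new-row-algorithm"):
    groups = {"existing": [], "updated": []}
    for member_id, hours in streaming_hours_by_member.items():
        groups[assign_variant(member_id, experiment)].append(hours)
    return {variant: mean(hours) for variant, hours in groups.items() if hours}
```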

My girlfriend and I use the same account. When will you guys force us and others who do the same to pay for two accounts?

One account is good for two streams at a time, and we're testing a new feature called "Profiles" that will let households with shared Netflix accounts create profiles for each user, so you will continue to get personalized recommendations. If you have kids, you can have them use a kids profile, and if you want the mashup of family recommendations you could even create a family profile that you use when watching together.

We hope you continue to get lots of value from Netflix, that you explore the Profiles and Social features, revel in the better and better picture quality, enjoy "House of Cards" and our other originals, and recommend us to all your friends because that's how we know we are achieving our goal of a delightful service that exceeds expectations.