
Man vs. machine: why Apple doesn’t want to pick




Earlier this morning, Susan Prescott, Apple’s vice president of product marketing, took the stage to introduce Apple News. She showed off the app’s features and design, and pointed out its ability to adapt to users’ tastes. "There are powerful machine learning algorithms that analyze the content of the articles" to determine which stories are surfaced for a given user, she said. It echoed a point Craig Federighi, senior vice president of software engineering, had made earlier about iOS 9’s new Proactive Assistant: smarter, more sophisticated algorithms are letting Apple’s products provide custom-tailored features its customers have never had before.

Less than an hour later, Jimmy Iovine took the same stage to introduce Apple Music, focusing his presentation on how the service uses hand-picked playlist curators and internet radio DJs to provide the best listening experience possible. Machine-curated playlists, he insisted, were a menace — so terrible they could even screw up your sex life.

Huh?

The tension between human curation and machine learning is symptomatic of both the state of consumer electronics and the ever-developing state of the music industry. For Apple, that push and pull goes back to what Steve Jobs always claimed were the two pillars of the company: technology and the liberal arts.

Apple Music WWDC 2015


Today, most smartphones have achieved a basic level of feature parity: increasingly, differentiation comes in the form of software. As this year’s WWDC and Google I/O proved, that software’s next frontier is the ability to anticipate our needs. Our digital world simply has too much data, too many apps, and too much cultural noise for any of it to be truly useful without help. It’s a world where our devices have to become more human than human to understand us, and offer a way to filter, cross-reference, and contextually analyze the torrents of information we throw at them every moment of every day. The best devices will need to figure out what we want before we know we want it, and create utility by serving it up as quickly as possible. It’s an exciting future that isn’t far away.

It’s also a far cry from the early days of personal computing, when even something as simple as saving a file to a floppy disk required serious intent on the part of the user; today, intent is disappearing almost entirely — and as Apple’s executives continually pointed out, it’s the sophistication of modern machine-learning systems that makes that possible.

All of which made Iovine’s insistence on the importance of human involvement when it came to Apple Music so jarring. "These people are going to help you with the most difficult question in music: when you’re listening to a playlist, what song comes next?" he said. "Algorithms alone can’t do that emotional task."

"Algorithms alone can't do that emotional task."

The human-curated playlist pitch is one Iovine has been making since the launch of Beats Music. As a Nine Inch Nails fan, I’m certainly intrigued by the idea that Trent Reznor might put together a playlist of songs that inspired him. But that feature wasn’t enough to keep me shelling out a monthly fee for Beats (nor did it dent Spotify’s huge lead in the marketplace). The truth is, Apple is playing catch-up here, and it needs something to distinguish itself from the biggest streaming player. The fact that hundreds of millions of users already use iOS and have iTunes accounts will help Apple considerably, but when you’re trying to convince people to switch platforms, you need something more than a frictionless sign-up process.

Trent Reznor and Apple Music

So Apple’s competitive advantage comes in framing music as something intimate — an emotional experience that can’t be reduced to ones and zeros — and selling the service with the same sense of personality and familiarity that many of Apple’s physical products exude. It’s not just access to music they’re promising here; they’re offering the idealized, aspirational experience of music. It’s a thinly veiled swipe at Google as well: machine learning is great for appointments, phone numbers, and traffic conditions, they might as well be saying, but it can’t understand the nature of the human heart. Only Apple — standing at that intersection of liberal arts and technology — can divine what belongs to computers, and what belongs to humans.

With 60 million Spotify customers already entrenched with playlists and libraries of their own, it’s totally unclear whether this pitch will work — or how many users will even want to explore the idea of playlists created by somebody else. (I still remember a time when making a playlist for a friend was an act of care and love, and that’s something no DJ or Iovine-picked employee can ever replicate.) But what looks like a philosophical schism is really Apple sticking to its core strength: selling technology with the promise that it understands you better than any competitor.