Siri's big upgrades won't matter if it can't understand its users

Apple announced sweeping changes to its software this week, including major updates to iOS and its newly named macOS desktop operating system. The company also overhauled Siri, announcing that the voice assistant will finally be available on the desktop and — more importantly — opening up the service to third-party developers. That’s an important step that Apple needed to take in order to keep up with the competition, and to pave the way for Siri to become less of a novelty and more of a powerful platform.

What Apple didn’t talk about was solving Siri’s biggest, most basic flaws: it’s still not very good at voice recognition, and when it gets it right, the results are often clunky. And these problems look even worse when you consider that Apple now has full-fledged competitors in this space: Amazon’s Alexa, Microsoft’s Cortana, and Google’s Assistant.

Our own Walt Mossberg summed up this problem pretty succinctly last month. "On too many occasions, Siri either gets things wrong, doesn't know the answer, or can't verbalize it," he wrote. "Instead, it shows you a web search result, even when you're not in a position to read it."

That’s the most frustrating part about Siri. All voice assistants occasionally get things wrong, but Siri tends to misunderstand users more often. When you finally do get Siri to understand you, you often quickly run into its limitations.

Mossberg goes on to write that improvements to Siri — specifically the "Proactive Assistant" feature, which made Siri a bit more like Google Now — have not been enough to make it more widely useful. Proactive can glean local information stored on your iPhone and in Apple’s stock apps, but it still can’t (and won’t) intermingle with the cloud. That’s why Mossberg calls Siri "one of the tech world's biggest wasted opportunities."

He’s right! Siri was introduced five years ago, a project fostered by Steve Jobs and announced by Phil Schiller just before Jobs died. And it felt amazing at the time to see a feature like Siri in a phone. But the service has languished while Apple’s competition met, and then exceeded, the voice assistant’s capabilities.

A big part of that could be the AI that powers Siri. Google, Microsoft, and Amazon have all used neural networks from the get-go to make their assistants better at recognizing speech, figuring out what you’re asking, and responding faster. Apple started shifting in this direction in 2014, but it’s unclear how big a role the technology plays to this day. (The creators of Siri, which Apple acquired in 2010, left the company because they weren’t happy with the development path it was on.)

Siri lags behind the competition on both the software and hardware side

Apple squandered its head start in other ways, too: its competitors are now building their assistant services into other apps and hardware. Just last month, Google built its digital assistant into Allo, a brand-new messaging app. (Apple’s new version of iMessage, shown yesterday at WWDC, looks a lot like Allo once you take away the assistant features.) Amazon has built out a small army of Alexa-enabled hardware devices, and one of them — the Dot — can even plug into third-party speakers and add Alexa to them. The Echo, Amazon's flagship Alexa device, tackles the speech recognition and interaction problem with more than just software — it uses seven microphones to pick up and understand a user's voice.

Siri can now catch up to those competitors on the adaptability side, because bringing in third-party apps will let the digital assistant know more about its users. Apple has built enough walls around user data that Siri has had trouble becoming as fine-tuned an experience as, say, Google Now, which can automatically present things like flight information or package tracking because it is able to crawl through a user’s email. Apple’s never been able to do that with Siri, and Proactive Assistant was too small a step. The more third-party integrations developers come up with, the better off that side of Siri will be now that it’s a more open platform.

But the value of those features will really depend on whether Apple can clean up the mess that is the first steps of using Siri. Speech recognition and interaction are basic problems, but they're big ones, and they're going to matter even more as Siri’s capabilities expand.

Correction: This article mistakenly stated that Siri was introduced by Steve Jobs in 2011. It was announced by Phil Schiller. The article has been changed to reflect that.