
The iPhone X’s new neural engine exemplifies Apple’s approach to AI

Artificial intelligence in your hand, not in the cloud

Apple’s new iPhone X is billed as “the future of the smartphone,” with new facial recognition and augmented reality features presented as the credentials to back up this claim. But these features wouldn’t be half as slick without a little bit of hidden futurism tucked away in the phone’s new A11 Bionic chip: Apple’s new “neural engine.”

The neural engine is actually a pair of processing cores dedicated to handling “specific machine learning algorithms.” These algorithms are what power various advanced features in the iPhone, including Face ID, Animoji, and augmented reality apps. According to Apple’s press materials, the neural engine performs “up to 600 billion operations per second” to help speed AI tasks (though this figure is hard to put in context; operations per second is never the sole measure of performance).

What’s clear about the neural engine is that it’s typical of Apple’s approach to artificial intelligence. AI has become increasingly central to smartphones, powering everything from speech recognition to tiny software tweaks. But to date, AI features on mobile devices have mostly been powered by the cloud. This spares your phone’s battery by keeping the work off its processor, but it’s less convenient (you need an internet connection for it to work) and less secure (your personal data is sent off to far-away servers).

Apple unveils its new neural engine on stage at the iPhone X event.

Apple’s approach is typical of the company’s ethos: it’s focused on doing AI on your device instead. We saw this back in June 2016, when the company introduced “differential privacy” (using statistical methods to mask users’ identity when collecting their data), and at WWDC this year when it unveiled its new Core ML API. The “neural engine” is just a continuation of the same theme. By having hardware on the phone itself that’s dedicated to AI processing, Apple sends less data off-device and better protects users’ privacy.
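For developers, that on-device philosophy shows up most clearly in Core ML. As a rough illustration (the MLModel and MLDictionaryFeatureProvider calls are real Core ML API; the model name “Classifier” and its input dictionary are hypothetical placeholders for whatever model an app actually ships), inference runs against a model compiled into the app bundle, so nothing has to be sent to a server:

```swift
import CoreML
import Foundation

// A minimal sketch of on-device inference with Core ML. The types used here
// are real Core ML API; the model name and inputs are hypothetical.
func predictOnDevice(inputs: [String: Any]) throws -> MLFeatureProvider {
    // Locate a compiled Core ML model bundled inside the app.
    guard let modelURL = Bundle.main.url(forResource: "Classifier",
                                         withExtension: "mlmodelc") else {
        throw CocoaError(.fileNoSuchFile)
    }

    // Load the model and run the prediction entirely on the device:
    // no network connection is needed and no user data leaves the phone.
    let model = try MLModel(contentsOf: modelURL)
    let features = try MLDictionaryFeatureProvider(dictionary: inputs)
    return try model.prediction(from: features)
}
```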

The iPhone-maker isn’t the only company to pursue this approach. Chinese tech giant Huawei put a similar “Neural Processing Unit” in its Kirin 970 system-on-chip, saying it can handle tasks like image recognition 20 times faster than a regular CPU. Google has developed its own method of on-device AI, called “federated learning,” and has hinted that it too is working on mobile chips for machine learning. ARM has reconfigured its chip designs to favor artificial intelligence, and chipmaker Qualcomm says it’s only a matter of time before it, too, launches its own mobile AI chips.

So although the iPhone X’s neural engine is typical of Apple’s approach to AI, it’s not just the company’s particular quirk. It’s the future of the whole mobile industry.