A brief guide to mobile AI chips
Do I need one? What is it? Seriously, what’s going on?

One of the iPhone X’s new selling points is the “neural engine” in the new A11 chip.
Photo: Apple

Mobile AI chips. What are they actually good for?

In recent months we’ve heard a lot about specialized silicon being used for machine learning in mobile devices. Apple’s new iPhones have their “neural engine”; Huawei’s Mate 10 comes with a “neural processing unit”; and companies that manufacture and design chips (like Qualcomm and ARM) are gearing up to supply AI-optimized hardware to the rest of the industry.

What’s not clear is how much all this benefits the consumer. When you’re buying your phone, should an “AI chip” be on your wish list? If you want to use the latest AI-powered app that (just picking an example at random here) automatically identifies and hides your nude selfies, do you really need an AI chip? Short answer: no. But let’s dig a little deeper.

Why do we need AI chips*?

The reason for having mobile AI chips in the first place is pretty straightforward. Regular CPUs found in phones, laptops, and desktops just aren’t well suited to the demands of machine learning, and trying to make them do it ends up with slow service and a fast-draining battery.

AI chips are useful because AI has changed computing

Contemporary AI requires computers to make lots of small calculations very quickly, but CPUs only have a handful of “cores” available to do the math. That’s why the industry loves graphics processing units, or GPUs. These were originally designed to render video game graphics, which, coincidentally, also requires making lots of small calculations very quickly. Instead of a handful of cores, they have thousands.
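To see why thousands of cores help, here’s a minimal sketch (plain Python, for illustration only) of the kind of arithmetic a neural network spends most of its time on: a fully connected layer, where every output is an independent multiply-accumulate sum. Because the outputs don’t depend on each other, hardware with many cores can compute them all at once.

```python
# A neural-network layer is mostly independent multiply-accumulate
# operations: outputs[j] = sum_i(inputs[i] * weights[j][i]) + biases[j].
# Each output is independent of the others, so a GPU or NPU can compute
# all of them simultaneously; a CPU with a few cores computes them
# mostly one after another.

def dense_layer(inputs, weights, biases):
    """Compute one fully connected layer of a neural network."""
    return [
        sum(x * w for x, w in zip(inputs, row)) + b
        for row, b in zip(weights, biases)
    ]

# Tiny example: 3 inputs feeding 2 output neurons.
outputs = dense_layer([1, 2, 3],
                      [[1, 0, 2],   # weights for output neuron 0
                       [0, 1, 1]],  # weights for output neuron 1
                      [1, 0])       # one bias per output neuron
print(outputs)  # [8, 5]
```

Real mobile models chain hundreds of layers like this, each with thousands of neurons, which is why running them on a general-purpose CPU drains the battery so quickly.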

Now, fitting thousands of cores into a chip for your phone isn’t going to happen. But there are other architectural changes you can make to increase the amount of simultaneous work your chip can do. Qualcomm’s head of AI and machine learning, Gary Brotman, tells The Verge: “I think parallelization is certainly key, and doing it efficiently, especially.” He’s quick to add, though, that dedicated AI compute units aren’t the only way forward — other bits of chip architecture can also be adapted.

*”AI chip” is a usefully recognizable term, but it’s also imprecise. In the case of Huawei and Apple, what’s being offered is not a single, self-contained chip, but dedicated processors that come as part of a bigger SoC (or system on chip), such as Apple’s A11 Bionic. SoCs already contain various specialized components for things like rendering graphics and processing images, so adding a few cores for AI is kind of par for the course.

Huawei’s Kirin 970 chipset also comes with its own neural processing engine. Let’s hope it doesn’t glow like this when in use though.
Image: Huawei

What do we get out of it?

As mentioned above, specialized AI hardware means — in theory — better performance and better battery life. But there are also upsides for user privacy and security, and for developers as well.

On-device AI means better performance and more privacy

First, privacy and security. At the moment, a lot of machine learning services have to send your data to the cloud to perform the actual analysis. Companies like Google and Apple have come up with methods to do these sorts of calculations directly on your phone, but they’re not widely used yet. Having dedicated hardware encourages more on-device AI, which means less risk of user data getting leaked or hacked.

Plus, if you’re not sending data off into the cloud every few seconds, it means users can access services offline and save data. That latter part is a boon for developers, too. After all, if the analysis is done on-device, it saves the people running the app paying for servers. As long as the hardware is up to scratch, everyone benefits.

Is this stuff ready to use?

This next section is where things get trickier. Just because a phone has an AI chip doesn’t mean AI-powered apps and services will be able to take advantage of it.

In the case of Huawei and Apple, for example, both companies have their own APIs that developers need to use to tap the power of their respective “neural” hardware. And before they can integrate that API, they have to make sure the AI framework they used (for example, Google’s TensorFlow or Facebook’s Caffe2) is also supported. If it’s not, they’ll have to convert it, which also takes time.
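The fragmentation this creates can be pictured with a short, entirely hypothetical sketch (all names below are invented for illustration — they are not real vendor APIs): an app ships one trained model, but at runtime it has to work out whether the phone’s chipset exposes a neural accelerator its framework can actually use, and fall back to the plain CPU otherwise.

```python
# Hypothetical sketch (chipset and backend names invented): an app
# choosing an inference backend at runtime. A vendor's neural hardware
# only helps if the app's AI framework is supported on it; otherwise
# the app falls back to ordinary CPU execution.

VENDOR_BACKENDS = {
    "apple_a11": "coreml_npu",   # stand-in for Apple's neural engine path
    "kirin_970": "huawei_npu",   # stand-in for Huawei's NPU API
    "snapdragon": "qcom_nn",     # stand-in for Qualcomm's neural SDK
}

def pick_backend(chipset, supported_backends):
    """Return the vendor accelerator backend for this chipset if the
    app supports it; otherwise fall back to the CPU, which works
    everywhere but is slower and drains more battery."""
    backend = VENDOR_BACKENDS.get(chipset)
    if backend in supported_backends:
        return backend
    return "cpu"

# An app that only integrated Huawei's (hypothetical) API:
print(pick_backend("kirin_970", {"huawei_npu"}))  # huawei_npu
print(pick_backend("snapdragon", {"huawei_npu"})) # cpu -- no benefit here
```

The point of the sketch is the last line: until there’s a common interface, every vendor API a developer skips is a set of phones where the “AI chip” sits idle.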

Anthony Mullen, a tech analyst at Gartner, says navigating this patchwork of interfaces “isn’t for the faint-hearted.” Speaking to The Verge, he says: “It’ll be a while yet before people are developing elaborate experiences using this hardware. Until then there’ll be special partnerships between manufacturers and third parties.” That’s why Microsoft is working with Huawei to make sure its Translator app works offline with the company’s NPU, and why Facebook partnered with Qualcomm to integrate the latter’s AI-focused hardware to load its augmented reality filters faster.

Facebook worked with Qualcomm to make its augmented reality selfie filters (above) run faster on the company’s hardware.
Image: Facebook

But while big companies like these can afford to put in the time, it’s not clear if it’ll be worth the effort for every small app developer. This won’t be a problem for Apple, whose developers will only have to adapt their app once using the company’s Core ML framework; but it could be a headache for Android, especially if different manufacturers all start introducing their own protocols.

Thankfully, Google is using its power over the ecosystem to combat this problem. Its mobile AI framework, TensorFlow Lite, is already standardizing some experiences on mobile devices, and it’s introducing its own Android-wide APIs to "tap into silicon-specific accelerators."

“From a developer’s standpoint in the Android environment it won’t mitigate all the fragmentation risks,” says Brotman. “But it’ll certainly provide a construct to make it easier.” He adds that some of the effects of this work won’t be fully felt until Android P is ready.

So do I need an AI chip in my phone?

No, not really. So much work is being done on making AI services run better on the hardware currently available that, unless you’re a real power user, you don’t need to worry about it.

In both Huawei and Apple’s cases, the primary use of their shiny new hardware is just generally making their phones... better. For Huawei that means monitoring how the Mate 10 is used over its lifetime and reallocating resources to keep it from slowing down; for Apple that means powering new features like Face ID and animoji.

Having computing power dedicated to AI tasks is neat, sure, but so are other features of high-end handsets — like dual camera lenses or waterproofing. Boasting about AI chips makes for good marketing now, but it won’t be long before they just become another component.