Apple and Google are heading in the same direction but on different paths

Both companies are doing very similar things

Developer conference season is behind us, leaving in its wake a huge pile of software updates to come, promises of fancy features, and a slightly clearer direction for where the computer platforms we use every day are going.

The last of them is always Apple’s WWDC, and this year, I was struck by a thought that I can’t seem to shake even though I know it’s not completely accurate: this was a very Googley year for Apple’s product announcements.

On a surface level, that feeling comes from the fact that Apple had a lot to talk about. It has four major software platforms (at least), and it has updates for all of them. Running through all of that makes for a long and somewhat scattershot keynote. Google I/O has always been similar. The biggest job I have after a Google keynote is trying to find a coherent narrative thread that ties the announcements together. Apple is usually pretty good at presenting a vision. But this year, there was so much to go over that I don’t know how that would have been possible.

Another surface-level reason is that Apple announced a few features that are quite similar to products Google is working on. Both companies are releasing dashboards for your phone that will tell you how much you’re using it (answer: too much). Apple went a long way toward fixing the notification problem on iOS by adding features that have long been on Android: grouped notifications and the ability to turn notifications off without going spelunking through your settings.

Both companies released new versions of their respective augmented reality frameworks that allow multiple devices to “see” the same digital objects in space. Google’s solution is cross-platform and depends on “cloud anchors,” while Apple’s solution can work locally, with devices communicating directly and not sending any information to the cloud.
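
To make the contrast concrete, here’s a minimal sketch in Swift of what Apple’s device-to-device approach looks like: ARKit 2’s world maps sent over MultipeerConnectivity. This is my own illustrative code, not Apple’s sample code; the function names and the surrounding session setup are assumptions.

    import ARKit
    import MultipeerConnectivity

    // Illustrative sketch: one device serializes its ARWorldMap and sends it
    // straight to nearby peers; nothing is uploaded to a cloud service.
    func shareWorldMap(from arSession: ARSession, over mcSession: MCSession) {
        arSession.getCurrentWorldMap { worldMap, _ in
            guard let map = worldMap,
                  let data = try? NSKeyedArchiver.archivedData(
                      withRootObject: map, requiringSecureCoding: true)
            else { return }
            try? mcSession.send(data, toPeers: mcSession.connectedPeers, with: .reliable)
        }
    }

    // The receiving device relocalizes against the shared map, so both
    // phones resolve the same anchors in the same physical space.
    func adoptWorldMap(_ data: Data, into arSession: ARSession) {
        guard let map = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARWorldMap.self, from: data) else { return }
        let config = ARWorldTrackingConfiguration()
        config.initialWorldMap = map
        arSession.run(config, options: [.resetTracking, .removeExistingAnchors])
    }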

The new version of Apple Photos on iOS borrows a ton of stuff from Google Photos. It has a “For You” section that automatically puts neat little effects on your photos. It has more advanced search, which allows you to string lots of modifiers together to find what you’re looking for. It also has suggested sharing, where Apple Photos can identify who’s in your pictures and offer to create a shared album with them.

All of those things already exist on Google Photos, but as with AR, Apple’s way of doing things is very distinct from Google’s. Apple keeps photos end-to-end encrypted, and it’s very clear that its AI works on-device instead of leaning on a cloud infrastructure.

But I think the reason that this year’s WWDC felt a little Googley is that both companies are trying to articulate a vision of computing that mixes AI, mobile apps, and the desktop. They’re clearly heading in the same general direction.

As a first example, take Shortcuts on iOS and Actions / Slices on Android P. Both are attempts to get smart assistants to do a better job of communicating with apps. The idea is to allow you to do the stuff you’d normally do in an app and bust it out into your phone’s search or into the smart assistant. I think it’s an exciting trend, but I do worry that in both cases there’s a risk of the old Microsoftian “Embrace, Extend, Extinguish” strategy on the horizon.

All we really wanted to hear from Apple was that it’s fixing Siri (or at least adding multiple timers), but the company chose not to address those concerns; instead, it introduced Siri Shortcuts. Shortcuts are based on the Workflow app Apple acquired, and I think they’re a fairly smart way for Apple to add functionality to Siri without needing to gather as much data as the Google Assistant.
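
That lighter data footprint is possible because apps simply “donate” the things you do in them to the system. Here’s a rough Swift sketch of what a donation looks like with the NSUserActivity hooks iOS 12 adds for this; the activity type, title, and phrase are made-up examples.

    import Intents

    // Illustrative donation: the app tells the system about an action the
    // user just took, so Siri can suggest it later. All strings are made up.
    let activity = NSUserActivity(activityType: "com.example.coffee.reorder")
    activity.title = "Order my usual coffee"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true  // lets Siri surface it as a suggested shortcut
    activity.suggestedInvocationPhrase = "Coffee time"
    // Assigning it to the visible view controller records the donation:
    // viewController.userActivity = activity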

It’s also a fascinating example of the different philosophies of these two companies. With Actions / Slices on Android P, app developers simply make a bunch of stuff available to Google Assistant, and then users go searching (or asking) for them. Instead of configuration, there’s a sense that you have to trust Google to just figure out what you want. Because Google is so good at that, I have high hopes that it will work.

But with Shortcuts, you have to do a lot of the configuration yourself. You look for an “Add to Siri” button, you record your own trigger phrase, and maybe you even chain shortcuts together if you’re a power user.
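
From the developer’s side, that configuration flow is mostly system-provided. Here’s an illustrative Swift fragment, assuming the activity donation from the earlier sketch: the app wraps it in a shortcut, and iOS presents the screen where you record your personal phrase.

    import IntentsUI

    // Illustrative "Add to Siri" flow: wrap the donated activity in a
    // shortcut and let the system UI prompt the user to record a phrase.
    let shortcut = INShortcut(userActivity: activity)
    let addToSiri = INUIAddVoiceShortcutViewController(shortcut: shortcut)
    // A real app would set a delegate and present it, e.g.:
    // addToSiri.delegate = self
    // present(addToSiri, animated: true)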

Siri can do some of the same machine learning that the Google Assistant does to create suggested shortcuts (on device, of course), so the differences here aren’t as big as they might first appear. But for the most part: on Android, you put your faith in Google to figure it out; on iOS, you configure it yourself.

If there’s an area where it’s clear that both Apple and Google are thinking along similar lines, it’s moving mobile apps to the desktop. Again, their approaches here are as radically different as the companies are.

Google has been putting Android apps on Chrome OS for a while now. They’re not ports; they’re just straight Android apps running on Chromebooks, which means they don’t feel native to Chrome OS. There are some nice integrations (like notifications), but you can’t resize windows yet. Basically, Google’s approach was to throw a beta out into the world and then iterate. That iteration has taken longer than I’d like, but it’s happening.

Apple, on the other hand, is looking to find a way to make iOS apps feel native to the Mac — so much so that it’s probably not even correct to call them iOS apps. (Apple told me it’s not correct to call them “ported” apps, either.) It was a very Google move to announce this was happening so far ahead of a developer release, but it’s a very Apple move to insist that the apps feel native to the Mac and to test these apps before sharing APIs with the world.

In both cases, as Chaim Gartenberg and I talked about here, the goal is to take some of the momentum in mobile apps and bring it back to the desktop. There’s a recognition that the way we use our laptops could benefit from mobile apps. Ironically, this is precisely what Microsoft has been struggling to achieve with Windows — though the difference is that iOS and Android have a much larger base of apps to work with.

It’s fair to say that Apple is acting just a little more like Google when it comes to its ultimate goals, but it’s also fair to say that both of these companies see the same trends happening in computing, and so they are triangulating their platforms in complementary ways.

Despite all the similarities, there’s still one massively important difference. It’s not privacy (though that’s a big one). It’s that Apple is going to do a better job of getting its innovations into people’s hands. When iOS 12 comes out later this year, it’s going to land on hundreds of millions of devices. When Android P ships later this year, it will hit a tiny fraction of Android’s install base. And that is forever a key advantage: Apple ships.