Now that Google Assistant has officially arrived on the iPhone (at least for US-based users), a lot of people may be wondering why Google released the AI as a separate app rather than folding it into the main one. Here’s an overview of the differences between the two.
While it’s not officially called “Google Search” (it goes by its plain ol’ namesake), many people refer to it as such because that’s essentially what the app is designed for. iOS users can download the app to add a nice little search widget to their home screen and use it to quickly look things up, just as they would on the website.
You can search in Google Assistant too, but Google says it wanted to keep the two apps separate because some people may not want the AI capabilities if they’re already happy with and used to Siri. You can argue that Siri is not the best for search (it defaults to Bing, and to Apple Maps for directions, for example), so keeping the Google app on hand is helpful for general queries, searches for nearby businesses, and quick updates like sports scores and movie showtimes. The Search app has also been continuously updated to deliver results almost instantaneously.
Where Google Assistant comes in handy is with personal, preference-based requests. The AI is designed to be conversational, combining search queries with task completion. For example, if you ask what restaurants are nearby, you can select from the results and follow up with a request to book a reservation. You can also tell Google Assistant your favorite movie genres, types of food, and even your family members’ or significant other’s names, so it has a record of all this information and can provide the most relevant responses.
The Assistant will also gain third-party Actions, which allow you to control smart home gadgets or order food deliveries without leaving the app or manually entering an address and credit card number.
Though Google Assistant is designed to take on Siri, there are several things it cannot do on an iPhone. For example, you can’t use it to take a screenshot and share it on the web like you can on Android, ask it to take a selfie, or use it to set alarms. Asking Google Assistant to make calls or send an iMessage works, but it takes a couple of steps more than just using Siri. You also can’t easily wake Google Assistant with the “OK Google” voice command or by holding the home button — instead, you have to launch the actual app or hold down the microphone button in the widget. Google blames these limitations on the API it was given to port the Assistant to iOS.
Restrictions aside, once Google Lens arrives in the Assistant, you’ll have much better contextual image search, which includes things like taking a photo of your Wi-Fi router to automatically log in to the network, or hovering the camera over a concert hall marquee to learn more about the band. It’s also going to bring over the Word Lens feature from the Translate app to help translate foreign languages. Siri can’t do these things (not yet, anyway), and neither can the iOS Google app.
If you’re already happy using Siri as an assistant for search, setting reminders, booking reservations, and launching apps, you probably won’t need the Assistant on top of the additional Google Search app. (You also may not need the Search app at all, if you’re satisfied with just asking Siri to “Google” something.) But if things like HomeKit have been disappointing you with their limited ecosystems, or you want to combine better search with task requests, perhaps the Assistant is the better AI for you. Either way, Google wants to give you the option to stick with just search if that’s all you need it for on an iPhone.
To give Google Assistant a shot, you can download it from the App Store for free here.