Google says its Bard chatbot isn’t a search engine — so what is it?

Bard is not a good search engine. Neither are ChatGPT and Bing. Figuring out what they’re actually good at and how we should use them is going to take a while.

[Screenshot of an empty Bard text field. Bard looks like a search engine, though Google says it isn’t one. Image: Google]

“Bard is a complement to search.” That’s how Google describes the relationship between Bard, the new chatbot entering beta testing today, and its monolithic search engine. The way the company sees it, Bard is less a tool for finding information and more a way to automatically generate ideas and emails. And poems. And poem-emails. You want answers to search queries? That’s what Google search is for. There’s even a “Google It” button at the bottom of most Bard responses.

But the thing about Bard — and really the thing about every chatbot including ChatGPT and the new Bing — is that Google doesn’t actually get to choose how you use it. People have spent the last few months using ChatGPT to replace a search engine... and wondering what it might do to Google’s bottom line. Even Google’s competitors see chatbots as search engines: Microsoft CEO Satya Nadella said he launched the Bing chatbot to bring the fight to Google. “I want people to know that we made them dance,” he said.

All Bard is, from a user experience perspective, is a text box. And what has Google spent two-plus decades training us to do with a text box? Type search queries. Meanwhile, Google has also spent the last few years rebranding itself as “helpful,” using Google Assistant to much more directly answer user questions and adding more information to the results page so you never need to click away.

Bard is a search engine — and an extremely on-brand Google one. Whether or not Google wants to admit it, it’s currently launching the future of search.

The way Google talks about Bard right now seems to be, reasonably enough, based on what Bard can actually do. “In our initial testing, we found that people are delighted to use it for use cases like planning their neighborhood block party,” says Sissie Hsiao, a VP of product at Google and one of the Bard leads. For now at least, she called Bard a “creative collaborator” and didn’t seem bothered when Bard got a newsy query wrong. “There’s Google search for that, right?” The way she sees it, “Bard is really here to help people boost their imagination and their productivity.” 

[GIF of a Bard conversation about fly fishing. Brainstorming and planning are the kinds of things Google thinks Bard is already useful for. Image: Google]

That’s all well and good, but Bard is a general purpose chatbot: a blank text box into which people can type their questions and hopes and weird fantasies and get instantaneous feedback and responses. Even the text box itself invites exploration — all it says is “enter a prompt here.” 

Google has put some guardrails on the experience, trying to narrow Bard’s scope to only the things it does well. It refused to tell us how to make mustard gas, for instance, with one draft gently saying no and another angrily admonishing us for even trying. Google is limiting the back-and-forths in a conversation — Eli Collins, a VP at Google Research and another of the Bard leads, wouldn’t say exactly how many turns you can take, except that it’s a single-digit number — in order to keep things from spiraling out of control. Google has been testing Bard-like products for a long time, too. But if we’ve learned anything from Bing, it’s that you can never truly guess what real-world users will do. (Though a lot of it will be weird and romantic.)

The reality is, Bard’s UI still looks like a search box, and Bard is an extremely hit-or-miss search engine. So are the new Bing and ChatGPT. All are prone to hallucination, stridently offering incorrect facts or citing examples that don’t even exist. Bard doesn’t even provide footnotes or citations with its answers unless it’s directly quoting a source, so there’s no way to check its facts other than the “Google It” button. (That ultimately puts even more of the onus on Google to get things right because it can’t simply point you to information and wash its hands of the results.) And when there are things on which reasonable people disagree — like the amount of light a fern should get, per one example in Google’s demo — Bard offered one perspective without even a hint that there might be more to the story.

Google goes out of its way to remind you that Bard is still very much an experiment. There are pop-ups everywhere reminding you that what Bard tells you might be wrong, plus a notice underneath the text box that says, “Bard may display inaccurate or offensive information that doesn’t represent Google’s views.” “It’s different than other ways that we’ve incorporated ML into our products,” Collins says. The company is being cautious for a number of reasons, surely including both regulatory watchdogs and pressure on the ad business, but mostly stemming from the fact that if Google search suddenly stops providing good information, it won’t stay in business for long.

[Screenshot of Bard saying, “Bard is an experiment.” Google really doesn’t want you to forget that Bard is still an experiment. Image: Google]

That’s why Google has been slow to roll out products like Bard despite nearly a decade of work on some of the foundational technology. (Collins says that he’s been using Bard prototypes since 2019.) Even now, it hedges every time: each time you ask Bard a question, it provides three different “drafts,” each one representing a different output from the underlying model. In a demo, sometimes the three drafts were quite similar, but often, they were quite different. When my colleague James Vincent asked about the load capacity of his washing machine, the three drafts gave three completely different answers. Like so many chatbots, Bard is likely to be useful for creative ideas and low-stakes questions. (I asked for heist movie recommendations on Prime Video and got five totally serviceable options.) For anything else, it’s not to be trusted.

Someday, though, as Bard continues to progress, you are going to see it in Google search results. To some extent, you already are: the infamous “10 blue links” aren’t gone yet, but Google has been using its AI models to summarize search results for the last couple of years and to help people find new things to search for. All Bard really changes is the UI. And heck, when Google first announced Bard in February, it even included a screenshot showing AI-generated answers at the top of search results. “LLM-based features directly in search are coming soon,” Hsiao says. “And there we’re using the application of the technologies but in a different fashion.” 

In a sense, Bard actually does fit neatly into Google’s vision for the future of search. Last year, at its annual Search On event, Google executives explained that most users come to Google not seeking answers but adventures. They want to learn more about Kelsea Ballerini; they’re looking for fun things to do in a new city; they want something new to watch or read or cook tonight. 

This is the sort of thing that Bard could eventually handle really well. In that sense, it’s a co-conspirator and idea machine, rather than a question-and-answer bot. But the thing about search is, you have to do both. And while users might not notice when Bard recommends five great San Francisco restaurants but not the five best ones, they’ll surely notice if it lies about whether pad thai has peanuts. Google always likes to say that 15 percent of its search queries every day are things that have never been typed into Google before; that’s hard enough to deal with when your output is 10 blue links, and it’s a whole other ball game when you’re trying to teach a bot to cogently and accurately answer the question on its own.

Google wants you to see Bard as a fun toy, a glimpse into a far-off future. But if it looks like a search engine, talks like a search engine, and has google.com in the URL, people are going to use it like a search engine. And that could go badly for Google.