
Google’s new image search tools could help you identify AI-generated fakes


This summer, Google reverse image searches will be able to tell you when it first indexed a picture, and new metadata tags will help identify AI-generated images.


Image: Google

In a world increasingly filled with photorealistic images that have either been altered with AI editing tools or created using a generative AI bot like Midjourney or Stable Diffusion, how do you know if a picture is real? One thing that could help is a new tool Google is rolling out this summer for English-language searches in the US called “About this image.”

It’s similar to the “About this result” drop-down that appears on links in regular search results, but now for Google image searches. When you perform a reverse image search by uploading an image of unknown provenance, you’ll see a new menu option that shows when that picture and others like it were first indexed by Google, where on the web it first appeared, and which sites it has appeared on since.

Image: Google

Google’s example shows someone uploading a picture of a faked Moon landing, and the tool then reveals that the image has appeared in debunking stories. But that’s not the only kind of situation where this would be useful.

A Midjourney-created image tagged to mark it as an AI-generated picture.
Image: Google

For example, if a picture of a breaking news event first appeared when it was uploaded to Getty, Reuters, or CNN, then that would seem like a fair indication that it’s legit. But a picture that first appeared in a random comedy subreddit with a news organization’s watermark is more likely to be a fake — no matter how incredible the pope’s new Balenciaga outfit looks.

Assuming it works as intended, this type of tool will hopefully be copied by competitors and quickly become available beyond the US, since the spread of misinformation isn’t confined to a single country or language.

Google also announced that its own generative AI tools will embed metadata in every picture they create to mark it as an AI-generated image rather than a photo, a label that stays with the file even when it’s viewed outside Google’s platforms. Other creators and publishers will be able to label their images with the same technology, though it’s unclear how widespread participation will be.
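
To give a concrete sense of how such a label might be checked, here’s a minimal sketch in Python. It assumes the marker is written as the IPTC “Digital Source Type” term for trained algorithmic media, embedded in the image’s XMP metadata, which is one common labeling scheme; Google’s post doesn’t spell out the exact fields, and the file name and function name below are illustrative.

# Minimal sketch: look for an IPTC "Digital Source Type" marker for
# AI-generated media inside an image's embedded XMP metadata.
# Assumptions: the label uses the IPTC trainedAlgorithmicMedia term and is
# stored as XMP (plain XML embedded in the file); Google's announcement
# does not confirm the exact scheme, and "example.jpg" is a placeholder.

from pathlib import Path

# IPTC NewsCodes URI for content created by a trained algorithm
# (e.g. a generative image model).
AI_SOURCE_TYPE = b"http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"

def looks_ai_generated(image_path: str) -> bool:
    """Return True if the file's embedded metadata contains the AI marker.

    XMP is stored as readable XML inside JPEG/PNG files, so a simple byte
    search is enough for a rough check; images whose metadata has been
    stripped or re-encoded will lose the tag.
    """
    return AI_SOURCE_TYPE in Path(image_path).read_bytes()

if __name__ == "__main__":
    print(looks_ai_generated("example.jpg"))  # hypothetical local file

A byte search like this is deliberately crude, and a real checker would parse the XMP packet properly, but the point stands: the label is ordinary metadata that travels with the file and disappears if that metadata is stripped.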

Google’s blog post says Midjourney, Shutterstock, and others will roll out the markup in “the coming months.”