Google is introducing two new features for its online shopping experience that are designed to help users search for clothes in more detail and better visualize how clothing will look on different body types.
Starting today, Google Shopping users in the US can access a virtual try-on experience that realistically displays how an item of clothing will look on a selection of real human models. These models represent a variety of skin tones, ethnicities, hair types, and body shapes, ranging in size from XXS to 4XL, to help users see how a piece of clothing will look on a body type similar to their own.
Initially, only women’s tops from a selection of brands like H&M, Anthropologie, Everlane, and Loft will be available for the virtual try-on experience, with Google claiming that men’s tops and “other apparel” will be available sometime later this year. The feature has been designed to help shoppers avoid disappointment by accurately visualizing what an item of clothing will look like before they buy it. The company claims, referencing its own shopping data, that 59 percent of online shoppers are disappointed with a clothing purchase because they expected it to look different on their bodies, and 42 percent don’t feel represented by online clothing models.
The new Google Shopping virtual try-on experience uses a diffusion-based generative AI model, which is trained by adding Gaussian noise (essentially random pixels) to an image, which the model then learns to remove in order to generate realistic images. The process allows Google’s AI model to realistically depict how an item of clothing would wrinkle, drape, fold, cling, and stretch on the available range of diverse models, regardless of what angle or pose they’re in. To be clear, the models for Google Shopping aren’t AI-generated — AI is simply used to shape the clothing around images of these human models.
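Google hasn’t published its model’s code, but the forward “noising” process described above — the half of diffusion training that a denoising network learns to reverse — can be sketched in a few lines of NumPy. The noise schedule values here are illustrative, not Google’s:

```python
import numpy as np

def forward_diffusion(image, num_steps=1000, beta_start=1e-4, beta_end=0.02):
    """Return a function that produces a noisy version of `image` at step t.

    This is the 'forward' process of a diffusion model: Gaussian noise is
    progressively mixed into the image. A denoising network is then trained
    to undo these steps, so that starting from pure random pixels it can
    generate a realistic image. Schedule parameters are made up for
    illustration.
    """
    # Linearly increasing noise levels per step (a common, simple schedule).
    betas = np.linspace(beta_start, beta_end, num_steps)
    # Cumulative product gives how much of the original signal survives at step t.
    alphas_cumprod = np.cumprod(1.0 - betas)

    def noisy_at(t):
        # Closed form: x_t = sqrt(a_bar_t) * x_0 + sqrt(1 - a_bar_t) * noise
        noise = np.random.randn(*image.shape)
        return (np.sqrt(alphas_cumprod[t]) * image
                + np.sqrt(1.0 - alphas_cumprod[t]) * noise)

    return noisy_at

# Training pairs a noisy image x_t with the noise that was added; the model
# learns to predict that noise, and sampling removes it step by step.
```

At early steps the image is nearly intact; by the final step it is almost pure noise, which is why a trained model can start from random pixels and work backward to a clean image.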
New filters are also being introduced to Google Shopping today that are designed to help users find exactly what they’re looking for, such as a similar but cheaper alternative to a shirt or a jacket in a different pattern. Machine learning and visual matching algorithms let users refine searches by attributes like color, style, and pattern across various online clothing stores to find the item that best matches their requirements. The feature is available now within product listings on Google Shopping and is similarly limited to tops — Google has not said when it will be expanded to other types of apparel.
Levi’s similarly announced it was using AI to expand modeling options for online shopping back in March. Instead of using images of real people like Google, however, the denim brand said it would test using AI-generated models, initially claiming it would help “diversify” the denim company’s shopping experience. Levi’s later walked back those comments after the announcement was met with backlash but maintained that using AI-generated models would allow the brand to “publish more images of our products on a range of body types more quickly.”