Google has launched a new shopping tool that lets people see what clothes might look like on their own bodies without going near a changing room. All it takes is a full-length photo. Once uploaded, the tool places clothing on the image, giving a realistic view of how something could fit and hang on different body shapes.
This feature started out earlier in the year as a test through Search Labs. It’s now rolling out to people in the US across Search, Google Shopping and Google Images. To use it, shoppers tap the “try it on” icon next to any supported clothing item and upload their picture.
The tech behind it was trained to understand how different fabrics stretch, fold and drape across bodies. That means it tries to give a more accurate picture than those cut-and-paste apps from the past. It works with billions of items listed in Google’s Shopping Graph, so most items from known shops should be covered.
What Makes This Tool Useful Besides The Visuals?
Google has also added new price alert features. US shoppers can now set alerts that go beyond the product itself: they can pick their preferred size, colour and even how much they’re willing to spend.
If that black jacket someone’s been eyeing drops to the right price, they’ll get a heads-up. The system uses product and price data from across the web, pulled from Google’s Shopping Graph, which now covers around 50 billion items.
Danielle Buckley, Google’s Director of Product for Consumer Shopping, said, “No more constantly checking to see if that bag you’re eyeing is finally at the right price for you or forgetting to come back to a product you loved!”
Is It Just Clothes Or More Than That?
From this autumn, US users will also be able to search for outfit and home decor ideas through AI Mode, Google’s new AI chat tab launched in May. People will be able to type in things like “soft pink dress for a picnic” or “living room with light wood and plants” and get back visual suggestions with links to actual products.
These suggestions are created using vision match tech. It works by taking what the person types and turning it into visual ideas, then matching those ideas to products listed in the Shopping Graph. The idea is for people to get inspiration and options in one go, rather than going through multiple apps and websites one by one.
Who Is This Tool For?
At this stage, it’s mostly for US shoppers. There’s no word yet on when it’ll roll out to other countries. But it’s clearly being built for people who want less back and forth when it comes to online shopping. That could be anyone from students doing back-to-school shopping to people redoing their bedrooms or buying clothes for an event.
Shoppers who don’t want the uncertainty of guessing sizes, or who don’t trust model photos, could find this useful, because seeing the item on a body that looks like theirs could make a difference. The same goes for shoppers who’ve missed deals because they waited too long or didn’t save a link.
It still depends on people uploading their own photos, and not all clothes will display perfectly. But it’s a sign that Google is trying to fold more of the shopping process into its own search tools, and make it feel a bit less annoying.