Google’s drive to extract the most from artificial intelligence is evident in its steady stream of features built on AI and machine learning. Its latest update to Google Shopping is likely to take the fashion world by storm: a generative-AI-backed virtual try-on tool for apparel that lets users preview clothes on a wide selection of real models with different body types. Beyond giving a sense of a garment’s overall look, the tool captures subtle details such as how the fabric would drape, fold, stretch, and cling in various poses.
The tool is powered by Google’s in-house diffusion-based model, the same family of techniques behind text-to-image generators such as Stable Diffusion and DALL-E 2. Diffusion models start from an image made entirely of noise and gradually subtract that noise, step by step moving the result closer to the target image.
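The denoising loop described above can be sketched in a few lines. This is a deliberately simplified toy, not Google’s production model: where a real diffusion model uses a trained neural network to predict the noise to remove at each step, the sketch below computes that noise directly from the target for clarity. All names here are illustrative.

```python
import random

def toy_denoise(x, target, steps=10):
    # Toy illustration of reverse diffusion: start from pure noise and
    # repeatedly subtract the "predicted" noise, nudging the image toward
    # the target. A real diffusion model would predict this noise with a
    # trained network; here we cheat and derive it from the target.
    for t in range(steps):
        predicted_noise = [xi - ti for xi, ti in zip(x, target)]
        x = [xi - n / (steps - t) for xi, n in zip(x, predicted_noise)]
    return x

random.seed(0)
target = [random.random() for _ in range(16)]    # stand-in for a clean image
noise = [random.gauss(0, 1) for _ in range(16)]  # pure-noise starting point
result = toy_denoise(noise, target)
print(max(abs(r - t) for r, t in zip(result, target)) < 1e-9)  # True
```

Each iteration removes a fraction of the remaining noise, so the output converges to the target, mirroring how a diffusion model refines static into a coherent image.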
The tech behemoth announced the feature in a blog post: “Thanks to our latest virtual try-on apparel tool in Search, you can see whether a piece is right for you before buying it.” The tool lets users choose from models ranging in size from XXS to 4XL and representing a multitude of skin tones, hair types, body shapes, and ethnicities.
Lilian Rincon, senior director of consumer shopping at Google, noted in a statement that when we try on clothes in a store, we know immediately whether they are right for us; the motivation behind the feature was to give people that same confidence when shopping online. She cited a survey which found that a whopping 42% of online shoppers feel that images of models do not accurately represent them, while another 59% reported dissatisfaction with online purchases because the product looked different on them than they had anticipated.
The concept of virtual try-on is not entirely new. Tech giants Amazon and Adobe have been experimenting with generative apparel modelling for quite some time. Last year, Walmart released an online feature that used customers’ own photos to model clothing.
AI startup AIMIRR expands on the concept by superimposing photos of apparel on a live video of a person using real-time garment rendering technology.
Google itself has experimented with virtual try-on before, collaborating with brands such as L’Oréal, Estée Lauder, MAC Cosmetics, Black Opal, and Charlotte Tilbury to let Search users try on different makeup shades on a variety of models with different skin tones.