Google brings AI into online shopping, aiming for a more diverse e-commerce experience. Credit: Google
Cher’s virtual closet in the 1995 film classic Clueless is one of the most coveted futuristic tech suggestions of the pre-Y2K era. It’s also the closest comparison Google has for its latest virtual try-on tool, which uses a new AI model to offer an online shopping experience that’s nearly as customizable as the one owned by the Beverly Hills fashionista.
Launched today, Google’s virtual try-on (VTO) aims to make online shopping feel more like buying apparel in-store, Google explains: realistic try-on images offer more size- and skin-tone-inclusive options for online customers, while apparel suggestions mimic the service a customer would get from a personal sales associate.
“42 percent of online shoppers don’t feel represented by images of models and 59 percent feel dissatisfied with an item they shopped for online because it looked different on them than expected,” the company wrote in its announcement. “Our new guided refinements can help U.S. shoppers fine-tune products until you find the perfect piece. Thanks to machine learning and new visual matching algorithms, you can refine using inputs like color, style, and pattern. And unlike shopping in a store, you’re not limited to one retailer: You’ll see options from stores across the web. You can find this feature, available for tops to start, right within product listings.”
The new tech is marketed as the most advanced version of what we’ve come to know as Augmented Reality (AR) try-on options, like the Metaverse makeup experiences released last year, the many beauty filters introduced regularly on TikTok, or brand-based marketing gags like Gucci’s entirely virtual shoe offering. Amazon’s even tried out its own AR tool for virtually trying on shoes and eyeglasses.
Google’s new tech uses a different process that combines its Shopping Graph, the company’s worldwide database of shopping information, with a technique known as “diffusion.” In diffusion, Google explains, random visual noise is gradually added to an image and then removed, leaving behind an imprint or reconstruction of the original image. Applied in an AI model trained on a library of images of human models wearing garments in different poses (rather than on a collection of text data, like a large language model, or LLM), the technique lets shoppers generate a new, more accurate try-on image.
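The “add noise, then remove it” idea can be sketched in a few lines. Below is a minimal, hypothetical illustration of the *forward* half of diffusion only: Gaussian noise is mixed into an image step by step until almost nothing of the original remains. (The function name, step count, and noise schedule are illustrative, not Google’s actual implementation; the reverse, denoising half is what a trained model like Google’s learns.)

```python
import numpy as np

def forward_diffusion(image, num_steps=100, beta=0.02, rng=None):
    """Gradually add Gaussian noise to an image (the "forward" process).

    Illustrative sketch only: each step blends the image with a little
    noise, so after many steps the result is nearly pure noise.
    """
    rng = rng or np.random.default_rng(0)
    x = image.copy()
    trajectory = [x]  # snapshot at every step, starting with the original
    for _ in range(num_steps):
        noise = rng.standard_normal(x.shape)
        # Variance-preserving update used by standard diffusion models:
        # shrink the signal slightly and add a matching amount of noise.
        x = np.sqrt(1.0 - beta) * x + np.sqrt(beta) * noise
        trajectory.append(x)
    return trajectory

# A trained model learns to run this process in reverse: starting from
# noise, it removes a little at each step, reconstructing a plausible
# image. For try-on, that reconstruction is conditioned on a garment
# photo and a chosen model, yielding the model wearing that garment.
```

After 100 steps with this schedule, the original signal has faded to roughly a third of its strength and the rest is noise, which is why the reverse process needs a learned model rather than simple arithmetic to recover a coherent image.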
Google also says its virtual try-on goes beyond other AR options by offering a wider, though still limited, selection of diverse human models, making it easier for shoppers to pick a model who looks like them and make more informed shopping choices.
“Our new generative AI model can take just one clothing image and accurately reflect how it would drape, fold, cling, stretch and form wrinkles and shadows on a diverse set of real models in various poses. We selected people ranging in sizes XXS-4XL representing different skin tones (using the Monk Skin Tone Scale as a guide), body shapes, ethnicities and hair types,” the company wrote.
Unlike the AI efforts of brands like Levi’s, which opted to forgo human models in favor of AI-generated diversity, Google is attempting to combine real-life human variability with the complexity of AI. Other applications of AI and AR have proven limited by their training, with a lack of diversity in their creation that leads to real-life consequences for people of color who use them, making Google’s explicit effort to broaden the data behind its virtual try-on model notable.
But the question remains whether the new tool will have a lasting, broad impact on building AI with diversity in mind. At the very least, it takes a significant step beyond other offerings on the market, acknowledging shoppers with a range of body shapes and appearances (and their potential purchasing might) and suggesting that AI could be used to promote fashion inclusivity. Guess it’s time for a summer closet upgrade.
Chase joined Mashable’s Social Good team in 2020, covering online stories about digital activism, climate justice, accessibility, and media representation. Her work also touches on how these conversations manifest in politics, popular culture, and fandom. Sometimes she’s very funny.