Google’s newly unveiled virtual dressing tool, Try It On, is drawing concern for the way it alters user images when simulating how outfits might look on them. The feature, shown off at a recent developer conference, lets shoppers upload photos of themselves to preview clothing before they buy.
When images of well-known men such as JD Vance or historical figures like Abraham Lincoln were tested with women’s clothing, the AI not only placed the garments on them but also generated breasts that were not present in the original photos. Female figures received similarly problematic edits, such as artificially enhanced chests or revealing undergarments, producing images that clearly departed from reality.
The issue appears even when the subjects are famous statues or paintings. Michelangelo’s David and Portrait of Madame X were both transformed by the algorithm, which added physical features matching the body of the model who originally wore the clothing rather than the photo’s subject.
Risks for Privacy and Safety
These transformations appear to stem from Try It On’s reliance on Google’s Shopping Graph, a massive product database filled with idealized images of clothing on models. Instead of simply adjusting the garments, the tool sometimes morphs users’ bodies to match those of the product’s model.
The ease with which these AI-generated images can be created raises privacy concerns and the risk of unwanted eroticization. Alarmingly, the technology also proved capable of generating adult imagery when images of minors were used. Tests with photos of teenagers, both male and female, produced depictions with pronounced physical features and revealing outfits, despite the original photos showing fully clothed children.
Google’s efforts to block explicit results or known public figures were only partially successful. While high-profile politicians like Donald Trump and Kamala Harris were restricted, many other uploads went through, bypassing company guidelines meant to prevent dangerous misuse.
Google states that it enforces policies to prohibit adult and sexual images while also requiring that uploads not feature explicit content or minors. A company spokesperson stressed the existence of safeguards and ongoing improvements, but trial results showed significant loopholes. Even standard try-ons produced suggestive alterations, such as the addition of revealing bulges for men in gym shorts.
These errors reflect broader problems with generative AI in visual products, many of which have previously produced notable mistakes or inappropriate outputs. Technology for undressing and eroticizing images has spread rapidly worldwide, with millions of people visiting similar online services and social media bots generating inappropriate images at scale.
Google restricts Try It On to adults in the United States through its experimental Search Labs feature. Yet the tool’s tendency to significantly alter body shapes undermines its primary function for shoppers, who are left with images that do not accurately reflect how clothes would look on their own bodies. This misrepresentation not only threatens user trust but could also harm users’ mental health and confidence in Google’s AI image tools.