
Teen Leads Fight Against Deepfake AI After Nude Photo Scandal

  • A teenager is suing the app ClothOff after fake nude images of her spread online, leaving her with emotional harm and disrupted schooling.
  • Her lawyers are targeting the app and its related websites to force them offline, but removal of the explicit images remains uncertain for victims.
  • Fake nudes are now criminalized in 45 states, and a new federal law requires platforms to remove harmful content quickly.

Her world was shattered the moment she discovered altered nude images of herself spreading online, created without her knowledge or consent.

Now, as a teenager, she’s become the face of a legal battle against ClothOff, an app accused of fueling a surge in deepfake explicit images targeting children and young women.

The emotional fallout has been severe. Her court complaint says she was “mortified and emotionally distraught, and she has experienced lasting consequences ever since.”

She sued the boy she believes is responsible, adding that the incident nearly made her quit school altogether. Yet, according to her complaint, attempts by law enforcement to investigate hit a wall. Neither the accused nor possible witnesses were willing to cooperate, and police never got access to the devices involved.

Tough fight against deepfake apps

Her lawyers are targeting the app directly as well as its related websites. They hope the US courts can force them offline, especially if the developers fail to engage and a default judgment goes against them.

Even so, the girl cannot shake the anxiety that her fake images may never fully disappear. She worries, according to her legal filing, that monitoring for these images will become a lifelong burden.

“Knowing that the CSAM images of her will almost inevitably make their way onto the Internet and be retransmitted to others, such as pedophiles and traffickers, has produced a sense of hopelessness,” the complaint spells out.

Telegram, one of the platforms criticized for allowing the spread of such content, has responded through a spokesperson. “Nonconsensual pornography and the tools to create it are explicitly forbidden by Telegram’s terms of service and are removed whenever discovered,” the representative stated.

Across the country, momentum is building to clamp down on this growing threat. Laws criminalizing fake nudes are now on the books in 45 states. Earlier this year, President Donald Trump signed the Take It Down Act into law, requiring websites to remove both real and AI-generated intimate images within 48 hours of a victim’s report.

The teen’s lawsuit is just the latest push in a broader campaign, reinforced by the new federal law targeting deepfake and revenge porn online, to hold tech companies and developers accountable for the spread of dangerous, life-altering content.

No one can say for certain whether her battle will stop the circulation of her images. She is left to grapple with the possibility of strangers, friends, or even future employers stumbling upon them at any time.
