Critiqs

Federal law targets deepfake and revenge porn online

  • Federal law now punishes sharing nonconsensual explicit images, including those made with AI.
  • Online platforms must erase flagged images within 48 hours and track down duplicates.
  • Support is strong, but critics warn the law could accidentally limit free expression.

A landmark piece of legislation was signed into law by President Donald Trump on Monday, ushering in a new era of federal penalties for the distribution of nonconsensual explicit images, including those created with artificial intelligence. The Take It Down Act aims to curb the spread of both authentic and synthetic explicit media posted online without the consent of individuals depicted.

Anyone found guilty of publishing such explicit photos or videos faces a range of criminal sanctions, from monetary fines to imprisonment, as well as an obligation to pay restitution to victims. The sweeping law requires online platforms and social networks to erase flagged content within 48 hours of notification from those affected.

New Obligations for Tech Companies

Alongside the rapid removal mandate, companies must also work to scrub duplicate copies of the offending materials, expanding their responsibilities for moderating harmful content. While many states already have laws prohibiting explicit deepfakes and so-called revenge porn, this is the first nationwide regulation targeting internet-based companies on the issue.

During a White House signing event, President Trump underscored the historic nature of the act, describing it as a vital tool to prevent digital sexual exploitation. He declared that online abuse in the form of unauthorized explicit imagery would no longer be tolerated at the federal level.

The initiative saw strong support from First Lady Melania Trump and gained bipartisan momentum in Congress, with Senators Ted Cruz and Amy Klobuchar championing the proposal. Senator Cruz cited a troubling incident, in which a young girl’s deepfake image remained online for months before action was taken, as a major motivator for co-sponsoring the bill.

Some civil liberties groups and digital rights advocates have expressed concern that the law could sweep too broadly, potentially leading to the suppression of lawful images or silencing critics of government actions. Despite these criticisms, supporters argue that the need to protect individuals from online sexual exploitation outweighs the risks, making the law a significant step forward in regulating the digital landscape.
