

Paris Hilton, AOC push DEFIANCE Act against AI deepfakes

As Grok “undressing” spreads, Hilton and Ocasio-Cortez urge Congress to allow survivors to sue

MMS Staff

28 Jan 2026

4-min read

Grok AI is being used to digitally undress women and children. Despite public pledges to suspend the feature, investigators and journalists have confirmed that it continues to power the creation of non-consensual sexualised images of real people, most of them women and girls.

In some cases, images appear to involve minors.


What’s actually happening on Grok


On Grok, users have been uploading photos of real women and children, some as young as 10, and prompting the AI to “remove their clothes.” The results are hypersexualised deepfakes: subjects edited into bikinis, lingerie, and porn-like poses, sometimes with fake filters meant to simulate bodily fluids.


Between December 25, 2025, and January 1, 2026, AI Forensics tracked over 20,000 images generated with this tool. More than half showed women in “minimal clothing.” At least 2% appeared to include minors, some reportedly as young as toddlers. They were real individuals, digitally altered, and their images circulated without permission.


The scale is new


Non-consensual deepfake pornography has always disproportionately targeted women. Studies consistently show that over 96% of non-consensual AI-generated sexual content features female victims. What’s changed is speed and scale.


AI has made this kind of abuse faster, cheaper, and harder to trace. What once required technical expertise can now be done with a single prompt and a photo pulled from social media. The violence may be digital, but the harm is intimate and enduring, especially for women, girls, and children.


Victims have spoken about discovering deepfake images of themselves years after the original photos were taken. Some images are edited to make women appear younger. One survivor found sexualised images of her 14-year-old self circulating online.


These images are used for ridicule, coercion, revenge, and blackmail. They are part of a long, familiar pattern of hidden cameras, revenge porn, and leaked videos, now re-engineered through AI.


Governments are scrambling to respond. The UK’s communications regulator, Ofcom, has contacted the platform. The European Union has said it is investigating “very seriously.” France and India are tracking complaints and potential violations.


In countries like the UK, laws already exist that criminalise non-consensual deepfakes, especially those involving children. Feminist campaigners fought hard to push those laws through. Yet survivors remain unprotected in practice, stuck in systems that prioritise corporate damage control over human safety.


Grok’s response so far has been telling. The tool was moved behind a paywall, as if charging for access counts as a safety measure. A public statement about “urgent fixes” was reportedly generated by AI itself.


Naming the harm matters


This is tech-enabled sexual violence. It’s what happens when patriarchy gets re-coded into algorithms, fed by engagement metrics, and shielded by corporate language about innovation. When consent is optional in digital spaces, women’s bodies become endlessly reusable, scraped, altered, monetised.


From “scandal” to accountability: Why lawmakers are stepping in


On January 22, 2026, Paris Hilton appeared on Capitol Hill alongside US Representative Alexandria Ocasio-Cortez, united by a demand that feels basic but is radical in practice: make AI deepfake abuse legally actionable.


They were there to push the DEFIANCE Act, legislation that would give survivors the civil right to sue people who knowingly create and distribute non-consensual AI-generated intimate images. That one word, sue, changes everything.


Unlike takedown-focused laws such as the TAKE IT DOWN Act, the DEFIANCE Act is about consequences. It moves survivors out of a “report and pray” system and into one where harm carries real legal and financial cost.


The bill has rare bipartisan backing, with Representative Laurel Lee as the Republican co-lead and Senators Dick Durbin and Lindsey Graham supporting the push. The decision to bring it to a vote rests with Mike Johnson, Speaker of the United States House of Representatives.


At its core, the bill asks a simple question: will consent online be enforceable, or will it remain a suggestion?


Paris Hilton on rewriting the narrative


Hilton’s presence matters because of what she represents. When an intimate video of her was distributed without her consent when she was 19, it was framed as a “scandal.” She has spent years insisting on a different word: abuse.


“Scandal” blames the woman for being seen. “Abuse” names the violation, and the systems that enabled it. It reframes shame as something that belongs to perpetrators, platforms, and cultures that profit from humiliation.


Why deepfake abuse works


Deepfake pornography thrives because it weaponises social punishment. The image may be digital, but the consequences are brutally physical: fear, isolation, lost jobs, damaged reputations, teenagers switching schools, and women shrinking their lives to avoid being “made an example of.”


This happens in a world where violence against women is already normalised. According to the World Health Organization, nearly 1 in 3 women globally experience physical and/or sexual violence in their lifetime.


So when people dismiss deepfakes as “just online,” they ignore the truth: digital abuse plugs directly into offline inequality and amplifies it.


Why this moment matters


The DEFIANCE Act matters because it treats non-consensual deepfakes for what they are: image-based sexual abuse. And it gives survivors leverage. If the cost of violating women remains low, this abuse will keep scaling. If the cost rises, legally, financially, and socially, the business model breaks. That’s what accountability looks like.


We were told AI would make life easier. Instead, we’re watching it replicate the oldest forms of harm at unprecedented speed.


What’s at stake here is dignity, and whether lawmakers, platforms, and the public are willing to say, clearly, that sexual violence doesn’t become acceptable just because it’s automated.


And if this story makes you uncomfortable, that’s the point. Silence is what lets systems like this grow, and real, enforceable accountability is the only thing that stops them.
