UK to bring into force law to tackle Grok AI deepfakes this week

The UK will bring into force a law which will make it illegal to create non-consensual intimate images, following widespread concerns over Elon Musk’s Grok AI chatbot.

Technology Secretary Liz Kendall said the government would also seek to make it illegal for companies to supply the tools designed to create such images.

Speaking to the Commons, Kendall said AI-generated pictures of women and children in states of undress, created without a person’s consent, were not “harmless images” but “weapons of abuse”.

The BBC has approached X for comment. X previously said: “Anyone using or prompting Grok to make illegal content will suffer the same consequences as if they upload illegal content.”

It comes hours after Ofcom announced it was launching an investigation into X over “deeply concerning reports” about Grok altering images of people.

If X is found to have broken the law, Ofcom can issue it with a fine of up to 10% of its worldwide revenue or £18 million, whichever is greater.

And if X does not comply, Ofcom can seek a court order to force internet service providers to block access to the site in the UK altogether.

In a statement, Kendall urged the regulator not to take “months and months” to conclude its investigation, and demanded it set out a timeline “as soon as possible”.

It is already illegal to share deepfakes of adults in the UK, but the legislation in the Data (Use and Access) Act which makes it a criminal offence to create or request them had not been brought into force until now, despite the Act passing in June 2025.

In addition to the Data Act, Kendall said she would also make it a “priority offence” in the Online Safety Act.

“The content which has circulated on X is vile. It’s not just an affront to decent society, it is illegal,” she said.

“Let me be crystal clear – under the Online Safety Act, sharing intimate images of people without their consent, or threatening to share them, including pictures of people in their underwear, is a criminal offence for individuals and for platforms.

“This means individuals are committing a criminal offence if they create or seek to create such content including on X, and anyone who does this should expect to face the full extent of the law.”

Alongside the new rules, the technology secretary said the government would also build on measures outlined in the Crime and Policing Bill to criminalise nudification apps.

“This new criminal offence will make it illegal for companies to supply tools designed to create Non-Consensual Intimate Images, targeting the problem at its source,” she said.

“In addition to all of these actions, we expect technology companies to introduce the steps recommended by Ofcom’s guidance on how to make platforms safer for women and girls without delay.

“If they do not, I am prepared to go further.”

In response to an earlier post questioning why other AI platforms were not being looked at, Elon Musk said the UK government wanted “any excuse for censorship”.

But Kendall rejected this.

“This is not, as some would claim, about restricting freedom of speech,” she said.

“It is about tackling violence against women and girls.”
