
Parents and Children Unite in Call to Ban AI Nudifying Apps
Nude deepfakes have increased by 400% in the past year, with AI-powered nudifying tools becoming easily accessible online. These tools create non-consensual sexual imagery from ordinary photos and predominantly target women and girls.
Studies show that approximately 500,000 children (13% of teenagers) have encountered nude deepfakes online, whether through websites, peer sharing, or personal use of nudifying apps. The impact can be severe, with victims experiencing PTSD, depression, anxiety, and suicidal thoughts.

Key findings:
- 99% of nude deepfakes feature women and girls
- Nudifying tools typically do not work on images of men and boys
- 84% of teenagers and 80% of parents support banning these tools
- 55% of teenagers believe nude deepfakes can be more damaging than real explicit images
While creating, possessing, or sharing nude deepfakes of children is illegal and classified as child sexual abuse material (CSAM), the tools used to create them remain legal. MP Jess Asato is advocating for a complete ban on nudifying tools in the UK to protect children from digital abuse.

Impact on victims includes:
- Loss of bodily autonomy
- Uncertainty about how widely the images have been shared and what the creator intends to do with them
- Fear of misrepresentation
- Concern about others' perceptions
- Risk of physical violence if personal data is shared alongside images

The government and industry must take decisive action to protect children from deepfake sexual abuse, as relying solely on schools and parents is insufficient. Banning nudifying tools would align with the government's goal to reduce violence against women and girls by 50% within the next decade.