
Insights Paper: Understanding nudify apps

August 20, 2025

Launching our new series of Insights Papers

AI-driven “nudify apps” can take an innocent photo and turn it into a sexualised image within seconds. Once a niche tool, they are now mainstream, monetised and industrialised, fuelling sexual extortion, peer-on-peer exploitation and the large-scale creation of AI-generated child sexual abuse material (CSAM).

The first paper in ICMEC Australia’s new Insights Series examines how nudify apps exploit open-source AI, the severe harms they cause for children, and the urgent need for legal, community and cross-sector action.

The claim that “no real child is harmed” is false. Real children’s images are being scraped to train these tools, meaning exploitation begins the moment those images are reused.

Written by: Cherise Holley, Mikaela Jago, and Dr Janis Dalins

About ICMEC Australia's Insights Papers

ICMEC Australia’s Insights Papers provide clear, accessible analysis of emerging risks at the intersection of child protection and technology. Produced with input from experts, the series offers timely insights for government, industry, and the community to inform action as new threats arise.


