

Launching our new series of Insights Papers
AI-driven “nudify apps” can take an innocent photo and turn it into a sexualised image within seconds. Once a niche tool, they are now mainstream, monetised and industrialised, fuelling sexual extortion, peer-on-peer exploitation and the large-scale creation of AI-generated child sexual abuse material (CSAM).
The first paper in ICMEC Australia's new Insights Series examines how nudify apps exploit open-source AI, the severe harms they cause to children, and the urgent need for legal, community and cross-sector action.
The claim that “no real child is harmed” is false. Real children’s images are being scraped to train these tools, meaning the exploitation begins the moment those images are reused.
Written by: Cherise Holley, Mikaela Jago, and Dr Janis Dalins
ICMEC Australia’s Insights Papers provide clear, accessible analysis of emerging risks at the intersection of child protection and technology. Produced with input from experts, the series offers timely insights for government, industry, and the community to inform action as new threats arise.

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.