Become a partner in our mission of building a world where every child is protected and can thrive.
MAKE A DONATION
Get In Touch

Our work

Why we need the SaferAI for Children Coalition... 

The risk

AI is being misused to harm children as AI-generated abuse material and new methods of grooming become increasingly prevalent.

The opportunity

AI can also be a powerful tool, helping detect exploitation, assist law enforcement, and educate families.

Our mission

The SaferAI for Children Coalition works at this crossroads, aiming to reduce risks while unlocking AI’s potential to protect children.

Risks

How is AI putting children at risk?

As artificial intelligence technologies become more advanced and accessible, they are increasingly being misused to exploit children in new and deeply concerning ways. These are not hypothetical threats; they are real and happening now. The SaferAI for Children Coalition works to identify and address the following critical risks:

AI-generated child sexual abuse material (CSAM)

AI is being used to generate realistic synthetic images and videos of child sexual abuse — some indistinguishable from real material. Offenders also use AI to alter real abuse content to appear artificial, frustrating detection and legal response. These materials are illegal and perpetuate harm by fuelling demand and normalising exploitation.

Grooming and sextortion at scale

Large language models can be exploited by offenders to mimic the tone, language, and identity of peers or trusted adults. This enables more targeted, convincing, and scalable grooming attempts — often leading to sexual extortion (“sextortion”) of children and young people.

Nudify apps and deepfakes

AI tools that strip clothing from images or create sexualised deepfakes are increasingly being used to exploit children, including by their peers. This image-based abuse often occurs without the victim’s knowledge and contributes to a toxic and unsafe environment for children, both online and offline.

AI companions and chatbots

AI companion apps, designed to offer friendship, advice, or emotional support, are often unregulated and unmoderated. When used by children, they may facilitate unsafe conversations, reinforce harmful ideas, or become tools of manipulation. Some apps include sexualised interactions or lack age verification entirely.

Opportunities

Using AI to protect children

AI isn’t only a threat — it can also be a powerful force for good. With the right investment and oversight, these technologies can revolutionise child protection and support frontline responders. The Coalition is focused on unlocking these opportunities:

AI-driven detection tools

AI can be trained to detect known and new CSAM — even within encrypted environments — helping investigators act faster while reducing exposure to traumatic content.

Supporting law enforcement

Machine learning can help identify patterns in grooming tactics, prioritise cases based on urgency, and analyse large volumes of evidence — freeing up officers to focus on child rescue.

Empowering education and digital literacy

AI can be used to personalise online safety education — helping parents, carers, and schools equip children with the tools they need to recognise and respond to risk.

Global coordination and safety standards

With Australia’s history of leadership in child protection, we have the opportunity to shape global conversations around AI ethics, policy, and regulation — putting child safety at the centre of the AI future.

The SaferAI for Children Coalition's work

The Coalition drives action through the following: 

  • Publishing expert-led policy briefings and insights papers to guide decision-makers 
  • Hosting roundtables and forums with cross-sector leaders 
  • Facilitating youth engagement initiatives to ensure the voices of young people are heard 
  • Providing technical and strategic advice to inform ethical AI policy 
  • Promoting collaboration between government, civil society, and industry 

2024 Discussion paper: A collaborative approach to a safer future. 

Protecting children in the digital age has never been more urgent. While rapid advances in generative AI promise significant societal benefits, they also bring new risks. Developed with the SaferAI for Children Coalition, a partnership of child protection organisations, academic experts, law enforcement and public sector partners, this paper examines both the opportunities and challenges AI presents for child safety in Australia.

It explores how AI-enabled tools can be misused for child sexual exploitation, while also showing how AI can detect harmful content, support investigations and protect vulnerable children. This discussion paper is both a call to action and a guide, advocating for the responsible use of AI to ensure technology safeguards rather than endangers children.

READ OUR DISCUSSION PAPER

2025 A national call to action: Prioritising child safety in the age of AI. 

The SaferAI for Children Coalition – a national alliance of child protection organisations, academic experts and law enforcement agencies led by ICMEC Australia – is urging the Australian Government to act on the growing risks of AI-facilitated child sexual exploitation and abuse.

With more than 1 in 4 Australians having experienced sexual abuse in childhood (Australian Child Maltreatment Study), the prevalence of this crime is undeniable and has reached epidemic proportions.

The letter calls for urgent action:

  • Making child protection in the age of AI a national priority
  • Working with the SaferAI for Children Coalition on tech-informed solutions
  • Investing in AI tools for prevention and detection
  • Boosting education and legal reform
  • Leading global efforts to set child safety standards in AI

READ THE COMPLETE STATEMENT

SaferAI Education Working Group 

The SaferAI for Children Coalition is committed to closing gaps and proactively protecting children from AI-related risks. One way we do this is through focused working groups that respond to urgent and emerging challenges.

The SaferAI Education Working Group recognises that AI is rapidly transforming Australia’s education landscape. As the education sector adapts to new technologies, this group ensures that child safety remains at the forefront of education policy and practice.

Contact us

General enquiries

If you are interested in hearing more about our work, please contact us at saferai@icmec.org.au 

Media enquiries

For interviews, expert comment, or press materials, please contact Elisabeth Drysdale at edrysdale@icmec.org.au

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.

Copyright © 2024. All Rights Reserved. | Logo by Blisttech. Design by Insil.