
We are pleased to share the Impact Report FY2024–25.
A year of collaboration, innovation and measurable change. Together, we’re working towards creating a world where technology cannot be used to harm children.
“At the heart of our mission is a simple focus: to strengthen the professionals who detect, disrupt and prevent harm.”
— Colm Gannon, CEO, ICMEC Australia
We’re proud to share the ICMEC Australia Impact Report FY2024–25, highlighting another year of progress in strengthening our response to child sexual exploitation and abuse (CSEA).
ICMEC Australia is an independent not-for-profit organisation working to create a world where technology cannot be used to harm children, starting here in Australia.
We’re seeking Non-Executive Directors to join our Board and help guide the organisation’s strategy, risk, and performance in line with our mission: to strengthen the professionals working to detect, disrupt and prevent child sexual exploitation and abuse (CSEA).
As a Board Director, you will:
We’re looking for leaders who bring integrity, ethical governance, and a deep commitment to child safety and social impact.
To express your interest, please send a cover letter and CV to cosec@icmec.org.au.
On Tuesday, 2 September, leaders from across the political spectrum came together at Parliament House to drive urgent action and ensure child protection remains at the forefront of Australia’s approach to AI.
Convened by ICMEC Australia, the National Leaders' Conversation on AI and Child Safety brought together parliamentarians, senior law enforcement leaders, and child safety advocates to focus on three urgent priorities: embedding baseline AI training for police across the country, scoping a law-enforcement-only facial recognition tool to assist investigations, and elevating national awareness of AI-enabled harms, with a focus on prevention.
Following this roundtable, ICMEC Australia commends Kate Chaney MP for raising the issue directly in the House and drawing it to the attention of the Attorney-General. Her leadership highlights the momentum building across Parliament to address the growing risks of AI-enabled child sexual exploitation.
Listen to Kate Chaney speak in Parliament House below. Kate's speech begins at 6:02:00.
Launching our new series of Insights Papers
AI-driven “nudify apps” can take an innocent photo and turn it into a sexualised image within seconds. Once a niche tool, they are now mainstream, monetised and industrialised, fuelling sexual extortion, peer-on-peer exploitation and the large-scale creation of AI-generated child sexual abuse material (CSAM).
The first paper in ICMEC Australia’s new Insights Series examines how nudify apps exploit open-source AI, the severe harms they cause for children, and the urgent need for legal, community and cross-sector action.
The claim that “no real child is harmed” is false. Real children’s images are being scraped to train these tools, meaning exploitation begins the moment those images are reused.
Written by: Cherise Holley, Mikaela Jago, and Dr Janis Dalins
ICMEC Australia’s Insights Papers provide clear, accessible analysis of emerging risks at the intersection of child protection and technology. Produced with input from experts, the series offers timely insights for government, industry, and the community to inform action as new threats arise.
From concern to action.
How can Australia lead the world in protecting children in the age of AI?
ICMEC Australia's CEO Colm Gannon reflects on our recent national roundtable at Parliament House and highlights the urgent need for legal and technological safeguards as AI accelerates child sexual exploitation and abuse.
Read Colm’s opinion piece on why the time to act is now.
Stay informed: subscribe to our newsletter below!
On Thursday 17 July, Australia’s first national roundtable on child safety in the age of AI took place at Parliament House. The event placed this urgent issue firmly on the national agenda.
Hosted by ICMEC Australia, the roundtable brought together senior leaders from government, law enforcement, technology, and child protection sectors. Participants addressed the alarming growth in AI-facilitated child sexual abuse, including deepfakes, synthetic child sexual abuse material (CSAM), automated grooming, and child-like AI personas.
The forum attracted widespread media coverage across television, radio, online, and print platforms, helping raise national awareness and public momentum for action.
“AI is being weaponised to harm children, and Australia must act,” said Colm Gannon, CEO of ICMEC Australia. “This roundtable represented a pivotal moment for child protection. We need urgent action to ensure these technologies do not outpace our systems of prevention and justice.”
New figures from the National Center for Missing and Exploited Children (NCMEC) revealed a 1,325 percent increase in reports of CSAM involving generative AI, rising from 4,700 in 2023 to more than 67,000 in 2024. In Australia, these threats compound a national crisis, with more than one in four Australians reporting they experienced sexual abuse in childhood (ACMS, 2023).
Key outcomes from the roundtable:
Participants included:
The national roundtable builds on the SaferAI for Children Coalition's 2024 discussion paper and 2025 call to action, both led by ICMEC Australia. It is a key opportunity for Australia to lead globally on child protection and AI governance, whilst encouraging industry innovation through ‘Safety by Design’.
“If we act now, Australia can set a global benchmark for ethical AI and child protection,” said Gannon.
“Children’s safety must be at the centre of how we think about and regulate AI in Australia,” said Sonya Ryan, CEO of The Carly Ryan Foundation. “Technology is advancing rapidly. Without decisive, child-centred action, we risk failing to protect children from new and emerging threats.”
“We are seeing AI generate entirely new types of child abuse material. This is a turning point,” said Jon Rouse APM.
ICMEC Australia extends sincere thanks to all partners who joined the roundtable. Together, we can ensure that technology protects children rather than exploits them.
– Ends –
Colm Gannon, our CEO, has been fielding many questions about his thoughts on age assurance lately, and for good reason. As calls for stricter age verification grow louder, Australia finds itself at the centre of a global conversation about protecting children online.
In his latest op-ed, Colm explains why age verification, while well-intentioned, may not be the silver bullet many hope it will be.
"We can't view children as passive users. They are digital citizens with valuable insights into their own online experiences," he writes.
Colm's key insights:
With everyone watching Australia's next moves, we have an opportunity to establish best practice by developing solutions that are both effective and respectful of children's rights.
"Robust doesn't have to mean invasive," Colm notes. "Let's include them in the conversation."
The SaferAI for Children Coalition, a national alliance of child protection organisations, academic experts, and law enforcement agencies led by ICMEC Australia, is calling on the Australian Government to act urgently on the growing risks of AI-facilitated child sexual exploitation. The SaferAI Coalition urges that child protection remain a central policy priority in Australia’s AI strategy and calls for expert consultation and immediate action to ensure AI is a tool for safety, not exploitation.
With more than one in four Australians having experienced sexual abuse in childhood (Australian Child Maltreatment Study), the prevalence of this crime is undeniable and at epidemic proportions. AI is making it easier for offenders to exploit children and harder for law enforcement to intervene.
“AI is being weaponised to harm children, and Australia must act swiftly to prevent these technologies from outpacing our systems of protection,” said Colm Gannon, CEO of ICMEC Australia. “We’re proud to lead this coalition and work across sectors to protect children in a rapidly evolving digital environment.”
As AI-facilitated child exploitation escalates, from the creation of AI-generated abuse material to automated grooming, Australia has a critical opportunity to lead the global response to these complex crimes. By building on our strong foundations in child protection and online safety, we can shape a new international standard for how technology is harnessed to keep children safe — one grounded in innovation, responsibility, and ethical design.
The National Center for Missing and Exploited Children (NCMEC) reported a 1,325% increase in reports involving GenAI, from 4,700 in 2023 to 67,000 in 2024. The scale of this issue is rapidly growing and, without intervention, will become even harder to control.
The letter calls for urgent action: making child protection in the AI era a national priority; working with the SaferAI for Children Coalition on tech-informed solutions; investing in AI tools for prevention and detection; boosting education and legal reform; and leading global efforts to set child safety standards in AI.
This call to action follows the SaferAI Coalition’s widely welcomed 2024 discussion paper, which laid the groundwork for these recommendations.
Australia must act decisively to prevent AI from becoming another tool for harm. “If we fail to act now, this problem will only escalate. But if we lead, Australia can set a global benchmark for ethical AI and child protection,” said Colm Gannon, CEO of ICMEC Australia.
-ends-
For more information, please contact:
Elisabeth Drysdale edrysdale@icmec.org.au Ph: 0414 390 740
At ICMEC Australia we are working towards a world where online technology cannot be used to harm children.
Detecting financial crime is a core responsibility of the financial services and payments industry and protecting children from exploitation must be at the forefront of these efforts.
Child protection is not just a social responsibility — it’s an environmental, social and governance (ESG) priority. As technology evolves, so too do the tactics of those who seek to harm the most vulnerable.
The financial services and payments industry must unite to prevent, detect, and disrupt child sexual exploitation and abuse (CSEA). And the broader corporate sector has a responsibility to prevent its systems from being used to facilitate this crime.
Collaboration is key. By sharing learnings, strengthening systems, and prioritising child protection in business practices, we can drive real change — and protect future generations.
ICMEC Australia is committed to supporting the corporate industry whose systems and services are impacted by child sexual exploitation and abuse.
Watch the video below to learn more.
If your organisation is ready to be part of the solution, we’d love to work with you.
Contact our ICMEC team today at: info@icmec.org.au
With special thanks to: Luke from Modified Photography, Rosie Campo, Colm Gannon, Lynda McMillan, Bridget Scougall, and Peter Cowan.
February 11 is Safer Internet Day.
In Australia, the eSafety Commissioner leads this important day, encouraging all of us to help make the internet a safer and more positive place.
At ICMEC Australia, we prioritise creating a safer digital environment by addressing the harmful impacts of online child sexual exploitation (CSE). Our dedicated focus is on fostering an online space that promotes positive outcomes, supports the development of a civil society, and places child protection at its core.
We all have a role to play in fostering a safer, more inclusive internet.
Here are five ways you can contribute:
1. Have conversations about online safety
2. Learn about online risks
3. Report harmful content
4. Practise respect and kindness online
5. Share online safety resources
You can view more resources here and learn more about how ICMEC Australia works towards a world where online technology can’t be used to harm children here.
Let's prioritise online safety every day and work together to build a safer digital future.
Hear from our CEO, Colm Gannon

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.