

On Thursday 17 July, Australia’s first national roundtable on child safety in the age of AI took place at Parliament House. The event placed this urgent issue firmly on the national agenda.
Hosted by ICMEC Australia, the roundtable brought together senior leaders from government, law enforcement, technology, and child protection sectors. Participants addressed the alarming growth in AI-facilitated child sexual abuse, including deepfakes, synthetic child sexual abuse material (CSAM), automated grooming, and child-like AI personas.
The forum attracted widespread media coverage across television, radio, online, and print platforms, helping to raise national awareness and build public momentum for action.
“AI is being weaponised to harm children, and Australia must act,” said Colm Gannon, CEO of ICMEC Australia. “This roundtable represented a pivotal moment for child protection. We need urgent action to ensure these technologies do not outpace our systems of prevention and justice.”
New figures from the National Center for Missing and Exploited Children (NCMEC) revealed a 1,325 percent increase in reports of CSAM involving generative AI, rising from 4,700 in 2023 to more than 67,000 in 2024. In Australia, these threats compound a national crisis, with more than one in four Australians reporting they experienced sexual abuse in childhood (ACMS, 2023).
Key outcomes from the roundtable:
Participants included:
The national roundtable builds on the 2024 discussion paper and 2025 call to action from ICMEC Australia's SaferAI for Children's Coalition. It represents a key opportunity for Australia to lead globally on child protection and AI governance, whilst encouraging industry innovation through ‘Safety by Design’.
“If we act now, Australia can set a global benchmark for ethical AI and child protection,” said Gannon.
“Children’s safety must be at the centre of how we think about and regulate AI in Australia,” said Sonya Ryan, CEO of The Carly Ryan Foundation. “Technology is advancing rapidly. Without decisive, child-centred action, we risk failing to protect children from new and emerging threats.”
“We are seeing AI generate entirely new types of child abuse material. This is a turning point,” said Jon Rouse APM.
ICMEC Australia extends sincere thanks to all partners who joined the roundtable. Together, we can ensure that technology protects children rather than exploits them.
– Ends –

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.