
Colm Gannon, our CEO, has been fielding many questions about his thoughts on age assurance lately, and for good reason. As calls for stricter age verification grow louder, Australia finds itself at the centre of a global conversation about protecting children online.
In his latest op-ed, Colm explains why age verification, while well-intentioned, may not be the silver bullet many hope it will be.
"We can't view children as passive users. They are digital citizens with valuable insights into their own online experiences," he writes.
Among Colm's key insights: with everyone watching Australia's next moves, we have an opportunity to establish best practice by developing solutions that are both effective and respectful of children's rights.
"Robust doesn't have to mean invasive," Colm notes. "Let's include them in the conversation."
The SaferAI for Children Coalition, a national alliance of child protection organisations, academic experts, and law enforcement agencies led by ICMEC Australia, is calling on the Australian Government to act urgently on the growing risks of AI-facilitated child sexual exploitation. The Coalition urges that child protection remain a central policy priority in Australia's AI strategy, and calls for expert consultation and immediate action to ensure AI is a tool for safety, not exploitation.
With more than 1 in 4 Australians having experienced sexual abuse in childhood, according to the Australian Child Maltreatment Study, the prevalence of this crime is undeniable and at epidemic proportions. AI is making it easier for offenders to exploit children and harder for law enforcement to intervene.
“AI is being weaponised to harm children, and Australia must act swiftly to prevent these technologies from outpacing our systems of protection,” said Colm Gannon, CEO of ICMEC Australia. “We’re proud to lead this coalition and work across sectors to protect children in a rapidly evolving digital environment.”
As AI-facilitated child exploitation escalates, from the creation of AI-generated abuse material to automated grooming, Australia has a critical opportunity to lead the global response to these complex crimes. By building on our strong foundations in child protection and online safety, we can shape a new international standard for how technology is harnessed to keep children safe — one grounded in innovation, responsibility, and ethical design.
The National Center for Missing &amp; Exploited Children (NCMEC) reported a 1,325% increase in reports involving GenAI, from 4,700 in 2023 to 67,000 in 2024. The scale of this issue is rapidly growing and, without intervention, will become even harder to control.
The letter calls for urgent action: making child protection in the AI era a national priority; working with the SaferAI for Children Coalition on tech-informed solutions; investing in AI tools for prevention and detection; boosting education and legal reform; and leading global efforts to set child safety standards in AI.
This call to action follows the SaferAI Coalition’s widely welcomed 2024 discussion paper, which laid the groundwork for these recommendations.
Australia must act decisively to prevent AI from becoming another tool for harm. “If we fail to act now, this problem will only escalate. But if we lead, Australia can set a global benchmark for ethical AI and child protection,” said Colm Gannon, CEO, ICMEC Australia.
-ends-
For more information, please contact:
Elisabeth Drysdale edrysdale@icmec.org.au Ph: 0414 390 740
At ICMEC Australia we are working towards a world where online technology cannot be used to harm children.
Detecting financial crime is a core responsibility of the financial services and payments industry and protecting children from exploitation must be at the forefront of these efforts.
Child protection is not just a social responsibility — it’s an environmental, social and governance (ESG) priority. As technology evolves, so too do the tactics of those who seek to harm the most vulnerable.
The financial services and payments industry must unite to prevent, detect, and disrupt child sexual exploitation and abuse (CSEA). The broader corporate sector also has a responsibility to prevent its systems from being used to facilitate this crime.
Collaboration is key. By sharing learnings, strengthening systems, and prioritising child protection in business practices, we can drive real change — and protect future generations.
ICMEC Australia is committed to supporting the corporate industry whose systems and services are impacted by child sexual exploitation and abuse.
If your organisation is ready to be part of the solution, we’d love to work with you.
Contact our ICMEC team today at: info@icmec.org.au
With special thanks to: Luke from Modified Photography, Rosie Campo, Colm Gannon, Lynda McMillan, Bridget Scougall, and Peter Cowan.

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.