
Earlier this month, Anthropic chief executive Dario Amodei stood before some of Australia's leading policymakers and said, plainly: “The fundamental challenge remains – we know much less than we would like to, but the technology is moving faster than we’d like it. So we have to act, but we're not sure how to act.”
Eight days later, OpenAI released a policy blueprint on protecting children in the age of generative AI – a detailed framework co-developed with the National Center for Missing and Exploited Children (NCMEC) and US state attorneys general, calling for updated laws, better reporting standards and safety-by-design controls built into AI platforms from the ground up.
Then this week, Microsoft CEO Satya Nadella arrived in Australia to announce the company's largest ever investment in Australia – A$25 billion by 2029 – and signed a memorandum of understanding with the Albanese Government.
In the space of a single month, three of the world’s most powerful AI players explicitly told governments and industry to act.
OpenAI’s blueprint is a worthwhile read. Its core argument – that protecting children requires a layered, prevention-first approach, not a single technical fix – is right, and echoes longstanding calls from the child safety sector. So does its insistence that better reporting isn’t just about volume, but about quality: structured, actionable information that allows investigators to triage cases faster and identify children who are at immediate risk of harm. That last point matters more than most people realise.
Much of Australia's public conversation about AI and child safety has centred on AI-generated child sexual abuse material (CSAM) – synthetic imagery that doesn't depict a real child. There is sometimes an implicit assumption that this is therefore a lesser harm; a content problem, largely detached from real-world abuse. The evidence is unambiguous that this is wrong.
AI-generated abuse material is being produced using existing images of real survivors, embedding their trauma into synthetic content and re-victimising them without their knowledge. Offenders are now using AI to create deepfakes of specific children from as few as 20 images. And increasingly, a disturbing legal tactic has emerged – what researchers call the ‘liar’s dividend’ – where offenders claim genuine evidence of contact abuse was AI-generated, exploiting public awareness of synthetic media to create plausible deniability.
The harm is not abstract. Every piece of AI-generated material represents a real victim who deserves identification and justice – but the volume is now outpacing the capacity of those tasked with responding. Specialist investigators are being overwhelmed. The question is no longer whether this is a crisis; it is whether our response is equal to it.
Australia is not ignorant of these complexities. ICMEC Australia hosted two National Roundtables on Child Safety in the Age of AI at Parliament House in July and September last year, driving a shift from concern to action. Independent Member for Curtin, Kate Chaney MP, in collaboration with ICMEC Australia, introduced a private member’s bill to criminalise AI tools built specifically to generate CSAM. The eSafety Commissioner has issued legal notices to AI companion chatbot providers. The Minister for Communications has announced an intention to ban ‘nudify’ apps. Across the research, advocacy and industry sectors, coalitions are forming – among them the SaferAI for Children Coalition, which unites more than 25 organisations around the shared goal of ensuring that children are kept safe in the rapidly developing digital space.
The commitment in this space is genuine and the expertise is deep – but meaningful steps do not amount to a coordinated strategy. Too many efforts run in parallel rather than compounding. Legislative proposals sit without action. Memoranda of understanding with individual AI companies, while welcome and significant, are not legally binding and reach only one player at a time in a rapidly expanding market. The OpenAI blueprint illustrates the gap: valuable, but one company, one document, one jurisdiction. A piece of the puzzle, not the whole picture.
Australia's conversation about AI and child safety has been almost entirely one-sided – focused on stopping AI being used to harm children. That is very necessary, but we’re missing a key part of the conversation. AI is also the most powerful tool we have to find and help children who are being harmed right now. Machine learning can flag harmful content faster than any human investigator. AI-assisted detection tools can triage which cases involve real victims in active danger, directing scarce investigative resources where they are most urgently needed.
Australia has the research capability, the cross-sector buy-in and the policy momentum to lead on this – not just to regulate AI as a threat, but to deploy it as a protector. What is needed now is a systematic national approach that connects the dots: legislation that keeps pace with the technology, platform obligations with real enforceability, and active investment in AI as a tool for early intervention and victim identification.
The window to act has not yet closed, but it will not stay open indefinitely.
About the author
Mikaela (a/g Manager, Government Affairs and Public Policy) works across government relations, public policy, and partnerships at ICMEC Australia, engaging with parliamentarians and agencies to turn emerging risks into practical policy outcomes. She leads the SaferAI for Children Coalition and contributes to cross-sector discussions on AI-enabled harm, governance, and child protection.
Where to get help and report harm in each Australian state.
Over the past year, ICMEC Australia has been meeting with parliamentarians and their staff from across the country – walking them through the realities of child sexual exploitation and abuse (CSEA) in Australia, what the data shows, and where the policy landscape is heading.
What those conversations made clear is that the landscape is genuinely hard to keep up with. The issue is evolving quickly, the terminology is specialised, and the reporting and support pathways differ from state to state. People want to help, but without a clear baseline understanding of the issue, it's hard to know where to start.
That’s what these trifolds are designed to address. State-based and written in plain language, they define the issue clearly – including terms like grooming and sexual extortion – and guide people to the reporting channels and support services most relevant to where they are.
The numbers behind these resources are sobering. The ACCCE received 82,764 reports in the 2024–25 financial year alone. One in four Australians has experienced sexual abuse as a child. Emerging technologies, including AI, are making it easier to create, manipulate and distribute abusive material and to target children in ways that weren’t possible even a few years ago.
When high-profile cases make national headlines – a childcare centre, a school, an online platform – constituents often come to their local MP’s office looking for answers. Staff are eager to help, but navigating an unfamiliar and fast-moving issue in real time is difficult. These trifolds give offices something concrete to have on hand, so that when those moments arise, the right information is already there.

Clear and accessible information is one of the most powerful tools we have. By building a better baseline understanding of this issue and making it easy to know where to report, we can help more children across Australia get help sooner.
How to access the trifolds
Digital versions of the trifolds are available to download from our website; you can find your state’s version below. If you work in a setting where these would be useful and would like to discuss these resources further, we'd love to hear from you.
Any questions? Reach out to us at info@icmec.org.au
ICMEC Australia responds to Louis Theroux's Inside the Manosphere and calls for child safety to be centred in technology, law enforcement, and government
Louis Theroux's Inside the Manosphere has millions of Australians asking hard questions about what young people encounter online. At ICMEC Australia, those questions are ones we live with every day, and the answers point to something bigger than any single platform or influencer.
When Inside the Manosphere landed on Netflix earlier this month, it did what good documentary filmmaking does. It took something specialists have been watching with alarm for years and made it visible to everyone. Parents, educators and policymakers who had never heard the term 'manosphere' are now searching for it and asking what it means for the children in their lives.
"Boys are being harmed. That is the starting point."
Colm Gannon, CEO, ICMEC Australia
Boys are being harmed
The manosphere is not simply a debate about gender or culture. At its core, it is a child protection issue. Young boys, many of them still in primary and secondary school, are being exposed to content that distorts their understanding of themselves, of women, and of what healthy relationships look like. This is not content they are seeking out deliberately. They arrive at it through entirely ordinary online behaviour: watching fitness videos, looking for confidence tips, searching for a sense of direction. And then, gradually, the content shifts.
The harm this causes is real and compounds over time. These are children at a formative stage of development, absorbing messages about masculinity, power and relationships that they are not equipped to critically evaluate. A Movember report found that more than two-thirds of young Australian men engage with masculinity influencers online. These are not radicalised outliers. They are ordinary boys, and they deserve better than what these spaces are offering them.
How content reaches children at this scale
Inside the Manosphere focuses, understandably, on the individuals whose views are visible enough to generate reaction. But the more important question is how those views travel so far and so fast, particularly to children who were never looking for them. Recommender systems are designed to maintain attention by serving up content that generates a strong response. They do not distinguish between what is healthy and what is harmful. A young person who pauses on certain material, even briefly, signals to the system: show me more. Step by step, the content escalates.
The reason is straightforward: these platforms are built to drive engagement. More views, more clicks, more follows, more time on screen.
That is the metric they are optimised for – not the wellbeing of the children using them. This is not a criticism of any particular company; it is, sadly, the nature of social media platforms. It reflects a broader challenge in how online systems have evolved, one that the technology sector, regulators and the child protection community are working to address together. Where child safety has not been centred in that process, the consequences for young people have been significant.
What the frontlines are telling us
The evidence linking specific online content to specific offline harm is still developing, and honesty requires us to say so. What I can say with confidence is what ICMEC Australia hears across every sector we work with. Financial institutions, law enforcement agencies, and government partners are consistently and increasingly concerned. The professionals on the frontlines, investigating exploitation, responding to disclosures, following financial trails, are not theorising about these dynamics. They are managing caseloads shaped by them. The Australian eSafety Commissioner has put it plainly: when harmful attitudes are normalised and reinforced over time, the risk of real-world harm is real.
The conversation is already happening
Across every sector we engage with, this moment has prompted real discussion. People want to understand these issues better – not just as a cultural talking point, but in terms of what it means for their organisations and their capacity to protect children. That means financial institutions building the capability to recognise indicators of exploitation. It means law enforcement with current training on technology-facilitated harm. It means technology partners, government and the child protection sector working from a genuinely shared understanding of how these harms operate.
Building that understanding across sectors, sectors that do not always speak the same language, around issues that change fast, is the work ICMEC Australia exists to do. Inside the Manosphere will not solve any of this. But it has done something that years of expert reports and policy papers have struggled to do: it has put the issue in front of millions of ordinary Australians and started a conversation that needed to happen at scale. That matters. Child safety belongs in every boardroom, every law enforcement briefing and every technology conversation. If Inside the Manosphere has reminded us of that, it has done something genuinely important. The door is open, and ICMEC Australia is ready to keep that conversation going.
About the author
Colm Gannon is the CEO of ICMEC Australia, a non-profit organisation working to prevent and respond to child sexual exploitation by strengthening safeguards across technology, industry, and policy systems. ICMEC Australia regularly engages with regulators, platforms, and technology leaders on safety-by-design, consent protections, and platform accountability.
ICMEC Australia has lodged its 2026–27 Federal Pre-Budget Submission, calling for targeted Commonwealth investment to strengthen Australia’s response to online child sexual exploitation and emerging AI-enabled harms.
Technology-facilitated abuse is increasing in scale and complexity across every jurisdiction. Frontline police and call-takers are regularly encountering technology-facilitated harms such as sexual extortion, grooming and AI-generated abuse material – often before specialist units are involved.
Our submission seeks $6.6 million over three years to deliver two practical national initiatives.
The first is a National Child Abuse Response Training Program for Frontline Police, equipping general duties officers and call-takers with the skills to recognise, triage and respond safely to online exploitation at first contact. The program is already being piloted and is designed for national rollout in partnership with law enforcement agencies.
The second strengthens the SaferAI for Children Coalition, ensuring child protection expertise directly informs Australia’s AI safety standards, policy and risk frameworks.
Together, these initiatives will lift national frontline capability and embed a child-centred and prevention-first lens in AI governance – helping ensure every child receives safer, more consistent support in an increasingly complex digital environment.
A statement from Dannielle Kelly, Head of Government Affairs and Law Enforcement Outreach.
The national focus on online harm affecting children did not arrive by chance. It exists because victim survivors, frontline practitioners, researchers, and advocates have spent years forcing uncomfortable truths into public conversation. Their persistence has helped move online child harm from a specialist concern into the broader child protection conversation, prompting greater focus from governments, regulators and industry.
As we move through 2026, the challenge is no longer awareness. The challenge is action.
Safer Internet Day plays an important role in focusing attention on online harm. For children, families, and frontline responders, however, these issues are an everyday reality, continuing to evolve as technology moves faster than policy. Grooming, sexual extortion, and child sexual exploitation are being reshaped by generative AI, encrypted platforms, and rapidly changing online behaviours. Families are often confronting this complexity well before systems are ready to respond.
Recent enforcement action by the eSafety Commissioner has reinforced the importance of accountability in protecting children online. Strong, independent regulation provides a vital foundation for safer digital spaces, particularly when it is complemented by coordinated policy, capable frontline response, and practical collaboration across government, industry and child-focused services.
What we see consistently is fragmentation at the point where harm occurs. Families are left uncertain about where to report, children receive inconsistent responses, and frontline police are asked to manage trauma disclosures, complex digital evidence, and emerging technologies alongside their core policing duties. This gap is where ICMEC Australia focuses its work.
Late last year, we launched the pilot of our Child Abuse Response Training for Frontline Police, with a clear objective: to equip every frontline police officer in Australia with the skills and confidence to respond effectively when online harm intersects with a child’s life. The quality of that first response matters. How a disclosure is received and acted on can shape a child’s recovery and a family’s trust in the system.
Alongside this work, we continue to engage across government, industry, law enforcement, and research to align policy, prevention, and practice. Child safety does not sit neatly within one portfolio. It cuts across technology, communications, education, social services and justice. Progress depends on those parts of the system working together rather than in isolation.
Children’s lives are shaped by digital spaces every day. Our responsibility is to ensure the systems around those spaces – policy, regulation, industry practice, and frontline response – are equally present and effective. We are past identifying the problem. What is needed now is sustained investment in frontline capability, coordinated action across government, and clear expectations on industry, so that when harm occurs, children and families are met with competence, consistency and care.
About the author
Dannielle Kelly (Danni) is the Head of Government Affairs and Law Enforcement Outreach at ICMEC Australia, where she leads strategic partnerships across government, law enforcement, industry and academia to advance efforts to prevent child exploitation. A former Australian Federal Police leader with more than 17 years’ experience, including work with the Australian Centre to Counter Child Exploitation (ACCCE), she has led organisational reform and long-term operational strategy initiatives. Her work focuses on building research-informed, cross-sector programs that strengthen national and global responses to child exploitation.
An opinion piece by Jacqueline Bottner, Senior Coordinator, Corporate Engagement, ICMEC Australia.
A blind spot in responsible travel
When people think about child sexual exploitation and abuse (CSEA) in the travel and tourism industry, their minds often turn to poverty-stricken countries – places with weak child protection laws, high levels of corruption, and perpetrators imagined as shady, dangerous individuals who lurk on the margins of society.
The unfortunate reality is that CSEA is a far-reaching issue that affects children everywhere, including here in Australia. Despite having some of the world’s strongest extraterritorial laws to prosecute Australians who offend overseas, the problem is also domestic. In fact, one in four Australians has experienced child sexual abuse right here at home (ACMS 2023). Many perpetrators blend in, looking like ordinary tourists, business travellers, or even families on holiday.
While sustainability and environmental, social and governance (ESG) initiatives are gaining momentum across the travel and tourism industry, child protection is still largely absent from the conversation. It’s encouraging to see growing commitments to environmental impact and community development, but when it comes to safeguarding children, there is often a lack of awareness. The risks and realities of CSEA fit squarely within the ‘social’ pillar of ESG, which focuses on human activity and civil society challenges, and should be considered a core part of every organisation’s human rights and modern slavery frameworks.
The good news is that hospitality workers are on the front lines of guest interaction, giving them a unique opportunity to recognise warning signs, intervene in unsafe situations, and even prevent children from being harmed.
The first step toward positive change is building awareness and recognising the role the industry can play in protecting children. By starting the conversation, we can begin to create travel environments that are not only sustainable and ethical, but also safe for children.
What is CSEA and how it intersects with travel
Child sexual abuse occurs when an adult takes advantage of a child for sexual purposes – whether in person or online. When abuse involves money, gifts, or any kind of exchange, it becomes exploitation.
While some offenders travel with the deliberate intent to sexually abuse children, others are opportunistic. They may not set out to offend but end up doing so when the opportunity arises and protective measures are lacking. On the flip side, offenders don’t need to be on holiday to exploit travel infrastructure to facilitate these crimes.
The online/offline connection
In today’s digital world, offenders often make first contact with children through popular messaging platforms, social media apps, and online gaming, with the aim of eventually moving the relationship offline. In these situations, young people may be led to believe they’re communicating with someone their own age, when in fact they’re being groomed by an adult.
Once a connection is made, offenders may use travel to facilitate abuse – targeting children outside of their immediate area, where they feel more anonymous and less likely to be recognised. Offenders utilise travel services like rideshare apps, booking platforms, and hotels to arrange meetings and conceal their actions. In this way, ordinary tourism infrastructure can unintentionally become part of the abuse pathway.
Human trafficking and modern slavery
Modern slavery is an umbrella term for the use of coercion, threats, violence, and deception to exploit people and deprive them of their freedom. One form of this is human trafficking, which involves transporting a child to sexually exploit them across city, state or international borders. It's important to note that children – by virtue of their age – are considered inherently vulnerable and unable to consent, which means their exploitation can still be considered modern slavery or human trafficking even if no one ever explicitly threatened or forced them. The UN estimates that one in three detected trafficking victims worldwide is a child, and in Australia, for every victim identified, another four go undetected.
Hotels, motels, and transit hubs are often misused in these crimes, making the travel and hospitality sector a critical line of defence. According to the Australian Criminal Intelligence Commission, serious and organised crime groups are often closely involved in the perpetration of these crimes, and this can jeopardise the safety of staff and other guests.
CSEA in hotels and holiday rentals
The privacy offered by hotel rooms and short-term holiday rentals can make them attractive settings for offenders looking to exploit children.
The rise of informal and less regulated accommodation options, such as peer-to-peer home rentals and smaller budget hotels with self-service check-in kiosks, has further complicated detection and oversight. However, this crime can happen in any type of accommodation, including luxury hotels.
In a recent Australian example reported by the ABC, the offender was a senior bank executive staying at the Sofitel Brisbane while travelling for work. He tasked an 18-year-old sex worker with procuring two young girls for him to sexually abuse. The girls were brought to his hotel room; however, the offender then changed his mind, refused to open the door and contacted the front desk to request that the girls be removed from the five-star hotel. That phone call alerted staff to the suspicious situation, allowing them to intervene and contact police. The offender was arrested and charged with using electronic communication to procure children under the age of 16.
More recently, in Cairns, a 25-year-old man was charged with multiple offences after allegedly arranging to meet a teenage girl at a city hotel via social media. Queensland Police allege he supplied the girl with drugs before sexually assaulting her. The case came to light when a family member raised the alarm, prompting a police investigation that led to his swift arrest. The matter is now before the courts, with police commending the victim’s bravery in coming forward.
These examples illustrate the diverse ways in which hotels, from luxury five-star properties to budget city motels, can be exploited by offenders. Hotels and motels are often a hot spot for human trafficking and sexual exploitation as they offer easy access, the ability to pay in cash, and a level of anonymity that makes it easier for offenders to avoid detection. In some cases, limited staff oversight or infrequent room checks create additional blind spots where abuse can occur unnoticed.
Hotels globally have faced lawsuits for being complicit in human trafficking, particularly where staff failed to identify or respond to red flags. In Australia, if a hotel is found to be knowingly involved in, aiding, or allowing human trafficking or slavery, it can be prosecuted under the Criminal Code Act 1995 (Cth). This legal risk adds another layer of urgency for the industry to invest in staff training, clear reporting procedures, and child protection policies.
Sexual assaults on aircraft
Commercial flights, where personal space is limited and cabins are often dimly lit, can create conditions that leave unaccompanied minors or children seated away from their guardians vulnerable to in-flight sexual assault.
According to a 2024 FBI report, the bureau investigated 104 cases of sexual assault on aircraft that year, with many of the reports coming from children. These incidents often go unreported due to fear, confusion, or embarrassment, which can make it harder to intervene or hold perpetrators accountable. The Journal of Australian Lawyers cites the seating of minors in close proximity to adult strangers as an identified issue in cases of sexual assault and harassment on airlines.
But it’s not only other passengers who pose a risk to children during air travel. In some recent cases, airline staff have also been accused of assaulting minors in their care. In 2020, for example, a father sued the South American airline LATAM over the sexual assault of his six-year-old son by a member of airline staff while the boy was flying as an unaccompanied minor. In another recent case, on American Airlines, a former flight attendant taped his phone to the toilet to record people using the bathroom, including girls aged 9 and 14.
In Australia, there have been several documented cases involving staff of major airlines charged with CSEA related offences. For example, a former flight attendant was recently jailed after admitting to sending thousands of files of child sexual abuse material to contacts worldwide. In another case, an international flight attendant who travelled to the Philippines in both private and professional capacities offended on 25 separate occasions over a seven-year period.
What makes this even more concerning is that there is currently no mandatory requirement for airlines to report incidents of sexual assault onboard to police. This lack of obligation can lead to serious incidents being handled informally or not at all – allowing offenders to walk away without consequence. In some of these cases, passengers have taken legal action against airlines, highlighting a growing area of legal and reputational vulnerability.
Why this matters in travel and tourism
Child protection is not a peripheral issue; it is closely linked to the travel industry’s existing priorities and responsibilities. As more businesses adopt sustainability and purpose-led strategies, there is a growing expectation from customers that human rights risks, including child exploitation, are addressed with seriousness and accountability.
Under Australia’s Modern Slavery Act, certain businesses in every industry must identify and address exploitation risks in their operations and supply chains, and that includes risks to children. A business that fails to address CSEA risks head on can face ongoing and significant safety, financial, reputational, and legal ramifications.
Customers are also paying closer attention. Families want to book with brands they can trust, and travellers increasingly expect companies to demonstrate clear ethical standards. Internally, protecting children also means equipping staff with the support they need to feel confident in identifying red flags and responding safely, so they know that their workplace takes this issue seriously. Given that employees can also be parents or caregivers, this training can help them recognise the signs of exploitation in a professional capacity, while also equipping them with the knowledge to keep their own families safe.
When companies take a proactive approach to child protection, it not only reduces risk but strengthens their values, brand, and long-term impact. Importantly, new data highlights that traveller perceptions of sustainability are broadening. Research from Booking.com in 2025, based on global consumer insights, found that for the first time more than half of travellers now consider the social impact of tourism on local communities, not just environmental factors. This shift underscores that responsible travel is no longer just about reducing environmental footprints; it is also about safeguarding people in the places we visit.
What the travel and tourism industry can do
Tourism and hospitality businesses can start by incorporating child protection into their ESG and risk management discussions, recognising it as an essential component of human rights due diligence.
While small businesses might feel that they don't have the capacity to implement child safeguarding initiatives, almost all disciplines touch on this topic in some way. Providing basic training for staff, such as how to recognise signs of grooming, exploitation, or suspicious behaviour, can go a long way in creating safer environments for children and guests alike. Partnering with organisations like ICMEC Australia can help businesses access the right resources, guidance, and support to build child-safe practices that align with existing sustainability and compliance goals.
Meaningful progress often begins with small, practical steps that can have a significant impact.
Take the first step
For those in the travel and tourism industry, consider initiating conversations within your organisation about where child protection fits into your ESG strategy. Explore available opportunities to strengthen your detection and prevention of this crime and engage with organisations already working in this space.
Child protection isn’t just an ethical issue; it’s a business responsibility. And it’s one the industry has the power to lead on.
If you’d like to learn more about how to get started, please get in touch with ICMEC Australia at corporate@icmec.org.au.
About ICMEC Australia
ICMEC Australia is a specialist not-for-profit organisation that strengthens Australia’s ability to prevent and respond to child sexual exploitation and abuse (CSEA). ICMEC Australia supports government, industry, and the community to strengthen safeguards, disrupt harm, and build systems that protect children. Our work is grounded in evidence, cross-sector collaboration, and a commitment to creating safer environments for children in Australia and beyond.
What we are witnessing with Grok is not a technical failure. It is a platform governance failure.
In recent weeks, Grok (an AI chatbot embedded within the platform X) has been used to generate non-consensual sexualised imagery, including ‘nudified’ images of women and, in some reported cases, children. The response so far has largely been reactive: tweaks to safeguards, geoblocking in certain jurisdictions, and statements that shift responsibility back onto users once the damage is already done.
From an Australian perspective, our eSafety Commissioner made it clear this week that this is unacceptable. The Commissioner confirmed a rise in reports relating to Grok producing sexualised and exploitative imagery and has formally engaged X to explain what safeguards are in place – and why they were clearly insufficient. With new mandatory online safety codes commencing in March 2026, the message could not be clearer: generative AI systems must be expected to anticipate misuse, not merely respond after harm occurs.
Australia is not alone in this conclusion. Regulators across the EU and UK have also moved quickly, launching investigations, issuing data-retention orders, and, in some cases, referring matters to prosecutors under the Digital Services Act and national criminal law. Indonesia and Malaysia imposed temporary bans. Countries including France, Germany, Italy, Sweden, India, and the United Kingdom are all examining whether Grok’s design and deployment breach existing safety obligations.
What is striking is how familiar this pattern is. We have seen it before with social media platforms: rapid deployment, minimal guardrails, externalised harm, followed by accusations of ‘censorship’ when regulators step in. But generative AI raises the stakes significantly. When a system can automatically manipulate real images of real people at scale – particularly women and children – the harm is not hypothetical. It is immediate, personal, and enduring.
At ICMEC Australia, we see the real-world consequences of safeguarding and governance failures every day. Our work with law enforcement, regulators, and frontline organisations shows us time and time again that harms created by unsafe technology design do not remain online – they follow children into their schools, homes, and communities, for life. Innovation does not have to come at this cost. Safety and technological progress are not mutually exclusive, but safety must be built in from the start.
AI companies cannot credibly claim neutrality when their products are designed in ways that invite abuse, fail to integrate guardrails, and are deployed under a narrow interpretation of ‘innovation at speed’. Safety by design is not optional. Consent protections are not a ‘nice to have’. And blaming users is no longer a defensible position when misuse was foreseeable from day one.
The Australian Government has signalled it understands what is at stake. Renewed attention towards legislating a digital duty of care, alongside emerging work on AI-related harms, reflects a broader shift towards placing responsibility where it belongs – on companies with the power to prevent harm before it occurs. As Australia moves into 2026, these reforms – and the need for further action – must remain a priority, particularly where children are concerned.
Grok is not under scrutiny because it is controversial. It is under scrutiny because it crossed a line. The question now is whether the AI sector learns from this moment – or whether regulators will be forced to draw much firmer boundaries on its behalf.
About the author
Colm Gannon is the CEO of the International Centre for Missing and Exploited Children (ICMEC) Australia, with over 20 years’ experience in law enforcement, digital safety, and child protection. He has led national and international investigations into online harms, child sexual exploitation and abuse (CSEA), and cybercrime. Combining this expertise with a technical background in AI and software development, Colm works at the intersection of technology and child safety, advancing ICMEC Australia’s mission to protect children.
Australian Government’s National AI Plan marks a critical step forward for child safety.
ICMEC Australia welcomes today’s announcement by the Australian Government outlining the new National AI Plan, marking a significant step forward in protecting children in a rapidly evolving digital environment.
Over the past two years, ICMEC Australia has played a central role in shaping national thinking on the safe use of artificial intelligence to prevent child sexual exploitation and abuse. As leader of the SaferAI for Children Coalition, ICMEC Australia has brought together experts from technology, law enforcement, academia and other not-for-profits to develop practical, evidence-based measures that prioritise children’s rights and safety.
Figures from the US National Center for Missing & Exploited Children show a 1,325% surge in AI-related child sexual exploitation reports, rising from 4,700 in 2023 to more than 67,000 in 2024.
Reflecting on today’s announcement, Dannielle Kelly, Head of Government Affairs and Law Enforcement Outreach at ICMEC Australia, said the task ahead is to ensure innovation progresses safely.
“Children are already growing up in an AI-enabled world. Our job is to make sure they can do that safely – not by shutting down innovation, but by putting strong regulations in place, and ensuring the right tools are in the hands of those who protect them. Today’s actions are a positive step towards that future,” Ms Kelly said.
ICMEC Australia’s work has helped build the national momentum behind today’s decision. Through its parliamentary roundtables in recent months, ICMEC Australia has convened national leaders across government, industry and child protection, including key discussions that contributed to the Government’s ban on nudify apps.
Today’s announcement represents meaningful progress and a welcome response to this collective effort.
In parallel, ICMEC Australia is working with police across the country to ensure frontline officers have the tools, training and specialist expertise needed to respond to AI-enabled offending. Ms Kelly said this work is becoming increasingly urgent as generative technologies reshape criminal behaviour.
“AI has become a core tool for offenders, and it now must be part of the response for police. We are focused on making sure officers have practical, current training and access to AI-enabled tools that help them identify harm faster, support victims better and hold offenders to account.”
ICMEC Australia believes that the implementation of the National AI Plan will provide the coordinated direction needed to adopt and develop new technologies with confidence while placing children’s rights and safety at the centre of an AI-informed future.
The organisation looks forward to continuing its collaboration with the Australian Government to ensure that AI is used for children’s safety, not against it.
-ends-
An opinion piece by Rosie Campo, ICMEC Australia's Head of Corporate Engagement
In her latest op-ed, Rosie Campo shares powerful insights from the ICMEC Australia Symposium 2025: Convergence, where leaders from government, financial services, technology, law enforcement and the non-profit sector came together with a clear message: businesses must take an active role in preventing child sexual exploitation to help combat this crime.
Rosie outlines how offenders are exploiting everyday business systems and why companies cannot wait for regulation to dictate their response. Prevention and disruption must sit at the centre of corporate responsibility, with safeguarding embedded into governance, risk and compliance.
Her reflections from the symposium highlight the enormous potential within business to identify risks earlier, protect children and strengthen the systems that offenders use to cause harm.
This is essential reading for anyone working across the corporate sector.
We are pleased to share the Impact Report FY2024–25.
A year of collaboration, innovation and measurable change. Together, we’re working towards creating a world where technology cannot be used to harm children.
“At the heart of our mission is a simple focus: to strengthen the professionals who detect, disrupt and prevent harm.”
— Colm Gannon, CEO, ICMEC Australia
We’re proud to share the ICMEC Australia Impact Report FY2024–25, highlighting another year of progress in strengthening our response to child sexual exploitation and abuse (CSEA).

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.