ICMEC on ABC Radio
ICMEC Australia CEO, Colm Gannon, recently joined Ali Moore on ABC Melbourne Radio to discuss the growing dangers of artificial intelligence (AI). As AI technology continues to advance at an unprecedented pace, concerns around its ethical use, potential for exploitation, and impact on child protection are becoming increasingly urgent.
During the interview, Gannon highlighted the risks associated with AI-generated content, deepfakes, and the ease with which bad actors can use these tools to exploit children online. With AI now capable of creating highly realistic images, videos, and voice clones, the potential for misuse is alarming. ICMEC (International Centre for Missing & Exploited Children) is at the forefront of advocating for stronger policies and regulations to safeguard vulnerable children from these emerging threats.
One of the key concerns raised in the discussion was the challenge of detecting AI-generated child abuse material. Traditional content moderation tools struggle to keep up with AI’s ability to rapidly produce new forms of harmful content. Gannon called for urgent collaboration between governments, tech companies, and child protection agencies to address this growing crisis.
Ali Moore also questioned the role of social media platforms and their responsibilities in preventing AI-driven exploitation. Gannon emphasized that while some platforms are taking steps to regulate AI use, enforcement remains inconsistent. He stressed the need for global cooperation and proactive AI legislation to ensure child safety remains a top priority.
ICMEC Australia continues to push for policy changes and technological solutions that can help combat the dangers posed by AI. Gannon’s appearance on ABC Melbourne Radio served as a crucial reminder that while AI presents incredible opportunities, it also demands urgent safeguards to protect the most vulnerable members of society.
https://www.abc.net.au/listen/programs/melbourne-drive/drive/105050584
https://icmec.org.au/icmec-australia-in-the-news/: Listen to Colm Gannon talk with Ali Moore on 774 ABC Radio
‘Nudify’ economy
AI-powered “nudify” apps are fueling a disturbing rise in non-consensual sexual deepfakes, exploiting victims without their knowledge or consent. These apps use artificial intelligence to manipulate images, generating fake explicit content that can be used for harassment, blackmail, or online abuse. Australia is witnessing growing concern over the ease with which these tools can be accessed, often through cryptocurrency transactions that make tracking offenders more difficult.
This article from Crikey delves into how these AI-driven exploitation tools work, the legal and ethical implications, and the challenges law enforcement faces in curbing their spread. Despite increasing awareness, the rapid advancement of generative AI makes it difficult to regulate these technologies effectively. Lawmakers and advocacy groups are calling for stronger legal protections to criminalize the creation and distribution of non-consensual deepfakes and hold perpetrators accountable for their actions.
Victims often struggle with the emotional and reputational damage caused by these manipulated images, with limited legal recourse available. Many experience severe distress, as the circulation of fake explicit images can harm careers, relationships, and mental well-being. Social media platforms and online communities are under scrutiny for failing to detect and prevent the sharing of such content, raising questions about their responsibility in protecting users from online abuse.
The article also examines the role of cryptocurrency in facilitating these transactions, making it easier for developers and users to operate anonymously. As concerns over digital privacy grow, experts warn that without urgent intervention, the problem will only escalate. With AI becoming more sophisticated, the need for proactive regulation, stricter enforcement, and awareness campaigns is greater than ever. Governments, tech companies, and advocacy groups must work together to address this pressing issue.
Read the full analysis on Crikey to understand the scope of this alarming trend, the people affected, and the possible legal and technological solutions to combat AI-driven sexual exploitation.
ICMEC on ID Tech Podcast
In a recent episode of the ID Talk Podcast, Colm Gannon, CEO of the International Centre for Missing & Exploited Children (ICMEC) Australia, delved into the complex interplay between technology, privacy, and child protection. The discussion underscored the imperative of harmonizing regulatory frameworks with technological advancements to bolster the fight against child exploitation.
ICMEC's Collaborative Approach
Gannon highlighted ICMEC's multifaceted collaboration with law enforcement agencies, regulatory bodies, and technology companies. This triad partnership is pivotal in addressing the evolving challenges of child exploitation in the digital age. By fostering these alliances, ICMEC aims to create a cohesive strategy that leverages technological innovations while ensuring robust child protection mechanisms.
Balancing Privacy and Protection
A significant portion of the conversation centered on the delicate balance between safeguarding individual privacy rights and protecting victims of child exploitation. Gannon emphasized that while privacy is a fundamental right, it should not serve as a shield for criminal activities. He pointed out that certain privacy regulations, in their current form, might inadvertently protect perpetrators rather than victims, thereby necessitating a reevaluation to ensure that victim protection remains paramount.
Role of Emerging Technologies
The integration of technologies such as facial recognition and artificial intelligence (AI) in investigative processes was a focal point of the discussion. Gannon elaborated on how these tools can significantly enhance the efficiency and effectiveness of identifying and apprehending offenders. However, he also cautioned about the ethical considerations and potential biases inherent in AI systems, advocating for responsible adoption with appropriate oversight and governance.
Regulatory Developments and Challenges
The conversation also touched upon recent regulatory developments, including the European Union's AI Act. Gannon discussed how such regulations impact the deployment of AI and other technologies in law enforcement. He stressed the importance of crafting regulations that do not stifle innovation but instead promote the ethical use of technology in protecting vulnerable populations.
Conclusion
Gannon's insights shed light on the intricate dynamics between technological innovation, regulatory frameworks, and child protection efforts. The podcast episode serves as a call to action for stakeholders across sectors to collaborate, ensuring that advancements in technology are harnessed effectively and ethically to combat child exploitation.
For a more in-depth understanding, you can listen to the full episode of the ID Talk Podcast featuring Colm Gannon.
Anna Bowden's abuser had to get close to her when she was a child before the sexual exploitation began.
Today, people like her perpetrator can victimise Australian kids from anywhere. In fact, they can do it with the click of a button.
The advancement and accessibility of AI technology has triggered a "tidal wave" of sexually explicit 'deepfake' images and videos, and children are among the most vulnerable targets.
"Accessing and using AI software to create sexual deepfake images is alarmingly easy," Jake Moore, Global Cybersecurity Advisor at ESET, tells 9honey.
From 2022 to 2023, the Asia Pacific region experienced a 1530 per cent surge in deepfake cases, per Sumsub's annual Identity Fraud Report.
One platform, DeepFaceLab, is responsible for about 95 per cent of deepfake videos, and there are free platforms available to anyone willing to sign up with an email address.
They can then use real photos of the victim (usually harmless snaps from social media accounts) to generate whatever AI image they want; in about 90 per cent of cases, those images are explicit, according to Australia's eSafety Commissioner.
"We've got cases of deepfakes and people's faces being used in images which are absolutely and utterly horrific," reveals Bowden, CEO at the International Centre for Missing & Exploited Children (ICMEC) Australia.
This technology wasn't around when she was abused and it horrifies her to know it's already being used to victimise Aussie kids, many of whom have no idea they're at risk.
"We don't talk about it," she says. "There's no information. There's no idea of what offenders are doing, what we need to look out for.
"We're helping criminals because we're not communicating."
Despite the spike in deepfake cases, 79 per cent of Aussie social media users confessed they struggle to identify AI-generated content online, per a 2024 McAfee survey.
Many parents don't understand the power of AI and the dangers it can pose, so they don't know how to protect or educate their kids.
The worst part is that sometimes children can be perpetrators too.
ESET has recognised a surge in AI sextortion cases in which teens generate non-consensual, explicit images and videos of their peers to impress, bully, or intimidate others.
"Constant exposure to online content has desensitised many young individuals, reducing their understanding of the real-world consequences of their actions," Moore explains.
Teen perpetrators can be prosecuted under Australian law, but the victim may still experience shame, fear, humiliation, loss of self-esteem, financial loss, and damage to their social standing.
"Just because those images or videos are AI generated does not mean they're harmless," Bowden says.
This kind of image-based abuse can cause mental and emotional distress, and some victims die by suicide. Worse still, the deepfakes may never go away.
Advocate Noelle Martin was 18 when she discovered fake non-consensual images of herself online; years later, the photos and videos are still circulating.
"This can destroy someone's entire identity and reputation, their name, and image, and self-determination, and dignity. It can define that person forever," she told 9honey.
A deepfake image or video of a child can spread rapidly and can be almost impossible to have removed from the internet.
Most social media platforms ban non-consensual explicit content, but AI-generated pornography slips through due to the sheer volume of content being posted.
"This makes it difficult for moderators to remove such content quickly enough, while users are swift to share, save, and screenshot these images," Moore notes, revealing they're often shared through private channels.
Earlier this year, social media sites struggled to remove an explicit AI-generated image of Taylor Swift from their platforms.
The images were viewed more than 45 million times in 17 hours.
Now imagine if the target was your child, not the biggest celebrity in the world.
It could end up in the hands of child predators seeking out explicit content, who could then use the deepfake to generate even more vile non-consensual photos and videos bearing the face of an Aussie child they've never met.
Bowden describes it as "incredibly traumatic" and warns "it could be around for decades, if not longer".
Thankfully, organisations like ICMEC, ESET, and the Australian eSafety Commissioner are working to combat this.
"It's evolving so quickly that as we create solutions, the technology changes, and then we need a new solution again," Bowden says.
She works with Australian organisations, government and law enforcement to prevent deepfakes and other child sexual exploitation online, but says parents need education too.
That means learning about the risks of AI, explaining them to children in age appropriate ways, and teaching children basic online safety precautions.
It's an uncomfortable conversation and one many Aussies would rather not have, but silence won't help.
"It's just terrifying how low the level of awareness is and how high the risk is," Bowden adds. "We cannot keep just hoping this is going to go away."
Moore advises parents to maintain open communication so their children know they can speak up if something happens.
"If a child becomes a target of AI-generated porn, parents have to remain calm and reassure the child that they are not at fault," Moore adds.
"It is crucial to document all evidence and report the incident to the relevant authorities, such as the school and the police [and] contact the platform hosting the content to request its removal."
AI isn't going to go away, but deepfakes can be tackled and eradicated.
That will require an increase in awareness about the dangers and legal implications of sharing sexual content online, as well as tougher regulations to deter potential offenders, and more accountability from online services.
It will take work and the landscape of AI is changing rapidly, but Bowden is certain that "if the good guys all get together and want to make change, we can outnumber the bad guys".
Anna Bowden is the CEO of The International Centre for Missing & Exploited Children (ICMEC) Australia. She has extensive experience in impact investing, philanthropy and impact strategy.
With an emphasis on building new models for social impact across the globe, Anna’s affinity for the vital work of countering child sexual exploitation, combined with a natural connection to innovative solutions to the world’s wicked problems, makes her a passionate leader. Read on for our interview with Anna!
Describe your career trajectory and how you got to your current position.
My career has always been driven by a passion for making a positive social impact. Over the past 20 years, this has guided me to work with businesses and organisations on programs that integrate social impact with investment and business.
I started my career in the back of a private equity office. This was where I first realised that the money, knowledge, and expertise I was surrounded by could be channelled towards positive social impact (an unusual take at the time). Looking back, this was a transformative moment in my career and a compass that guided me over the decades.
In 2022, I joined ICMEC Australia as Head of the Child Protection Fund before becoming CEO later that year. At ICMEC Australia, we focus on improving the detection, reporting, and prevention of online child sexual abuse. Our work is targeted at helping industry professionals sharpen the tools they need to address child sexual abuse and make a tangible difference.
As a survivor of this kind of abuse myself, I feel a deep sense of purpose in applying my skills to combat one of the fastest-growing crimes in the world. My career has shown that we can make a real difference by combining business and financial expertise with a commitment to social good. Even after 20 years, my sense that we can do better to protect the most vulnerable has only grown.
Take us through a typical day of work for you.
My days are usually quite busy, so I like to start my morning by blowing off steam at the gym at the crack of dawn, which is quickly followed by the madness of getting myself and two young children ready for work and school.
My days are typically split between meetings with our many stakeholders and our team, attending industry events, and working on strategic projects and reporting.
All the travelling to meetings and events means I’m usually squeezing in emails, work, and calls as I travel to and from things as best I can. I try hard to set aside an hour or so to have an early dinner with my family at the end of the day, and then usually log back on to finish a few pressing things before winding down for the night.
What is the biggest challenge you’ve encountered in your career, and how did you overcome it?
I’ve worked in several startups over the years, and it’s always both thrilling and challenging. The intensity of that challenge is heightened when it’s a nonprofit startup. As is the case with many nonprofits, funding is an enormous hurdle, which requires me to make a lot of tough decisions that not only affect me but the entire team and organisation.
There was a time when our organisation was undergoing a significant restructure, and this really challenged me on a personal and professional level. Of course, I knew that avoiding it wasn’t an option and as hard as it would be, we needed to face it head-on as a team to continue our mission of making a difference.
I found the process very emotionally draining. From the very beginning, I promised myself and my team that I would operate with complete transparency. We have a very small, very dedicated team of tight-knit professionals, so there were questions about what this would mean about individual roles and what it meant for the organisation as a whole.
Being vulnerable, open, and honest with my team helped us navigate the restructure and come out stronger on the other side. It wasn’t easy, but it reinforced the importance of having a strong, aligned structure to support our mission. This situation taught me how even when times are tough, your purpose should be your path forward. Everything we do is for the benefit of detecting and preventing child sexual abuse, and sometimes that comes with difficult, confronting, or uncomfortable decisions.
If you could go back in time, what piece of advice would you give yourself as you first embarked on your career?
I would remind myself that the mission always comes before your own comfort. I learned this many years ago when I took on a role thinking I was the right fit, only to find within a few days of starting that I wasn’t the best person for the job. I stayed for many months trying to make it work; partially because I didn’t want to leave the company in the lurch, but also because I was somewhat fearful of how such a short-lived role would look on my resume. I didn’t want future employers to think that I give up when the going gets tough.
Eventually, I realised that by staying in the role, I was hindering the organisation and its ability to fulfil its mission. While I was still concerned about how it would look professionally, I decided that I needed to check my ego and step away so someone better equipped for the role could step in. It was hard, but it’s something that I now look back on with gratitude because it taught me that when you work in the social impact space, you can never put your pride above the mission you’re working towards. I’d tell myself, this will hurt in the short term, but it will feel much better doing the right thing in the long term.
How do you unwind after work?
I won’t lie – learning to unwind is a work in progress for me. I have two young daughters and my passion for what we do at ICMEC Australia makes it very hard for me to switch off. I’m working on it, but I often battle with sending one more email or reviewing one more document instead of closing the laptop.
With that being said, I make an effort to disconnect from work and screens by 8 pm. As someone who has struggled with insomnia all my life, this gives me some time to clear my head before bed.
I’ve also recently started taking ‘quiet breaks’. I don’t think I could quite call it meditating – more like switching off all the stimuli and inputs for a bit. I’m a big believer in starting small – so I spend just five or 10 minutes a day calming my mind during a bit of silence. Hopefully I’ll get better at it as I go – maybe even enough to define it as meditating!
I also really enjoy reading and listening to podcasts, on a wide range of topics. I love learning. I don’t watch much television, but I’ll usually get through at least a couple of books a week.
Imagine a world where some of the most vulnerable people are subjected to the unimaginable, yet their suffering remains largely hidden in the shadows. For more than one in three Australian girls and almost one in five boys who experience child sexual abuse, this world is their reality.
Pretty confronting statistics. I was one of those statistics, having experienced abuse as a child, and it’s one of the many reasons why I am so passionate about bringing this topic out of the shadows and into the light by talking openly about child sexual abuse and exploitation and what we must do to stop it.
Despite its horrifying prevalence, child sexual exploitation (CSE) is rarely talked about. However, there are some things we can all do in our day-to-day lives as well as steps that industries and businesses can take to help put an end to this horrific crime.
Online child sexual exploitation is one of the fastest-growing crimes, so much so that the volume of the most severe category of child sexual abuse material has doubled since 2020. With children spending more time than ever online, this is a major threat to the safety of young people everywhere. A study released this week estimates that globally, over 300 million children were subjected to abusive behaviours online in the last year alone.
Unfortunately, the rapidly evolving nature of technology also makes CSE increasingly difficult to prevent, detect, and prosecute. Investigative agencies and companies that want to address this growing problem need support to develop the right resources, skills, and capacity to fight this crime.
ICMEC Australia works with companies – particularly in the financial services sector – governments, and charities to help them develop the knowledge, tools, and abilities to identify, prevent, and report CSE. We do this by delivering a number of programs, such as data products and training, that help to both empower these organisations and connect them in a united fight against CSE.
The only way to put an end to CSE is for organisations across every industry to look at where and how this crime thrives and to put measures in place that directly address it. It’s a collective effort, but by coming together, we can help prevent and stop children being harmed.
To address CSE, we must first address the significant gaps in the response ecosystem. The confronting truth is that various commercial sectors inadvertently have their digital platforms used by perpetrators seeking to access child sexual abuse material (CSAM). They pay for it via online banking accounts, target and groom children via social media, and share CSAM through communication services, apps and chat platforms.
Thousands of entities hold pieces of the digital evidence puzzle, but because these pieces are not connected, it’s impossible to form a complete picture. This means we all have a role to play in putting a stop to this crime, and that starts with collaboration.
Through our work at ICMEC Australia, we facilitate greater collaboration between industries and organisations through capability building and information and data sharing. To stay across emerging technology innovations, we also have a dedicated catalytic incubator that supports several data and technology projects. By fostering greater transparency, traceability, prosecution, and prevention of CSE, we are contributing to a world where children are safe from the life-changing consequences of this crime.
Another important part of our work is collaborating with law enforcement agencies on the frontlines of detecting and investigating perpetrators of CSE. Despite their pivotal role, the changing landscape of modern technology means they often need more support and resources to uplift their techniques.
Of course, we cannot address CSE without speaking about it – and this is often the hardest part of the process. I get it – there are few topics more distressing and disturbing, so it makes sense that people don’t want to think about it, much less talk about it. However, this only leads to larger gaps in public understanding and advocacy, and it’s in these gaps that this crime thrives.
One of the most significant awareness gaps is in parent-to-child communication. Research shows that while 97 per cent of households with children under 15 have access to the internet, just 52 per cent of Australian parents and carers are having conversations with their children about online safety. This leaves far too many children susceptible to exploitative encounters or material online, highlighting the need for greater awareness and education.
By increasing public awareness, ICMEC Australia drives meaningful change and encourages more proactive measures to protect children. It’s only by having these confronting conversations – whether it’s in our workplaces, in our personal lives, or directly with our children – that we can stand against CSE and create a world without it.
While ICMEC Australia may not work on the frontlines of CSE, it is our privilege to support those who do. Through connection and collaboration, we empower organisations, government agencies, and law enforcement professionals to ensure they are better equipped to tackle this growing problem. Together we can confront the realities of child sexual exploitation and bring justice to the countless vulnerable people who have been forced into silence.
If you or someone you know is experiencing, or at risk of experiencing, domestic, family or sexual violence, call 1800RESPECT on 1800 737 732, text 0458 737 732 or visit 1800RESPECT.org.au for online chat and video call services.
If you are concerned about your behaviour or use of violence, you can contact the Men’s Referral Service on 1300 766 491 or visit http://www.ntv.org.au.
Feeling worried or no good? No shame, no judgement, safe place to yarn. Speak to a 13YARN Crisis Supporter on 13 92 76.
Child sexual exploitation (CSE) is a complex and widespread crime that is showing no signs of abating. The Australian Childhood Maltreatment Study from April 2023 found more than one in four Australians have experienced one or more types of child sexual abuse.
You might be wondering: how does this relate to my small business? The reality is that every small business, including those whose business is conducted online, could play a role in fighting CSE.
Through technological advances, perpetrators are finding more ways to harm and exploit our children via AI, live-streaming, sextortion and a variety of other means, with devastating effects on victims. For instance, the same Australian Childhood Maltreatment Study showed that adults who have experienced child maltreatment are 2.8 times more likely to have a mental health disorder.
We implore businesses to start by learning about the issue and understanding where your systems, processes and procedures can play a role. It takes a whole-of-community response to break the cycle, and we all have a part to play.
Awareness is the first line of defence. Our society can’t confront anything we don’t know or don’t understand. With more information comes greater prevention and protection, and knowledge of what steps to take if something were to go wrong.
Spread awareness throughout your business and across your wider stakeholders and partners about the prevalence of this heinous issue, especially in areas that are more vulnerable to this crime like financial services, risk and compliance, procurement, and customer service teams.
It can help to think about this issue from the perspective that unfortunately, statistically, there could be many people in your workforce who have lived experience of child abuse. There may also be many parents or carers who would want to be educated about this issue and better equipped to spread the message further. According to the Australian Centre to Counter Child Exploitation, only 52 per cent of parents and carers talk to their children about online safety. Prevention is key to combatting this crime.
To further your understanding of this issue, you can tap into a multitude of online resources from experts such as the National Office for Child Safety, the eSafety Commissioner and the AFP’s Think U Know program.
Look at your business operations that may be affected by this crime, and understand reporting requirements for each area of your business. Some businesses have mandatory reporting requirements, for instance under the AML/CTF Act. For those entities, resources like AUSTRAC’s financial crime guide, Sexual Exploitation of Children for Financial Gain, are very useful.
Even if your business doesn’t have mandatory reporting requirements, or these are already well covered, it’s still essential that your organisation has policies and procedures in place should an incident occur. Without these, there is uncertainty about how to address risks connected with CSE.
Establish guidelines for online communication and social media usage, especially if the business has an online presence. Educate employees about the risks of online interactions to children’s safety, and how to report any concerning behaviour.
As hard as it is, we must confront this issue and have conversations with each other and children about how to prevent and stop this serious crime. We need to open the dialogue across society, small businesses included. With more information comes greater prevention and protection, and knowledge on the steps to take if something were to go wrong.
Once we have awareness, a collaborative and networked approach is essential, within and across financial institutions, and across sectors. If you’re still uncertain, ICMEC Australia can provide guidance, resources and connections to expertise.
Child sexual exploitation is all around us, a crime hidden in plain sight thanks to a pervasive culture of silence and stigma. According to current research, one in three women experienced sexual abuse as a child, compared to one in five boys. Further to this, perpetrators of violence against women and girls have often been found guilty of child sexual exploitation and abuse against both groups.
The figures are startling and paint a confronting picture of the terrifying world that many children continue to face. It’s a situation that industry leaders ICMEC Australia chief executive Anna Bowden, National Centre for Action on Child Sexual Abuse chief executive Dr Leanne Beagley and Bravehearts chief executive Alison Geale have devoted much of their professional lives to ameliorating. The three women are coming together to urgently call for an end to violence against women and girls.
Here, we speak with each leader about their roles, their never ending fight to end child sexual exploitation, and how we can all play a part in helping end the abuse.
Anna Bowden, CEO of ICMEC Australia
Anna Bowden, CEO of ICMEC Australia, is one of the leading women tackling child sexual exploitation and ending violence against young women. A powerful voice in a challenging industry, Anna leads the charge against the growing harm to our children online, showing resilience every day in the face of darkness.
Lack of awareness is a major driving force behind Anna’s work. “It’s still not broadly understood by society how common child sexual exploitation is - more than one in four Australian children are sexually abused and exploited - and what we need to do to stop it,” she says.
As someone with lived experience of child sexual abuse, Anna says that for her, doing something to contribute feels better than nothing. “There’s a lot of evidence to support that feeling helpless in the face of horrible events can feel really awful,” she says. “Despite how overwhelming the problem can seem, every tiny thing we do to protect children adds up – and, together, we can achieve change.”
Anna also names salt water - whether sweating through exercise, tears, or swimming in the sea - as an aid to help her manage the emotional toll of her work. “As the saying goes, sweat, tears, and the sea can make a lot of things feel better. Sweat and exercise are huge coping strategies for me.”
As for the question of legislation, Anna says Australia is very fortunate to have Julie Inman Grant as the eSafety Commissioner, whose team is world-leading in their regulatory response to technology-facilitated crimes.
“We need others to follow Julie’s leadership and put children’s safety first in legislation and regulation. Our commercial and consumer interests should always come after the priority of defending children’s human right to safety.”
Anna sees the solution to child sexual exploitation involving a comprehensive, cross-sector approach to child protection, with government, law enforcement, families, community, and businesses all playing a critical role.
“Children and young people interact with all these systems, and so do perpetrators,” she says. “We can’t continue to say ‘it’s just up to police, or government to sort this out’. They do tremendous work, but this is something we all must participate in.
“As hard as that is, we must confront it and have conversations with each other, and children about how to prevent and stop this. I never got justice against the man who offended against me, because I didn’t know how to vocalise it, or who to tell. We need to open the dialogue across society and with our children. We can’t confront anything we don’t know, or understand.”
As Nelson Mandela said: “There can be no keener revelation of a society's soul than the way in which it treats its children”.
Dr Leanne Beagley, CEO of the National Centre for Action on Child Sexual Abuse
Dr Leanne Beagley is tasked with overseeing the work of the National Centre and providing leadership on integrated responses to child sexual abuse and its impacts across the country.
Leanne began her journey when working as a therapist with children who had been traumatised by abuse from someone they trusted. “It broke my heart,” she says.
As adults with agency and power, she says we must fight for the rights of children to be heard, believed and protected. “What continues to motivate me is the growing roar of those who have lived and living experience of child sexual abuse. They richly deserve the support and healing and change that they are asking for.”
Leanne admits that there are times when she struggles to manage the emotional toll of the work. “For me, it’s about balancing the challenging and demanding and draining experiences with others that are generative, productive, healing and affirming,” she says.
“When we are strong, we can stand with those who feel shaky. When we are shaky, we have new insights into what it’s like for those who live with trauma.”
To fight the issue effectively, Leanne says we need to understand how it happens - whether that’s from data, lived experience, from research - and interrupt the trajectories at every step of the way. “We have to be prepared to do lots of things all at once.” This multidimensional approach, along with the challenges faced, provide the foundation for the National Centre’s five-year strategic plan - Here for Change.
Solutions also lie in several actions that interplay with one another. These include building strong, confiding, safe relationships with the children around you, and taking preventative measures seriously - for example, the National Centre’s resource on online safety.
On an individual level, there are several steps we can all take to make a difference. This includes knowing the signs of a child who is at risk of or experiencing grooming, and taking action when you see those signs. “If you think it is happening, then you are probably right,” Leanne says.
“CSE is prevalent and it is a crime perpetrated by manipulative, wily people who remain hidden by a culture of silence and stigma. For prevention and healing to occur, it is critical we shed light on the issue and bring it out of the shadows. The time for us all to take action is now.”
Alison Geale, CEO of Bravehearts
Alison Geale believes there is no better reason to rise to the work challenge every day than protecting our most vulnerable – our children: ensuring they are safe from sexual abuse and helping those who have been impacted.
When it comes to the emotional toll of her work, Alison says she has a barometer ‘to gauge when my internal ledger feels off.’ Generally, that barometer works well, but there are still times when it can be tested.
“Balance is key, the task at hand is so important and can feel never-ending, naturally the desire to do everything can overtake your bandwidth and will test you,” she says.
“I have a trusted team; we all lean on each other and have open dialogue to help each other check in on self-care.”
Alison believes that stopping child sexual exploitation crimes involves approaching them holistically, through both systemic and societal change. Education has an important role to play, with Alison recommending that all people, including young people and children, are educated on this topic just as you would any other safety topic as they develop.
“Children and young people are accessing all the wonders of the world through the internet and conversely all of the dangers are impacting them equally. Having open, appropriate, and informed discussions with children and young people from an educated perspective is a priority,” she says.
The most important lesson Alison has learned is that ‘shame, secrecy, and silence assist the crime to thrive in plain sight every day.’ She says that in order to break down the paradigms and myths around child sexual abuse and exploitation, we must normalise the discussion with our children.
“The responsibility of their safety should not lie solely with children, but with everyone.”
ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and Elders past and present.