
Ensuring that we remain at the forefront of the technology and industry developments that impact child sexual exploitation facilitated online is essential to the success of our mission. Our ability to support our collaborative network of partners in their fight to protect children is influenced significantly by the experts who generously share their knowledge and time to help steer our organisation into the future.

We are delighted to announce that we have appointed one of these industry experts, Colm Gannon, as a non-executive director to the ICMEC Australia board. Colm brings extensive experience in digital safety, cybercrime investigations and software development, combined with over 20 years in law enforcement, to his position on our board.

During his time in law enforcement, which began in his homeland of Ireland, Colm was involved in national and international investigations and prosecutions relating to online harms, child sexual abuse and exploitation, violent extremism, and harmful online communications. He is now a Product Manager with Irish-based organisation Rigr AI, where he is responsible for policy development, including child protection, privacy impact assessments, and legal and ethical assessments for Artificial Intelligence and Machine Learning implementations.

Colm’s commitment to protecting children from abuse has seen him represent the New Zealand government as a subject matter expert before the United Nations Committee for the Rights of Children, in addition to training law enforcement, prosecutors and judges on combatting OSEC on behalf of Europol.

Colm’s list of achievements seems endless, and we have already experienced the value he brings to our organisation when he presented on the subject of AI at the June Monthly Brown Bag event, our most popular so far.

With the break-neck speed at which AI and machine learning are advancing and proliferating throughout our online world, we are acutely aware that it is essential to keep a firm eye on developments in this space.
Colm’s expertise and involvement in key development projects in AI and machine learning will provide a window into what might be on the horizon for OSEC. And it will ensure that we keep pace with the additional impacts that these advancements will have on our ability to help organisations detect, report, prosecute and prevent child sexual abuse and exploitation.

Céad míle fáilte Colm.

Ahead of presenting at the COBA Financial Crimes Symposium on 12th July, ICMEC Australia Chair, Kara Nicholls, shares her thoughts about how ESG practices achieve better outcomes when approached from the perspective of those who have the lived experience of the problems that organisations are trying to help solve, especially in the case of child sexual exploitation.

The corporate framework set out for Environment, Social and Governance issues has traditionally been created very much from the perspective of the organisation. One of the primary reasons for this is that mitigation of risk is a major concern for organisations, especially in the world of banking and finance.

Having managed the governance function of a major financial institution following the conclusion of the Banking Royal Commission, I have experienced first-hand the role that risk management plays in organisational processes and priorities in relation to ESG issues.

From a risk and governance perspective, a bank or financial institution needs to have effective and adequate systems and processes for a myriad of matters, including: ensuring that they aren’t party to or facilitating criminal activities; regulatory engagement such as identifying and reporting suspicious transactions to AUSTRAC; and protecting customers’ financial assets and interests. Often, when these processes are considered primarily through a risk mitigation and regulation lens, it is understandable how they may lose sight of the human aspect of why these activities are important, and the impact they have on people and the communities in which they operate.

By flipping the switch on how we consider ESG frameworks and processes, the results can be impactful, and life changing. Approaching these issues with a strong people, community and ESG lens can change how an organisation plans, prepares and responds, leading to significantly better outcomes for customers, the organisation, its communities and the economy.

And nothing illustrates this better than how a financial institution manages the detection and reporting of child sexual exploitation (CSE). CSE facilitated online is one of the fastest growing crimes globally. And like it or not, financial institutions play a vital and pivotal role in protecting children from harm.

With our reliance on technology and access to the internet continuing to increase, the methods and opportunities to harm children are also growing at scale. The traditional public perception of this crime, somewhat perpetuated by the media coverage of the Royal Commission into the Institutional Response to Child Sexual Abuse, is centred on a certain stereotype. However, the victims and perpetrators of this crime come from all walks of life, which means both can be customers of any financial institution.

The changing nature of sexual crimes against children, along with the opportunities for new ways to abuse and exploit children through technology and online connection, has increased the necessity for banks and financial institutions to ensure that they remain vigilant, agile and engaged with the problem and become part of the solution. Crimes such as live streaming of abuse, coerced self-generated child sexual abuse material, child sextortion and more are on the rise. These crimes can all leave a digital financial footprint.

Unlike other financial crimes, such as money laundering, child sexual exploitation is difficult to detect in a financial context. The low-value, often seemingly innocuous transactions are hard to pinpoint and identify.

The consequences of CSE for victims and their families can be tragically swift and devastatingly long lasting, impacting almost every aspect of their lives. This is the unquantifiable cost of CSE to the victim-survivor. There is also the impact on our health and medical systems and the economy.
We need to re-imagine how we consider ESG, as we know that the expectations of our customers and our members and shareholders are high. We need to earn and retain our social licence to operate. It is not only important but crucial that we embed thinking into our ESG approach, systems and processes that puts prevention and the victim-survivor at the centre. In doing so, we strive to meet and exceed our stakeholders’ expectations and to deliver long-term positive social impact for customers and employees.

The true cost and savings of an authentic approach to ESG practices is ultimately a human one. And this is at the heart of every customer-owned banking institution.

Kara Nicholls is the Chair of ICMEC Australia. She has over 27 years’ experience in senior executive positions across ASX listed, private entity, startup and NFP organisations, and has held both corporate and NFP board Chair positions. This article was first published by COBA for their members, and is reproduced with permission.

It seems that we can’t turn on any device, go to any platform or view any news feed without seeing a mention of AI, specifically generative AI. Whether it’s professionals worried about the implications that recently released apps like ChatGPT might have on the security of their jobs, or evangelists espousing the great benefits that companies will reap from this time-saving productivity machine, the debates continue fiercely.

But one area that has far more potential to be impacted, and with much greater and devastating consequences, is the sexual abuse and exploitation of children. For those working in the child sexual exploitation (CSE) response community, the discussion also swings between the concepts of fear about potential negative outcomes, and hope that we could be looking at a powerful tool for those tracking down perpetrators.

There is no question that this technology, which is now easily and often freely available, could be used to create child sexual abuse material (CSAM). The concept of computer-generated CSAM is certainly not new. Children have already been subjected to image-based abuse using deep fake technology, as we saw in our May Monthly Brown Bag presentation. But generative AI technology, now available to anyone, is taking this abuse to a new level. And it has many in the sector worried, especially given the speed with which the technology has been made available, with seemingly no consideration given to what guardrails might need to be put in place.

But we can learn from the past and ensure that we install proper protections for children before it’s too late. The situation with AI has been compared to the introduction and rapid expansion of social media platforms, which were left to their own devices to write the rule book.

Whilst it appears that AI companies like OpenAI have already put measures in place to prevent the creation of CSAM using their technology, open-source platforms have been much slower to consider such protections, if they’ve included them at all. And it’s these platforms that are enabling the production of AI-generated CSAM. These open-source platforms, many based on Meta’s LLaMA model, have strong support from those who claim they will be the vehicle for accelerated innovation. But they also facilitate the easy production of abuse material, using images of real children as source material.

The harms and impact of AI-generated CSAM are significant. The image-based abuse of a child whose picture has been used to create CSAM, the revictimisation of children through additional and progressively more violent computer-generated versions of their abuse, and the potential for the material to trigger an interest in CSAM that escalates to contact offending are all potential issues. The images might be ‘fake’, but the dangers and abuse are real.

Researchers from Thorn and the Stanford Internet Observatory have released a research paper this week examining the potential implications of photorealistic CSAM produced by generative AI. They found that less than 1 per cent of CSAM found in known predatory communities consisted of photorealistic AI-generated images. But in a New York Times article, David Thiel, Chief Technologist at the Stanford Internet Observatory and co-author of the report, said that “within a year, we’re going to be reaching very much a problem state in this area.”

Regulators around the world have acknowledged the potential for harm created by this technology and recognise that we need strong and cohesive regulation worldwide. Several regions, including Europe, the US, Canada and Australia, which is currently conducting community consultation into AI regulation, are at various stages of introducing legislation to regulate AI.

And just as in the physical world, where our law enforcers both administer and are subject to the law, AI regulation will incorporate appropriate protections for citizens as investigators harness the powerful potential of AI tools to help them detect, report and prosecute AI-generated CSAM.

If you’re interested in hearing more about AI and its implications for child sexual abuse, Colm Gannon will expand on these concepts at our June Monthly Brown Bag event. Make sure you register today.

As a relatively new organisation in Australia, with a passionate team bringing expertise from a variety of backgrounds, finding ways that our people can connect fully to our cause and mission is paramount.

The Youth, Technology and Virtual Communities (YTVC) conference run by Queensland Police Service’s Task Force Argos on the Gold Coast in April was the perfect opportunity for us to learn more about how our stakeholders are approaching the challenges they face. As a sponsor of the conference, several of our team were able to join 500 other delegates in learning the latest trends and insights from the Child Sexual Exploitation (CSE) response community, immersing themselves in this difficult but vital subject.

Being surrounded by some of the world’s leading experts in CSE was certainly a valuable and eye-opening education for the team. And one of the great benefits in attending the conference was the opportunity to connect with some of our key sector stakeholders in person, many for the first time.

The three-day event was fast-paced and packed with insightful and informative sessions. The team heard about in-depth case studies that revealed the lengths perpetrators will go to access children and avoid detection, as well as the commitment, out-of-the-box thinking, and skill of the law enforcement officers on their trail. Alongside these case studies were presentations on innovative initiatives and programs to help prevent child sexual abuse, as well as several recent research studies into perpetrator behaviour.

Each of the presenters was concerned with how we disrupt the conditions that make child sexual abuse facilitated online possible. And for many of them, the focus was on shifting the intervention point to disrupt the crime before it takes place.

Archbishop Desmond Tutu is often credited with the quote: “There comes a point where we need to stop just pulling people out of the river. We need to go upstream and find out why they’re falling in.” This sentiment was certainly shared in many of the presentations at YTVC.

The one recurring theme in almost all the presentations was that we can’t tackle this problem alone. Whether it’s law enforcement collaborating across borders to identify and catch perpetrators, LEAs working with financial institutions and the private sector to develop new technological innovations, or academics and regulators collaborating on research data to provide important insights and risk indicators, the key focus was on how much greater the outcomes are when we pool our resources and knowledge to tackle the issue from a multitude of angles.

Which is precisely our purpose as an organisation, to support and facilitate those organisations working to detect, report and prosecute this crime. And to offer our assistance to those organisations who are working upstream towards preventing the crime from occurring in the first place.

Sharing our collaborative experience with some of our stakeholders in our panel session at the event and seeing this theme echoed throughout the conference was both inspiring and affirming for the team.

The internet has revolutionised many aspects of life, including the exponential growth of CSE. Despite this, YTVC demonstrated that so many different organisations and industries within the sector are willing to do what it takes to help end child sexual exploitation.

With the magnitude of this problem, it would be easy to become discouraged. But, as was the message from many conference presenters, focusing on the opportunities for change and maintaining a victim-centric approach to our work helps to keep people in the game and save children from abuse and exploitation.

After all, this is why we’re all here.

Throughout childhood in Australia, more than 1 in 3 girls and almost 1 in 4 boys experience sexual abuse.

The sexual abuse of children is one of the most wicked crimes, exploiting some of the most vulnerable in our society. Our approach to protecting young people and preserving their childhood not only impacts their physical and mental wellbeing, it also determines the quality of their future and how we shape society as a whole into the future.

Since the advent of the internet more than two decades ago, the rate at which children are harmed has been escalating year on year. Each time contact abuse is filmed or photographed, or a child is coerced into self-generating child sexual abuse imagery, the mere existence of these images or videos is re-traumatising for victim survivors. In 2021-22, the ACCCE received more than 36,000 reports of online child sexual exploitation. And each instance of an image or video is evidence of a crime.

Just as we have regulation to protect children in the physical world, we need the same for the online space. Perhaps more so given the ease with which criminals can hide their identities and location when acting online. While children haven’t always been afforded protection, since the adoption of the United Nations Convention on the Rights of the Child in 1989, it is now accepted that children have the right to be protected and live a life free from abuse and neglect.

And, whilst in principle these rights also apply in the online space, regulating the internet presents unique challenges for those with the job of protecting children.

Australia’s eSafety Commissioner is making waves internationally as the first dedicated online safety regulator tackling the sexual exploitation and abuse of children. Our Online Safety Act, enacted in 2021, was momentous legislation, paving the way for an influx of online safety legislation globally that aims to create a safer online environment for children.

Under this legislation, eSafety began the rigorous process of developing a series of Industry Codes for various online platforms and technology companies to ensure that they have sufficient measures to prevent the online sexual harm of children. The platforms and software delivered by these organisations are often manipulated by offenders to harm, abuse, and exploit children. The second round of this process began in February this year, after Commissioner Julie Inman Grant told industry that the draft codes they delivered at the end of 2022 do not provide adequate community safeguards. During this process the community was once again given the opportunity to submit feedback on the proposed Codes prior to industry submitting the next iteration at the end of March 2023.

While we’re waiting for the outcome of the eSafety Commissioner’s review of the draft Industry Codes in the coming months, we’ve been interested to observe the increased legislative activity around the world in relation to the online safety of children. Although each jurisdiction quite rightly remains responsible for its own legislation, the internet, and therefore this crime, crosses borders. Each country needs to regulate based on its own laws, but some coherence and alignment of approach will facilitate a better overall outcome for children.

France has made significant strides in recent months towards implementing strong regulatory measures to protect children from online harms, both in terms of restricting their access to harmful adult content and their ability to have unrestricted access to online social platforms. Last month, French MPs adopted legislation restricting social media access to children under 15 without explicit parental approval. This is a positive step forward, however the Bill still needs to pass the plenary session and the Senate before it can be implemented. Age verification is a measure that continues to be a consideration for both policymakers and internet companies, and France has made their position clear with this legislative change.

Meanwhile in the UK, the proposed Online Safety Bill places a responsibility on all online platforms to protect children from harm and remove child sexual abuse material. If this legislation is passed, platforms will have to proactively prevent that material from reaching users. The UK government has also received an open letter in recent weeks calling into question clauses that would allow Ofcom to compel communications providers to take action to prevent harm to users, specifically in relation to end-to-end encryption of messages. This demonstrates the difficult task that faces regulators in preventing the online abuse of children. And whilst the online platforms aren’t to blame for CSE material travelling around the internet – that responsibility rests squarely with those who commit this crime – they do have an obligation to make their platforms as safe as possible for children.

Regulators like eSafety are well-positioned to provide a clear framework for platforms and organisations to work within. The new Global Online Safety Regulators Network, headed by eSafety, demonstrates the importance of a collaborative response to this crime. The network’s intention is to create a more coherent approach to the problem on a global scale. Currently Australia, Fiji, Ireland and the UK have committed to regulatory collaboration and cohesion, with the aim of expanding the network over time.

Whilst regulatory involvement in the issue of child sexual exploitation facilitated online is essential, the complexity of CSE means that a multi-pronged approach is critical to enhancing the detection, reporting, and prosecution of this crime. Legislation provides a clear framework and sets out the direction and expectations for how the other organisations in the CSE response ecosystem need to prioritise the safety of children.

We know that offenders communicate and collaborate in forums and communities, helping each other elude detection by sharing information and tips. A collaborative response underpinned by strong and clear legislation is our best chance of disrupting the perpetrator networks.

And we need collaboration at all levels, including the development of strong regulatory guidelines. Each contributor to the CSE response ecosystem brings their different perspective and experience, which is vital to the consistent review and stakeholder input needed. This ensures that the regulation reflects what’s needed to make the online world safe for children.

We’re eager to see the outcome of the eSafety Commissioner’s review of the Codes and the potential impacts when they are applied at a practical level. The collaborative process in developing them has been important to bring all the parties together in a robust discussion that has ultimately brought the issue of online child safety to the fore.

We’re excited to announce one of Australia’s corporate governance leaders, Kara Nicholls, as the new Non-Executive Chair of ICMEC Australia.

Over the last nine months our team has almost quadrupled in size to add the essential specialised skills needed to deliver on the next phase of our important mission, shifting gears to embed the organisation as a key player within the child sexual exploitation (CSE) disruption ecosystem.

Reaching this growth milestone and establishing the foundations for achieving significant and impactful partnership and data collaboration targets was the goal of founding CEO and Executive Chair, Paul McCarney. And this has been the catalyst in his handing over the board reins to Kara.

“I’ve been involved in founding over 10 companies, but this four-year journey has made me the proudest of what has been created in such a short space of time,” said Paul. “It excites me and fills my heart to think of the impact that the organisation will have in the future.”

“I’d like to thank Paul for his vision and the legacy he’s created for the team to contribute to such an important mission,” said Anna Bowden, ICMEC Australia CEO.

Kara brings deep governance skills and previous experience as an NFP Chair to ensuring that we continue to meet the targets of the next phase of our journey.

Her career spans over 27 years in senior executive roles in global equity capital markets, commercial, regulatory, and corporate governance across ASX-listed, private entity, startup and NFP organisations.

She has also previously held the positions of Chair and Independent Non-Executive Director of Gidget Foundation Australia, a non-profit organisation supporting the emotional wellbeing of expectant and new parents, Chair of its Nominations Committee, and Member of the Department of Accounting and Corporate Governance Advisory Board at Macquarie University.
She is currently a Non-Executive Director of Zoom2u Technologies Limited, Chair of their Audit & Risk Committee, and a member of their Sustainability Committee. In addition, she is the inaugural independent member of the Australian Medical Association (NSW)’s Audit & Risk Committee and a Non-Executive Director of social enterprise organisation, Ripple Learning Limited.

“Kara’s specialist expertise in governance and risk, combined with her senior level management and executive skills will be invaluable to steering us through this next pivotal stage of the organisation’s evolution, and will assist the Board in guiding us to deliver on our mission to enhance detection, reporting and prosecution of child sexual exploitation,” said Anna.

“I am delighted to be joining the board of ICMEC Australia. It’s a privilege and honour to serve as Chair of the Board. I look forward to working with the management team, my fellow directors, our collaboration partners, and our stakeholders to deliver on our strategy. I’m committed to supporting the organisation’s vision and mission, to collectively make a difference by facilitating meaningful change through collaboration and data driven innovation, disrupting perpetrator tactics, and working to safeguard vulnerable children.”

We are pleased to announce RedCompass Labs as the first of several recipients of our Child Protection Fund (CPF) grants.

Launched in July 2022, we established the CPF to support innovative data-driven approaches that help to reduce and prevent child sexual exploitation (CSE).

RedCompass Labs, a UK-based payments expert services company, specialises in the disruption of financial crime through investigation and analytics services. Their successful proposal to the Fund aims to provide Australian banks and other financial institutions with access to cutting-edge risk management insights relating to CSE. Using big data analysis, the online pilot, the Australian RedFlag Accelerator CSE Portal, will provide Australian financial institutions with additional localised intelligence to assist their work in the detection and reporting of CSE. Developing these localised insights and logic enhances the ability of financial institutions to detect crimes against children.

Paul Jevtovic, Chief Financial Crime Risk Officer & Executive MLRO, NAB says: “Keeping customers safe and criminals out of the financial system is a top priority for NAB and we are delivering data-driven approaches to combat financial crimes. The Child Protection Fund investment in the Australian RedFlags and Portal is a positive step to support critical efforts to combat exploitation. The project should further support the sharing of best-practice across the financial services sector.”

We first introduced the work of RedCompass Labs to Australian financial institutions’ financial crime units in 2022.

“We received a lot of interest from our financial institution partners in the type of data that RedCompass Labs gathers and analyses,” says Anna Bowden, CEO ICMEC Australia.

“This encouraged RedCompass Labs to apply for our funding to be able to provide Australian-specific data and typologies for the first time to our financial institutions via an online portal.”

Our funding approach is catalytic and outcomes-focused. Funding strategies include grants, impact investing and, as in the case of RedCompass Labs, venture philanthropy. Essentially operating as a pre-seed investment vehicle, the CPF identifies innovations with significant potential for impact that need initial support to deliver a pilot or proof of concept.

The incubator-style fund combines financial with non-financial support through our Partnerships and Data Products teams. This facilitates collaboration and product adoption by the organisation’s stakeholders and promotes the sustainability and scalability of these disruptive innovations.

“By covering the initial risk capital, we have facilitated access to multiple financial services companies – removing commercial limitations and significantly increasing the impact of the program.”

We have already established support from four of our partner Australian financial institutions, including three of the Big Four banks, to work closely with RedCompass Labs as they design and build the CSE portal.

“This collaboration will ensure that the project will deliver the data in accessible and usable formats and enhance bank processes when detecting this crime type within their transactions.”

The pilot with RedCompass Labs is one of several initiatives that we will facilitate in partnership with the major financial services institutions across Australia. Additional projects will include leveraging and collating data from a variety of internet sources, sponsoring the creation of a global best practices implementation manual for CSE investigation, and ongoing initiatives such as the Collaboration Working Group, which brings together senior professionals in financial crime, law enforcement, regulation and child protection NGOs.

“This funding opportunity and partnership with ICMEC Australia is an incredible boost to our mission to disrupt child sexual exploitation. We look forward to leveraging our deep payments and data science expertise and working alongside ICMEC Australia and Australian banks to protect children from exploitation”, said Jonathan Bell, Partner and President, RedCompass Labs.

We will measure and review the successes of this pilot project with RedCompass Labs throughout the pilot phase. The results will help determine further opportunities to scale up portal access to more Australian financial institutions, with the ultimate aim of rolling out availability to any financial institution in the Asia-Pacific region.

If you would like to know more about the Child Protection Fund please send an email to our Head of Impact, Tiphanie Au at tau@icmec.org.au.

As a community we can all play a critical role in protecting children from harm. Whether you’re a frontline law enforcement officer, a parent, a teacher or simply a social media user, changing the public conversation surrounding the sexual abuse and exploitation of children is critical to reaching better outcomes for children. Using consistent and accurate terminology helps to generate awareness of current threats and issues, and can engage more people in taking action.

The news media is a critical platform for raising public awareness, communicating messages, shaping perceptions and breaking taboos about child sexual abuse and exploitation. Media professionals have considerable power and responsibilities through their storytelling role, from the way that they frame stories to the language and terminology they use. These stories and the wider public discourse that they generate can lead to a positive or negative impact on children and victim survivors of sexual abuse. Choosing the words we use in these conversations is an exercise in care and consideration.

In February Dr Kerry McCallum presented an outline of her research, conducted with her team at the News and Media Research Centre at the University of Canberra, to the attendees of our inaugural Monthly Brown Bag session. Through this research they are pioneering the development of media guidelines for the reporting of child sexual abuse. Inspired by the production of guides on the reporting of suicide and mental ill-health, which were critical in shifting the nature of the conversation around those issues, Dr McCallum’s team is working towards a set of guidelines that they hope will shape a new public conversation on child sexual abuse and exploitation.

Early results from their current research indicate that the biases towards institutional and celebrity involvement uncovered in previous research are still prevalent in media reporting of CSE, with the perpetrator the main subject of the vast majority of stories. Victims and survivors were the focus in only 14% of the articles Dr McCallum’s team analysed.

Changing the way that media stories are framed, centring victim survivors, broadening the topics covered and raising awareness through the stories of real people, can significantly shift public perception of child sexual abuse and exploitation, and ultimately save more children from harm.

If you weren’t able to attend Dr McCallum’s presentation, the recording is available to members of the ICMEC Australia Member Portal, managed by our Partnerships Team. Dr McCallum has also taken part in an interview with our team to explore the concepts of her work more deeply and discuss how we can contribute as a response community to creating change. The resulting article will be published in our Member Portal shortly.

If you’re a member of the child sexual exploitation response ecosystem and you’re not yet a Member of the Portal, you can apply to register here. Once approved, you’ll gain access to exclusive content, like the interview with Dr McCallum, to help in the fight against CSE. Being a member of the Portal also gives you the opportunity to connect with cross-sector participants to facilitate collaborative solutions to this wicked problem.

One of the key messages to emerge from our UCD research last year was that the perpetrators of child sexual exploitation (CSE) are networked and collaborative. Our response to combating this heinous crime needs to be equally collaborative. As our mission is to facilitate cross-industry collaboration and support the implementation of data-collaborative solutions that can enhance the detection, reporting, and prosecution of CSE, we needed a tool that could enable continuous connection between stakeholders.

After our beta testing phase in 2022, we are pleased to announce that the Production Edition of our Member Portal is now live! The portal is a collaborative hub for all our partners working tirelessly to fight against child sexual exploitation and abuse. Whether this is across financial services, law enforcement, regulators, academia, online platforms, or the not-for-profit sector, our Member Portal is an online space designed to help cross-industry professionals to connect.

We provide a multi-layered approach to collaboration within the CSE response ecosystem through events, the Collaboration Working Group and initiatives such as the APAC Financial Coalition Forum, in conjunction with our global colleagues. Our Member Portal is an expansion upon these more formal and infrequent methods of collaboration. It enables stakeholders to continue conversations, follow up on information, and connect and share at any time. And it helps those working to end this crime to be as networked as the perpetrators.

The Member Portal is expertly convened by our Community Manager, Francesca Funayama, who is on hand via the portal during business hours to assist with any queries about content, industry news or facilitating connections with other members of the community. Or, you can contact her via email at members@icmec.org.au.

By joining our Member Portal, you’ll gain access to exclusive ICMEC Australia content, curated news, a member directory to connect with other individuals working in the CSE disruption space, and a variety of expert groups where you can share information with like-minded members. And we are already receiving positive feedback from our members.

“I joined this community because I have been impressed with the work that ICMEC do including, and particularly, the advocacy work in this space,” said Julie Green, Research Assistant, Dept. Social Work, University of Melbourne.

“I really, really, really appreciate all the posts and links you have been sharing with the group. Thank you for keeping us all up to speed with important things going on.”

If you’re engaged in the fight against child sexual exploitation, we invite you to join our member portal.

Register here: https://members.icmec.org.au/

It’s time we started having uncomfortable conversations about one of the fastest-growing crimes, one that disproportionately affects girls and leaves a lasting impact on their lives.

Child sexual exploitation is a subject that most people don’t want to talk about, especially in public forums. Much like our aversion to discussing domestic and family violence before Rosie Batty bravely shared her pain to put the issue onto the public agenda almost ten years ago, it’s a topic that is difficult and fraught with misconception.

It’s estimated that one in five Australian girls has experienced sexual violence by the age of 15. And with several studies indicating that the rate of disclosure could be as low as 25%, it’s likely that many more are affected.

It’s an age-old problem, but the advent of the internet has escalated the issue, as technology has advanced and perpetrators have gained easier access to networks for sharing material on both the clear and dark webs. It has also created new crime types, new ways to generate child sexual abuse material (CSAM) and the ability for perpetrators to groom large numbers of victims more freely.

The week of International Women’s Day is a fitting time to acknowledge four Australian women who are putting child sexual abuse on the agenda and leading organisations on a mission to put an end to the sexual exploitation of children in its many forms.

Julie Inman Grant – Australian eSafety Commissioner

Julie Inman Grant is Australia’s eSafety Commissioner, leading the world’s first government regulator dedicated to keeping its citizens safer online. In this role Julie has launched the global Safety by Design initiative and led work to stand up novel and world-first regulatory regimes under the new Online Safety Act 2021, implementing a sweeping new set of online safety reforms. Julie spent two decades working in senior public policy and safety roles in the tech industry.

“We need online platforms to make it much harder for predators to view, produce, share and sell child exploitation material. While the exploitation of girls is an age-old issue, the demand for this type of material has been supercharged by the internet. The overwhelming majority of this deeply harmful and traumatising online material, 96 per cent of it, features girls.

“One of the biggest barriers we face is a lack of information about how far and wide this material is spreading. Our recent world-first transparency report revealed that industry is only tinkering at the edges of this horrific problem.

“And if we don’t know the extent of the harm, how can we possibly be taking the necessary steps to eradicate it?

“It’s time to stop ignoring the fact that a crime of horrendous proportions is happening on the platforms we all use every day.”

Anna Bowden – CEO, ICMEC Australia

Anna Bowden’s extensive experience in impact investing, philanthropy and impact strategy is critical to her role as CEO of ICMEC Australia. Having worked across government, social impact organisations, foundations and consulting, Anna provides a deep understanding of outcomes-focused programs. With a very personal connection to the vital work of countering child sexual exploitation, Anna is now leading ICMEC Australia’s programs to implement innovative digital projects that help detect and report the online traces left by perpetrators of child sexual exploitation.

“We need all-in, whole-of-society partnerships to protect girls from the dangers that face them. For too long, we’ve shied away from this topic. It’s too horrendous, too awful, too painful to think about. That’s not helpful. We all need to fight this together. Victims, families, the private sector, government, society.

“I have two young girls, and I want them to grow up into a world which is much safer than it is today. I want them to know we all collaborated to create that world for them.”

“Perpetrators of this crime are networked. They collaborate and they share information, not just materials, but how to avoid detection, how to meet children. They collaborate and move quickly. We need to enable those fighting the crime to collaborate and share information as easily as the perpetrators do.

“We need a much greater focus on prevention, but we also need to be thinking about how we build the systems that will prevent this in two, three, five years, because the technology is changing so rapidly.”

Dr Leanne Beagley – CEO, The National Centre for Action on Child Sexual Abuse

Dr Leanne Beagley is a senior leader within child and family mental health, combining clinical experience with extensive contributions to policy change through roles within the Victorian Government and at a national level with health and advocacy organisations.

As the CEO of the National Centre for Action on Child Sexual Abuse (the National Centre), Leanne leverages her career passion for establishing a safe future for Australian children to drive real change. In her current role, she is doing this by crafting a response to the issue of child sexual exploitation based on the lived experience of victims and survivors of all ages.

“To create a safer future for girls, it is critical that relationships with parents, teachers and other trusted adults encourage sharing, education about personal boundaries and understanding of their own rights with regards to their bodies.”

“To avoid exploitation, it is integral that girls witness respect by other adults for the women in their lives in all settings and contexts.”

“Early discussions about online safety are also essential. One of the key issues preventing us from keeping girls safe is a lack of understanding and education around online grooming and how to prevent it, amongst parents, teachers and the children themselves. We need to understand how this happens by learning from those who have experienced it and then work together to combat the significant harm that it represents.

“We live in a time where children have easy access to inappropriate content, which serves to normalise certain behaviours and puts them more at risk of being groomed. We need to find ways to prevent this access and offer advice to parents and caregivers on managing these risks.”

Alison Geale – CEO, Bravehearts

Alison Geale is committed to innovation and collaboration, with a focus on strengthening sector partnerships, and believes that all organisations working together will create a safer world for children. She has been the CEO of Bravehearts since 2019 and is an experienced leadership professional with over 20 years of experience in Australian media. Ensuring children are safe from sexual abuse fuels the everyday work and strategic direction of Bravehearts, which has a relentless commitment to the prevention and treatment of child sexual abuse.

“Girls and young women remain one of the most heavily marketed to cohorts. Often the messages they receive reinforce unhealthy stereotypes and aspirations for young girls on many levels, including sex. Young girls are so much more than what they look like, or how sexy they are.

“Advertisers, marketers, social media, and those with influence need to be committed to addressing the imbalance of how girls are portrayed, valued, and marketed to. Adults need to understand the scale and depth of the problem, so we need education for parents, carers, teachers to enable them to skill-up to the level of their children. We can’t just leave the education and protection of girls to themselves.”

“We create barriers to improve in this space through the unhealthy norms displayed and aspired to across digital platforms and social media that sadly have led to increased vulnerability and decreased resilience in young girls. Big tech must act to protect them, it’s their backyard.

“As girls continue to be sold unrealistic and unhealthy benchmarks in their formative years on beauty, health, and sex, they become more vulnerable to exploitation. The shame and stigma of this crime, which can keep young girls in exploitative situations, needs to be removed. Further, when women share their lived experiences of this crime, it helps provide preventative education which can protect young girls today.”

This International Women’s Day let’s start having the conversations that need to happen. Let’s embrace the innovation and technological change that’s necessary. And let’s all be part of the education that’s needed around this subject to keep girls safe from harm and help them achieve lives of equality and empowerment.

Subscribe to the ICMEC Australia newsletter

Stay up to date with the latest news, information and activities.

ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islanders, and Elders past and present.

Copyright © 2024. All Rights Reserved. |  Logo by Blisttech. Design by Insil.