Ahead of presenting at the COBA Financial Crimes Symposium on 12th July, ICMEC Australia Chair, Kara Nicholls, shares her thoughts about how ESG practices achieve better outcomes when approached from the perspective of those who have the lived experience of the problems that organisations are trying to help solve, especially in the case of child sexual exploitation.
The corporate framework for Environmental, Social and Governance (ESG) issues has traditionally been created very much from the perspective of the organisation. One of the primary reasons for this is that mitigating risk is a major concern for organisations, especially in the world of banking and finance.
Having managed the governance function of a major financial institution following the conclusion of the Banking Royal Commission, I have experienced first-hand the role that risk management plays in organisational processes and priorities in relation to ESG issues.
From a risk and governance perspective, a bank or financial institution needs effective and adequate systems and processes for a myriad of matters, including: ensuring that it isn’t party to or facilitating criminal activities; regulatory engagement, such as identifying and reporting suspicious transactions to AUSTRAC; and protecting customers’ financial assets and interests. When these processes are considered primarily through a risk mitigation and regulation lens, it is understandable how organisations may lose sight of the human reasons these activities are important, and of the impact they have on people and the communities in which they operate.
By flipping the script on how we consider ESG frameworks and processes, the results can be impactful, even life-changing. Approaching these issues with a strong people, community and ESG lens can change how an organisation plans, prepares and responds, leading to significantly better outcomes for customers, the organisation, its communities and the economy.
And nothing illustrates this better than how a financial institution manages the detection and reporting of child sexual exploitation (CSE). Online-facilitated CSE is one of the fastest-growing crimes globally. And, like it or not, financial institutions play a pivotal role in protecting children from harm.
With our reliance on technology and access to the internet continuing to increase, the methods and opportunities to harm children are also growing at scale. The traditional public perception of this crime, somewhat perpetuated by media coverage of the Royal Commission into Institutional Responses to Child Sexual Abuse, is centred on a certain stereotype. However, the victims and perpetrators of this crime come from all walks of life, which means both can be customers of any financial institution.
The changing nature of sexual crimes against children, along with the opportunities for new ways to abuse and exploit children through technology and online connection, has increased the necessity for banks and financial institutions to ensure that they remain vigilant, agile and engaged with the problem and become part of the solution. Crimes such as live streaming of abuse, coerced self-generated child sexual abuse material, child sextortion and more are on the rise. These crimes can all leave a digital financial footprint.
Unlike other financial crimes such as money laundering, child sexual exploitation is difficult to detect in a financial context. The transactions involved are often low-value and seemingly innocuous, making them hard to pinpoint and identify.
The consequences of CSE for victims and their families can be tragically swift and devastatingly long-lasting, impacting almost every aspect of their lives. This is the unquantifiable cost of CSE to the victim-survivor. There is also the impact on our health and medical systems and on the economy.
We need to re-imagine how we consider ESG, because we know that the expectations of our customers and our members/shareholders are high. We need to earn and retain our social licence to operate. It is not only important but crucial that we embed thinking that puts prevention and the victim-survivor at the centre of our ESG approach, systems and processes. In doing so, we strive to meet and exceed our stakeholders’ expectations and to deliver long-term positive social impact for customers and employees.
The true cost and savings of an authentic approach to ESG practices is ultimately a human one. And this is at the heart of every customer-owned banking institution.
Kara Nicholls is the Chair of ICMEC Australia. She has over 27 years’ experience in senior executive positions across ASX listed, private entity, startup and NFP organisations, and has held both corporate and NFP board Chair positions. This article was first published by COBA for their members, and is reproduced with permission.
In the lead-up to the FIFA Women’s World Cup, It’s A Penalty launched their 2023 #KeepKidsSafe Campaign to help raise awareness of child abuse and exploitation, especially in a sporting context.
In support of this campaign, ICMEC Australia hosted the It’s a Penalty #KeepKidsSafe Launch event on 6th July 2023. The event featured an expert panel discussing the challenges our society faces in navigating our response to child sexual exploitation online, in businesses and in our communities.
Our special guest speakers discussed how this issue is prevalent in so many different areas of our lives, and what we can all do to help keep children safe. Find the recording of the event below.
Visit the It’s A Penalty website to find out more about their meaningful work and the #KeepKidsSafe campaign.
It seems that we can’t turn on any device, go to any platform or view any news feed without seeing a mention of AI, specifically generative AI. Whether it’s professionals worried about the implications that recently released apps like ChatGPT might have on the security of their jobs, or evangelists espousing the great benefits that companies will reap from this time-saving productivity machine, the debates continue fiercely.
But one area that has far more potential to be impacted, and with much greater and devastating consequences, is the sexual abuse and exploitation of children. For those working in the child sexual exploitation (CSE) response community, the discussion also swings between the concepts of fear about potential negative outcomes, and hope that we could be looking at a powerful tool for those tracking down perpetrators.
There is no question that this technology, which is now easily and often freely available, could be used to create child sexual abuse material (CSAM). The concept of computer-generated CSAM is certainly not new. Children have already been subjected to image-based abuse using deepfake technology, as we saw in our May Monthly Brown Bag presentation. But generative AI technology, now available to anyone, is taking this abuse to a new level. And it has many in the sector worried, especially by the speed with which the technology has been made available, with seemingly no consideration given to what guardrails we might need to put in place.
But we can learn from the past and ensure that we install proper protections for children before it’s too late. The situation with AI has been compared to the introduction and rapid expansion of social media platforms, which were left to their own devices to write the rule book.
Whilst it appears that AI companies like OpenAI have already put measures in place to prevent the creation of CSAM using their technology, open-source platforms have been much slower to consider such protections, if they’ve included them at all. And it’s these platforms that are enabling the production of AI-generated CSAM. These open-source platforms, many based on a model created by Meta called LLaMA, have strong support from those who claim they will be the vehicle for accelerated innovation. But they also facilitate the easy production of abuse material, using images of real children as source material.
The harms and impact of AI-generated CSAM are significant. The image-based abuse of a child whose picture has been used to create CSAM, the revictimisation of children through the production of additional and progressively more violent computer-generated versions of their abuse, and the potential for the material to trigger an interest in CSAM that escalates to contact offending are all potential harms. The images might be ‘fake’, but the dangers and abuse are real.
Researchers from Thorn and the Stanford Internet Observatory released a research paper this week examining the potential implications of photorealistic CSAM produced by generative AI. They found that less than 1 percent of CSAM found in known predatory communities consisted of photorealistic AI-generated images. But in a New York Times article, David Thiel, Chief Technologist at the Stanford Internet Observatory and co-author of the report, said that “within a year, we’re going to be reaching very much a problem state in this area.”
Regulators around the world have acknowledged the potential for harm created by this technology and recognise that we need strong and cohesive regulation worldwide. Several jurisdictions, including Europe, the US, Canada and Australia, which is currently conducting community consultation on AI regulation, are at various stages of introducing legislation to regulate AI.
And just as in the physical world, where our law enforcers both administer and are subject to the law, AI regulation will incorporate appropriate protections for citizens as investigators harness the powerful potential of AI tools to help them detect, report and prosecute AI-generated CSAM.
If you’re interested in hearing more about AI and its implications for child sexual abuse, Colm Gannon will expand on these concepts at our June Monthly Brown Bag event. Make sure you register today.
Our June 2023 Brown Bag event covered the key topic of artificial intelligence (AI), which is currently in the spotlight due to its wide application across aspects of life such as business, education and law enforcement.
Our presenter, Colm Gannon, discussed the regulatory landscape and the risks and potential harms of AI, especially in the case of child sexual exploitation (CSE) and child sexual abuse material (CSAM). Colm is an AI/machine learning expert working as a Product Manager with Irish technology company Rigr AI. He has over 20 years’ experience in law enforcement, and has been involved in national and international investigations and prosecutions relating to online harms, child sexual abuse and exploitation, violent extremism, and harmful online communications.
Colm’s presentation examined the impacts that AI is having on the generation of CSAM, and the ways that AI can be positively, safely and successfully applied by law enforcement and the financial services industry in the detection, reporting and prosecution of CSE.
View the recording below.
Our May 2023 Brown Bag event discussed the importance of preventing image-based sexual violence and the impact this issue has on children and young people. With a focus on the extent of harms victim-survivors of this extreme form of abuse endure, this session featured key takeaways from survivors and advocates on the path towards justice and healing. As the ways we are able to connect online expand, so do the risks to children and young people.
The session brought a global perspective to inform our local response, and shared how different organisations, companies, and individuals working in this space can collaborate to move toward the greater goal of protecting children from this online harm that creates real-life, devastating consequences.
Andrea Powell, Director of the Image-based Sexual Abuse Initiative at Panorama Global, shared her wealth of knowledge from working extensively with victim-survivors of sexual violence. Her presentation set out her vision of a world where children and young people are free from the enduring trauma that results from image-based abuse and other types of online harm.
View the recording of this informative session below.
As a relatively new organisation in Australia, with a passionate team bringing expertise from a variety of backgrounds, finding ways that our people can connect fully to our cause and mission is paramount.
The Youth, Technology and Virtual Communities (YTVC) conference, run by Queensland Police Service’s Task Force Argos on the Gold Coast in April, was the perfect opportunity for us to learn more about how our stakeholders are approaching the challenges they face. As a sponsor of the conference, several of our team were able to join 500 other delegates in learning about the latest trends and insights from the child sexual exploitation (CSE) response community, immersing themselves in this difficult but vital subject.
Being surrounded by some of the world’s leading experts in CSE was certainly a valuable and eye-opening education for the team. And one of the great benefits in attending the conference was the opportunity to connect with some of our key sector stakeholders in person, many for the first time.
The three-day event was fast-paced and packed with insightful and informative sessions. The team heard about in-depth case studies that revealed the lengths perpetrators will go to access children and avoid detection, as well as the commitment, out-of-the-box thinking, and skill of the law enforcement officers on their trail. Alongside these case studies were presentations on innovative initiatives and programs to help prevent child sexual abuse, as well as several recent research studies into perpetrator behaviour.
Each of the presenters was concerned with how we disrupt the conditions that make child sexual abuse facilitated online possible. And for many of the presenters, the focus was on shifting the intervention point to disrupting the crime before it takes place.
Archbishop Desmond Tutu is often credited with the observation: “There comes a point where we need to stop just pulling people out of the river. We need to go upstream and find out why they’re falling in.” This sentiment was certainly shared in many of the presentations at YTVC.
The one recurring theme in almost all the presentations was that we can’t tackle this problem alone. Whether it’s law enforcement collaborating across borders to identify and catch perpetrators, law enforcement agencies working with financial institutions and the private sector to develop new technological innovations, or academics and regulators collaborating on research data to provide important insights and risk indicators, the key focus was on how much greater the outcomes are when we pool our resources and knowledge to tackle the issue from a multitude of angles.
This is precisely our purpose as an organisation: to support and facilitate the organisations working to detect, report and prosecute this crime, and to offer our assistance to those working upstream to prevent the crime from occurring in the first place.
Sharing our collaborative experience with some of our stakeholders in our panel session at the event and seeing this theme echoed throughout the conference was both inspiring and affirming for the team.
The internet has revolutionised many aspects of life, but it has also enabled the exponential growth of CSE. Despite this, YTVC demonstrated that so many different organisations and industries within the sector are willing to do what it takes to help end child sexual exploitation.
With the magnitude of this problem, it would be easy to become discouraged. But, as was the message from many conference presenters, focusing on the opportunities for change and maintaining a victim-centric approach to our work helps to keep people in the game and save children from abuse and exploitation.
After all, this is why we’re all here.
In Australia, more than 1 in 3 girls and almost 1 in 4 boys experience sexual abuse during childhood.
The sexual abuse of children is one of the most wicked crimes, exploiting some of the most vulnerable in our society. Our approach to protecting young people and preserving their childhood not only impacts their physical and mental wellbeing, it also determines the quality of their future and how we shape society as a whole.
Since the advent of the internet, the rate at which children are harmed has been escalating year on year. Each time contact abuse is filmed or photographed, or a child is coerced into self-generating child sexual abuse imagery, the mere existence of these images or videos is re-traumatising for victim-survivors. In 2021–22, the Australian Centre to Counter Child Exploitation (ACCCE) received more than 36,000 reports of online child sexual exploitation. And each instance of an image or video is evidence of a crime.
Just as we have regulation to protect children in the physical world, we need the same for the online space. Perhaps more so, given the ease with which criminals can hide their identities and location when acting online. Children haven’t always been afforded such protection, but since the adoption of the United Nations Convention on the Rights of the Child in 1989, it has been accepted that children have the right to be protected and to live a life free from abuse and neglect.
And, whilst in principle these rights also apply in the online space, regulating the internet presents unique challenges for those with the job of protecting children.
Australia’s eSafety Commissioner is making waves internationally as the first dedicated online safety regulator tackling the sexual exploitation and abuse of children. Our Online Safety Act, enacted in 2021, was momentous legislation, paving the way for an influx of online safety legislation globally that aims to create a safer online environment for children.
Under this legislation, eSafety began the rigorous process of developing a series of Industry Codes for various online platforms and technology companies, to ensure that they have sufficient measures in place to prevent online sexual harm to children. The platforms and software delivered by these organisations are often manipulated by offenders to harm, abuse and exploit children. The second round of this process began in February this year, after Commissioner Julie Inman Grant told industry that the draft codes they delivered at the end of 2022 did not provide adequate community safeguards. During this process, the community was once again given the opportunity to submit feedback on the proposed Codes before industry submitted the next iteration at the end of March 2023.
While we’re waiting for the outcome of the eSafety Commissioner’s review of the draft Industry Codes in the coming months, we’ve been interested to observe the increased legislative activity around the world in relation to the online safety of children. Although each jurisdiction quite rightly remains responsible for its own legislation, the internet, and therefore this crime, crosses borders. Each country needs to regulate based on its own laws, but some coherence and alignment of approach will deliver a better overall outcome for children.
France has made significant strides in recent months towards implementing strong regulatory measures to protect children from online harms, both by restricting their access to harmful adult content and by limiting their unrestricted access to online social platforms. Last month, French MPs adopted legislation requiring children under 15 to obtain explicit parental approval before accessing social media. This is a positive step forward; however, the Bill still needs to pass the plenary session and the Senate before it can be implemented. Age verification continues to be a consideration for both policymakers and internet companies, and France has made its position clear with this legislative change.
Meanwhile in the UK, the proposed Online Safety Bill places a responsibility on all online platforms to protect children from harm and remove child sexual abuse material. If this legislation is passed, platforms will have to proactively prevent that material from reaching users. The UK government has also received an open letter in recent weeks calling into question clauses that would allow Ofcom to compel communications providers to take action to prevent harm to users, specifically in relation to end-to-end encryption of messages. This demonstrates the difficult task that faces regulators in preventing the online abuse of children. And whilst the online platforms aren’t to blame for CSE material travelling around the internet – that responsibility rests squarely with those who commit this crime – they do have an obligation to make their platforms as safe as possible for children.
Regulators like eSafety are well positioned to provide a clear framework for platforms and organisations to work within. The new Global Online Safety Regulators Network, headed by eSafety, demonstrates the importance of a collaborative response to this crime. The network’s intention is to create a more coherent approach to the problem on a global scale. Currently Australia, Fiji, Ireland and the UK have committed to regulatory collaboration and cohesion, with the aim of growing the network over time.
Whilst regulatory involvement in the issue of child sexual exploitation facilitated online is essential, the complexities of CSE mean that a multi-pronged approach is critical to enhancing the detection, reporting and prosecution of this crime. Legislation provides a clear framework and sets out the direction and expectations for how the other organisations in the CSE response ecosystem need to prioritise the safety of children.
We know that offenders communicate and collaborate in forums and communities, helping each other elude detection by sharing information and tips. A collaborative response underpinned by strong and clear legislation is our best chance of disrupting the perpetrator networks.
And we need collaboration at all levels, including in the development of strong regulatory guidelines. Each contributor to the CSE response ecosystem brings a different perspective and experience, providing the consistent review and stakeholder input needed to ensure that regulation reflects what is required to make the online world safe for children.
We’re eager to see the outcome of the eSafety Commissioner’s review of the Codes and the potential impacts when they are applied at a practical level. The collaborative process in developing them has been important to bring all the parties together in a robust discussion that has ultimately brought the issue of online child safety to the fore.
We’re excited to announce one of Australia’s corporate governance leaders, Kara Nicholls, as the new Non-Executive Chair of ICMEC Australia.
Over the last nine months our team has almost quadrupled in size to add the essential specialised skills needed to deliver on the next phase of our important mission, shifting gears to embed the organisation as a key player within the child sexual exploitation (CSE) disruption ecosystem.
Reaching this growth milestone and establishing the foundations for achieving significant and impactful partnership and data collaboration targets was the goal of founding CEO and Executive Chair Paul McCarney, and it has been the catalyst for his handing the board reins to Kara.
“I’ve been involved in founding over 10 companies, but this four-year journey has made me the proudest of what has been created in such a short space of time,” said Paul. “It excites me and fills my heart to think of the impact that the organisation will have in the future.”
“I’d like to thank Paul for his vision and the legacy he’s created for the team to contribute to such an important mission,” said Anna Bowden, ICMEC Australia CEO.
Kara brings deep governance skills and previous experience as an NFP Chair, helping to ensure that we continue to meet the targets of the next phase of our journey.
Her career spans over 27 years in senior executive roles in global equity capital markets, commercial, regulatory and corporate governance functions across ASX-listed, private, startup and NFP organisations.
She has also previously held the positions of Chair and Independent Non-Executive Director of Gidget Foundation Australia, a non-profit organisation supporting the emotional wellbeing of expectant and new parents; Chair of the Nominations Committee; and Member of the Department of Accounting and Corporate Governance Advisory Board at Macquarie University.
She is currently a Non-Executive Director of Zoom2u Technologies Limited, Chair of their Audit & Risk Committee, and a member of their Sustainability Committee. In addition, she is the inaugural independent member of the Australian Medical Association (NSW)’s Audit & Risk Committee and a Non-Executive Director of social enterprise organisation, Ripple Learning Limited.
“Kara’s specialist expertise in governance and risk, combined with her senior level management and executive skills will be invaluable to steering us through this next pivotal stage of the organisation’s evolution, and will assist the Board in guiding us to deliver on our mission to enhance detection, reporting and prosecution of child sexual exploitation,” said Anna.
“I am delighted to be joining the board of ICMEC Australia. It’s a privilege and honour to serve as Chair of the Board. I look forward to working with the management team, my fellow directors, our collaboration partners, and our stakeholders to deliver on our strategy. I’m committed to supporting the organisation’s vision and mission, to collectively make a difference by facilitating meaningful change through collaboration and data driven innovation, disrupting perpetrator tactics, and working to safeguard vulnerable children.”
After her informative presentation at our first Monthly Brown Bag session, we sat down with Dr Kerry McCallum to expand on her research into media reporting on child sexual abuse.
This expert interview paper takes a deeper dive into the influence that media discourse can have on public perception and awareness of child sexual abuse and exploitation, and how it can affect victim-survivors.
Since our event in February, Dr McCallum and her team at the News and Media Research Centre at the University of Canberra have published the Media Guidelines for reporting on child sexual abuse, commissioned by the National Office for Child Safety (NOCS), which you can find here.
Download a copy of this examination into news reporting, public perception, awareness, and how the response community can contribute to better outcomes for children.
We are pleased to announce RedCompass Labs as the first of several recipients of our Child Protection Fund (CPF) grants.
Launched in July 2022, we established the CPF to support innovative data-driven approaches that help to reduce and prevent child sexual exploitation (CSE).
UK-based payments expert services company RedCompass Labs specialises in the disruption of financial crime through investigation and analytics services. Their successful proposal to the Fund aims to provide Australian banks and other financial institutions with access to cutting-edge risk management insights relating to CSE. Using big data analysis, the online pilot, the Australian RedFlag Accelerator CSE Portal, will provide Australian financial institutions with additional localised intelligence to assist their work in detecting and reporting CSE. Developing these localised insights and logic enhances financial institutions’ ability to detect crimes against children.
Paul Jevtovic, Chief Financial Crime Risk Officer & Executive MLRO, NAB says: “Keeping customers safe and criminals out of the financial system is a top priority for NAB and we are delivering data-driven approaches to combat financial crimes. The Child Protection Fund investment in the Australian RedFlags and Portal is a positive step to support critical efforts to combat exploitation. The project should further support the sharing of best-practice across the financial services sector.”
We first introduced the work of RedCompass Labs to the financial crimes units of Australian financial institutions in 2022.
“We received a lot of interest from our financial institution partners in the type of data that RedCompass Labs gathers and analyses,” says Anna Bowden, CEO ICMEC Australia.
“This encouraged RedCompass Labs to apply for our funding so they could provide Australian-specific data and typologies to our financial institutions for the first time, via an online portal.”
Our funding approach is catalytic and outcomes-focused. Funding strategies include grants, impact investing and, as with RedCompass Labs, venture philanthropy. Essentially operating as a pre-seed investment vehicle, the CPF identifies innovations with significant potential for impact that need initial support to deliver a pilot or proof of concept.
The incubator-style fund combines financial with non-financial support through our Partnerships and Data Products teams. This facilitates collaboration and product adoption by the organisation’s stakeholders and promotes the sustainability and scalability of these disruptive innovations.
“By covering the initial risk capital, we have facilitated access to multiple financial services companies – removing commercial limitations and significantly increasing the impact of the program.”
We have already established support from four of our partner Australian financial institutions, including three of the Big Four banks, to work closely with RedCompass Labs as they design and build the CSE portal.
“This collaboration will ensure that the project will deliver the data in accessible and usable formats and enhance bank processes when detecting this crime type within their transactions.”
The pilot with RedCompass Labs is one of several initiatives that we will facilitate in partnership with major financial services institutions across Australia. Additional projects will include leveraging and collating data from a variety of internet sources, sponsoring the creation of a global best-practice implementation manual for CSE investigation, and ongoing initiatives such as the Collaboration Working Group, which brings together senior professionals in financial crime, law enforcement, regulation and child protection NGOs.
“This funding opportunity and partnership with ICMEC Australia is an incredible boost to our mission to disrupt child sexual exploitation. We look forward to leveraging our deep payments and data science expertise and working alongside ICMEC Australia and Australian banks to protect children from exploitation”, said Jonathan Bell, Partner and President, RedCompass Labs.
We will measure and review the success of the RedCompass Labs project throughout the pilot phase. The results will help determine further opportunities to scale up portal access to more Australian financial institutions, with the ultimate aim of rolling out availability to any financial institution in the Asia-Pacific region.
If you would like to know more about the Child Protection Fund please send an email to our Head of Impact, Tiphanie Au at tau@icmec.org.au.
ICMEC Australia acknowledges Traditional Owners throughout Australia and their continuing connection to lands, waters and communities. We pay our respects to Aboriginal and Torres Strait Islander peoples, and to Elders past and present.