

‘Nudify’ economy: AI-powered “nudify” apps are fueling a disturbing rise in non-consensual sexual deepfakes, exploiting victims without their knowledge or consent. These apps use artificial intelligence to manipulate images, generating fake explicit content that can be used for harassment, blackmail, or online abuse. In Australia, concern is growing over how easily these tools can be accessed, often paid for with cryptocurrency, which makes offenders harder to track.
This article from Crikey delves into how these AI-driven exploitation tools work, the legal and ethical implications, and the challenges law enforcement faces in curbing their spread. Despite increasing awareness, the rapid advancement of generative AI makes it difficult to regulate these technologies effectively. Lawmakers and advocacy groups are calling for stronger legal protections to criminalize the creation and distribution of non-consensual deepfakes and hold perpetrators accountable for their actions.
Victims often struggle with the emotional and reputational damage caused by these manipulated images, with limited legal recourse available. Many experience severe distress, as the circulation of fake explicit images can harm careers, relationships, and mental well-being. Social media platforms and online communities are under scrutiny for failing to detect and prevent the sharing of such content, raising questions about their responsibility in protecting users from online abuse.
Read the full analysis on Crikey to understand the scope of this alarming trend, the people affected, and the possible legal and technological solutions to combat AI-driven sexual exploitation.
