Deepfakes are a form of gender-based violence

Today in Parliament, Abigail contributed to a debate in support of a Bill that criminalises the creation and distribution of sexually explicit AI-generated material without consent, addressing the rise of sexual deepfakes.

Abigail said:

The Greens support the Crimes Amendment (Intimate Image and Audio Material) Bill 2025, which brings New South Wales into line with other Australian jurisdictions by making it a crime to create and distribute sexually explicit material made or altered using artificial intelligence without consent, commonly known as sexual deepfake material. The bill amends the Crimes Act 1900 to expand existing offences relating to the production and distribution of intimate images without consent to include the creation and distribution of sexually explicit image and audio materials that have been digitally generated by artificial intelligence [AI].

Technology-facilitated abuse is an insidious and violent form of abuse and is commonly a serious marker of increasing and escalating domestic, family and sexual violence. That form of abuse is often used as a tool to control, coerce, punish, humiliate or otherwise inflict harm on a victim. In recent years the non-consensual creation, alteration and sharing of sexually explicit material using artificial intelligence has been spotlighted as the latest form of technology-facilitated violence being utilised by perpetrators of abuse. To be clear, deepfakes and "nudify" apps are a form of gender-based violence designed for the purpose of sexually harassing and abusing women and girls and inflicting coercive control to extort victim-survivors. A 2023 study found that women make up 99 per cent of the individuals targeted in sexual deepfakes, with people with disability, First Nations peoples, the LGBTIQA+ community and younger people also heavily targeted. Even more concerningly, children are increasingly being targeted as victims of that abuse.

Earlier this month Australia's eSafety Commissioner launched enforcement action against a technology company based in the United Kingdom for enabling the creation of such material. That company operates two of the world's most visited AI-generated nudify websites. According to the eSafety Commissioner, those two services have been visited by 100,000 users every month in Australia alone. With new forms of technological advances presenting new opportunities for perpetrators of abuse every day, it is crucial that we act to not only criminalise that behaviour but also guard against highly dangerous new forms of technology-facilitated abuse before they take hold.

Importantly, the existing offences in the Crimes Act and the expanded offences in this bill only target individuals after offences have occurred or been threatened. While that is important, it fails to adequately prevent abuse and violence before it occurs. That requires targeted frontline prevention work within communities as well as legislative action to prevent the creation of that material by banning the technology being used to do so. Tech companies can and must do much more to mitigate the harm caused through the existence of those tools by safeguarding against deepfake generators, acting as soon as content is flagged to remove it, and using indicators to ensure the origins of materials can be identified. Companies can also act by working with other platforms to proactively share information and flag suspicious activity.

But, of course, we cannot possibly rely on businesses, most of which are multinational corporations that have little regard for ethics, to do the right thing. That is why The Greens have long called for government action to hold tech companies to account for enabling the creation and distribution of illegal deepfakes and nudify apps, and to impose a positive duty of care to prevent the creation of harmful material. I understand that, in the past month, the Federal Government has announced plans to place the onus on tech platforms that fail to prevent users from creating illegal deepfakes. It is important that that is done urgently and comprehensively and is not kicked further down the road. I urge the New South Wales Government to commit to doing more than what is in the bill before us, including proactively advocating for the Federal Government to take sweeping action to hold tech companies to account and prevent that form of technology-facilitated abuse from occurring in the first place.

I now turn to the contents of the bill. Schedule 1 [2] to the bill inserts new definitions into section 91N of the Crimes Act for "digitally generated", "simulated person", "intimate audio material" and "intimate image material". The definition of intimate audio material applies to any audio that is sexual in nature or relates to engagement in a private act in circumstances where a reasonable person would reasonably expect to be afforded privacy, including real audio, altered audio and audio of a simulated person.

The definition of intimate image material replaces the existing definition of intimate image in the Crimes Act to include images that have been altered to appear to show a person's private parts or a person engaged in a private act, as well as images of a simulated person's private parts or of a simulated person engaged in a private act. Currently it is only unlawful to distribute images of that nature that have been altered. Under the existing provisions, which were introduced in 2017, it could be argued that images altered or created using generative AI are not technically altered but are generated or produced, thus falling outside the scope of existing intimate-image offences. Simulated images would also not be captured under the existing offence. The new and expanded definitions in the bill ensure that all forms of non-consensual sexual deepfakes are captured.

New section 91PA introduces a maximum penalty of 100 penalty units or imprisonment for three years, or both, for intentionally altering an image or audio of another person in a way that makes it intimate, or for creating intimate image or audio material of a simulated person without the consent of the real person it represents. Prosecution of a person under the age of 16 years for those offences must not be commenced without the approval of the Director of Public Prosecutions. I note that The Greens have some concerns about those new offences resulting in the disproportionate criminalisation of young people. The prosecution of a person under 16 should only occur in exceptional circumstances, given that much of this offending is perpetrated by young people in schools who are not equipped to understand the extent of the harm that their actions cause.

New section 91R (1A) introduces a maximum penalty of 100 penalty units or imprisonment for three years, or both, for threatening to alter an image or audio of another person in a way that makes it intimate, or threatening to create intimate image or audio material of a simulated person without the consent of the real person it represents. The new offences proposed in the bill relate only to images and audio of adults, not children, because such material involving children is already captured under division 15A of part 3 of the Crimes Act 1900. Those existing offences already cover all depictions and descriptions of children in sexual contexts, regardless of how they are made or produced.

The bill makes consequential changes to the language in the Crimes Act to refer to the new offences where necessary and also makes consequential changes to other legislation, including the Child Protection (Working with Children) Act 2012, Crimes (Domestic and Personal Violence) Act 2007, Criminal Procedure Act 1986 and National Disability Insurance Scheme (Worker Checks) Regulation 2020 to insert the new offences where relevant.

New section 91U requires the Minister to review the amendments made by the bill 12 months after commencement, with a report to be tabled in Parliament within six months of that review. The Greens welcome the introduction of a timely statutory review of the provisions to ensure we are monitoring their implementation and enforcement so that any unintended consequences are addressed and rectified, which is important in the context of the rapidly evolving technological world we live in.

I understand that the domestic and family violence sector is broadly supportive of the reform. However, the sector has raised serious concerns about the impact on the ground without adequate resourcing of the investigative mechanisms to police crimes involving sexual deepfake materials. Resourcing on the ground is not keeping pace with the rapid development of AI technology. Police are already not adequately trained on even the base level offence, with substantial ongoing concerns raised about police failing to properly respond to technology‑facilitated abuse offences. The new offences in the bill must be accompanied by training and resources or they will be ineffective at best and actively harmful to victim-survivors and young offenders at worst. We must also ensure that communities are educated about the new offences and that avenues for reporting are accessible, safe and trauma-informed. The Greens are concerned that failing to do so will result in an increase in misidentification of victim-survivors, not to mention the disproportionate criminalisation of marginalised young people.

I also note that the existing offences are almost never prosecuted, similar to the Federal offence of using a carriage service to transmit material of that nature. We already know that sexual violence is the least likely violent crime to be reported, investigated, prosecuted and convicted. In the rare instances that it is reported and investigated, victim-survivors of sexual violence are continually faced with significant obstacles across nearly every stage of the justice system. There is an enormous amount of work to be done to address the underlying systemic failings that allow those forms of violence and abuse to occur and ensure victim-survivors can safely access the support, justice, healing and recovery that they need.

Preventing gendered violence and ending cycles of abuse requires targeted prevention work within communities through education and engagement, as well as fundamental changes in educational policies, structures and environments. Experts have been calling on governments to take far bolder and more ambitious action to prevent gender-based violence for decades. With new forms of technology being advanced every day, it has never been more important to invest in prevention. Specialist frontline domestic, family and sexual violence services are without a doubt the most equipped to understand and navigate emerging forms of technology‑facilitated abuse, with workers on the front line dealing with those nuances every single day.

As I have said countless times in this place, the expert workers on the front line who are best placed to respond to this are already struggling to keep up with demand because the Government is starving them of the funding needed to do so. While legislative reform is an important step in addressing gender-based violence, there is simply no substitute for funding the front line. I once again call on the Government to urgently provide existing specialist domestic, family and sexual violence services with the core funding they need to deal with the ramifications of the passage of this bill and to continue carrying out some of the vitally important work that will turn the gender-based violence crisis around. Finally, I thank the Attorney General and his office for engaging with The Greens on the bill and for the work that is being done in this space beyond that. The Greens support the bill.

18 September 2025

Read the full transcript in Hansard here.
