Legislative Council: Wednesday, June 19, 2024

Contents

Bills

Summary Offences (Invasive Images and Depictions) Amendment Bill

Introduction and First Reading

The Hon. C. BONAROS (17:15): Obtained leave and introduced a bill for an act to amend the Summary Offences Act 1953. Read a first time.

Second Reading

The Hon. C. BONAROS (17:15): I move:

That this bill be now read a second time.

I am pleased to introduce this first-of-its-kind bill, the Summary Offences (Invasive Images and Depictions) Amendment Bill 2024. The bill seeks to create new offences for the creation and distribution of, and the threat to distribute, artificially generated invasive images, commonly known as deepfakes and, more specifically here, sexually oriented deepfakes.

For the unaware, a deepfake is an AI-generated image that may draw on photos or videos or the voice of a real person to create a realistic-looking image, video or audio of that person. There have been well-known instances of deepfakes impacting public figures, including viral images purporting to depict Taylor Swift. I am in no way making light of this, because what you will hear is very serious, but I know the Hon. Tammy Franks in particular will understand and appreciate the Taylor Swift references woven into my contribution today, and inevitably the reason for them.

While some may dismiss this as a 'cruel summer' for celebrities, the reality is far from it. The creation, distribution and threat to distribute sexualised deepfakes presents serious problems for young people and young girls in particular. The psychological impact of cyberbullying and harassment has led to young people ending their lives in Australia and, indeed, all over the world. Of course, this is not an issue that impacts only the young. Women are also disproportionately on the receiving end.

The bill introduces new offences and seeks to increase penalties for existing filming and sexting offences to match the new deepfake penalties. The recent surge in the creation and distribution of sexualised deepfake content is the result of significant advances in artificial intelligence capabilities. The world is rapidly changing. AI has made humiliation possible at the click of a button. In the space of just a year, territory that was once the preserve of hackers and the underground has become available, literally at the click of a button, to anybody willing to give it a crack.

The rapidly evolving nature of deepfake technology has made it difficult for law enforcement and our laws to keep up. AI has highlighted all too well just how much we need 'Taylor-made' laws. I know our 'red' Premier loves South Australia to be the first at many things, and this is indeed an opportunity to be truly nation leading in our response to a contemporary problem that has significant social impacts.

The bill seeks to close the blank space in our laws concerning the creation, distribution and threat to distribute images that have been wholly generated by artificial intelligence. Currently, the Summary Offences Act 1953 addresses instances where a person's actual image has been altered and used in the creation, distribution or threat to distribute sexually explicit pictures. That is where a real person's image has been used. That is captured. This is distinct, though, from a deepfake of a person that is generated wholly by artificial intelligence but is generated in such a way as to resemble that person so closely that it can be mistaken for the actual person in question.

The bill proposes to double penalties for existing filming and sexting offences to match the new deepfake offence penalties. It would be a defence to a charge under the bill—so we are clear—only where the person has consented to the creation or distribution of the images. The consent provisions will not apply to charges involving a minor or a vulnerable person, or where consent was obtained under duress or deception. Proposed penalties are also higher across the board for offences where the depicted person purports to be a real person under the age of 17.

Sexually explicit or compromising videos can be and are spread quickly online. Although I appreciate she is one of the world's most famous people right now, one of Taylor Swift's deepfakes—just to give you an indication of why I have used her references throughout this—was reported to have been seen over 47 million times before it was removed by X. That is extraordinary. Despite the popularity of the person in question, we should not underestimate the humiliating and denigrating impact this has, even on someone as big and famous as Taylor Swift.

In an instant, deepfakes can tarnish a person's reputation, causing shame and humiliation and, of course, long-term psychological harm. They can seriously damage an individual's professional reputation, causing anxiety, depression and all the things that come with the violation of privacy and public exposure, or the fear of it. Deepfakes are very damaging, and our laws have to keep pace. We have seen this time and time again.

We cannot just say, 'You are on your own, kid.' We have seen how effective existing sexting laws have been, particularly amongst minors, and this parliament now has the opportunity to extend that effectiveness to combat sexualised deepfakes. Our love story with technology has reached new and frightening heights and we must be fearless in our response.

I flag that we have been working on a separate proposal for aggravated offences involving AI sexualised images used for blackmail purposes. Deepfakes can be exploited by predators to manipulate or coerce young people; in fact, we have examples of that here in Australia. So rather than begin again, on the advice of parliamentary counsel and based on what we have discussed, I foreshadow that I will move an amendment to this bill to capture that proposal, because this bill is the right place for it. I look forward to the swift progress of this bill through this place.

Deepfakes, as I said, represent a frightening demonstration of the power of artificial intelligence, and these proposed laws are effectively trying to catch a horse that has well and truly bolted, but we are trying to rein it in as much as possible. We have seen, as I said, how effective our existing sexting laws have been, particularly amongst minors. We have seen that they form part of the education curriculum for very good reason, and that they have done what they were intended to do: deter minors in particular from engaging in that sort of behaviour.

But make no mistake, AI is being used to create deepfake revenge porn and deeply disturbing, sexually explicit images of minors. Schools are becoming breeding grounds for its distribution. The degree to which this is happening right now ought to be enough to strike fear into parents across Australia. Overwhelmingly, as I said, it is women, but also young girls, who are the targets here.

Chances are it is happening a lot more than we know, because what we have heard from experts is that, particularly in schools, kids are reluctant to talk about it. They are hiding the shame. They do not know what to do. They are responding, but they do not know what to do with it. There has not been a single day in recent months when the risks, harms and dangers associated with AI sexualised deepfakes have not been brought to our attention, and it is truly terrifying. According to an ABC article, one in seven adults has been threatened over sexual images and videos, and the issue is more widespread than even the experts first thought.

Just today, it was reported that Noelle Martin was 18 when she first experienced image-based abuse, with perpetrators editing her image, an image taken from social media, and posting it on pornographic websites. On one occasion, Noelle's perpetrators tried to extort her, saying they would take down the images if she sent them real nudes. That was the extortion here: 'We will take down the fake ones that look just like you if you send us real ones.'

Thankfully, and to her great credit, this young woman has spent the last 11 years speaking out against this sort of abuse. Experts are warning that deepfake porn is getting worse, and it is not just the AI tools enabling that behaviour: the technology is feeding misogynistic messages and degrading comments that are crushing young people's empathy.

As I said, schools are becoming the breeding ground for this sort of behaviour. According to the ABC, police have said that in the case of Bacchus Marsh Grammar School in Melbourne about 50 students' faces were purportedly taken from social media sites before being manipulated using AI to create obscene photographs.

If you think it is hard to do, think again. I went to have a look myself. I did a quick Google search. In fact, I could go on there now, find a photo of anyone in this chamber and say, 'Convert this to an obscene, sexually explicit or nude photo,' and the results would be scary. That is what not just adults but minors are doing, each and every day. It is happening in our schoolyards. It is happening across Australia. Those are the words of police, not mine. This is about 50 students from one school who have all had their photos used on one of these AI apps or websites. Sexually explicit and, in the words of police, 'disturbing, obscene' photos have been produced of them as a result.

Make no mistake, as I said, this is happening in schoolyards. These are the breeding grounds at the moment. Yes, I have said that the horse has well and truly bolted, but we do have the opportunity to catch up. Kids are not turning to adults for help, because they do not think the adults know how to deal with it, because they are scared, because they are embarrassed, because they do not want their parents finding out that this is happening at school. They are going it alone.

We have been here before. We have been in this very scenario before. We have all probably been around long enough now to know the importance of Carly's Law and why we passed it, of sexting laws and why we passed those, and of banning childlike sex dolls and why we did that. These are measures against exploitation that, in many cases, sadly, is targeted towards kids.

Who the common perpetrator of sextortion is, for both minors and adults, is another part of this discussion. Recent surveys undertaken by RMIT University with Google have shown that in many instances the most common perpetrator was actually a former intimate partner, followed by a current intimate partner.

Adolescents and LGBTIQ+ people also appear more likely to experience sextortion. The author of that report suggests part of the explanation may be financial sextortion scammers, who may be more likely to target men and boys on the assumption that males are more likely than girls and women to share intimate images with a stranger. People under the age of 35 are more likely to report victimisation than those over 35, and LGBTIQ+ groups are more likely to report victimisation as well.

There are some harrowing accounts of what this sort of thing can do to young people, of the effect that the release of nude pictures without consent, and indeed in instances where consent cannot be obtained, can have on them. The ABC has reported on a young man, Mac Holdsworth, who was 16 when he was extorted via social media after exchanging photos of his body with somebody he believed was a teenage girl. It turned out the person was not a teenage girl, and he was blackmailed into providing money or those photos would be sent to all his contacts. As it turns out, in that instance the photo was indeed sent to all of Mac's contacts.

Sadly, even though Mac finally told his dad, the matter was reported to the police, and the police were able to find and charge the perpetrator, Mac did not cope with it. He clearly suffered a lot in silence, despite having a very supportive family who were trying to help him through it, and, tragically, he died by suicide.

The point is that this is all because of the explosion of what is happening on social media. Mac did not send his photo thinking it would go to a scammer who would then sextort him and send it to all his contacts online. This is something innocent that any kid of his age could have done and, tragically, it has resulted in the worst possible outcome. AI sexualised deepfakes do exactly the same. Whether an image resembles the person so closely that you think it is them, or whether it actually uses a photo of them, makes no difference, and we should not draw that distinction in law.

There is a growing body of work on this. I think it is really important to note that we have a report from February this year which says that, since the boom of generative AI in 2023, there has been a rise in diffusion model deepfakes. Diffusion models take us beyond the realm of pasting a celebrity's face onto the body of an actor or making the mouth say new words. They allow a deepfake to be created from scratch, without editing original content.

Our cybersecurity experts are warning us about the rapid changes in this landscape. It is critical that our laws keep pace. The experts themselves have said that just a year or so ago this technology was accessible only to skilled hackers and specialists. Now anyone with a phone or a computer can make a deepfake—a sexualised deepfake. It is as easy as going to the App Store and downloading one of the many apps for free or for a small subscription fee, and you are ready to go. Scarily, it is now almost just as easy to create a deepfake that targets a specific person.

I am hoping this all rings a bell, because these are the same sorts of conversations we had when we banned childlike sex dolls. Novice attackers can access a generator and create a deepfake, with no prior knowledge, in as little as a few seconds. I could leave here now and do that to absolutely everybody in this room. That is how accessible it is, and it is scary to think that this can be happening in our kids' bedrooms and in schoolyards, then used on social media, shared at the snap of a finger on Instagram and Snapchat, with such horrifying outcomes for the kids who are the victims of that material.

Again, this is just as much about improving our laws regarding minors as it is about the general concept of AI sexualised deepfakes. I have now given several examples of the sorts of material we are talking about, but I urge all honourable members to go online. There is a report that was completed, like I said, in February this year, entitled 'Keeping it real: How to spot a deepfake'. The ABC has a story pretty much every day showing how widespread this is and the sorts of impacts it is having. I urge honourable members to have a read of the sort of stuff that is happening, particularly when it comes to revenge porn. I urge honourable members to read about the impacts this is having on young lives.

Also, when it comes to the issue of sextortion, extortion, blackmail or threatening somebody with these sorts of things, I again foreshadow and remind members that there will be a further change to this bill to ensure we have looked at this from absolutely every possible angle, because none of this is okay. None of this is okay. As we said with childlike sex dolls, it is not just teenagers. Heaven forbid, if someone wanted to take a picture of our little kids—an eight year old, a 10 year old, a five year old—and put it in these sorts of depictions, they could. That scares me as a parent. It should scare absolutely everyone in here, and I am sure it horrifies parents out there. Just know how readily available this is, and how rampant a problem it is. This is not me saying all this; this is what the experts are saying.

We have seen the feds move on this issue in recent days. In response, I will say that the federal measures deal with carriage, dissemination and distribution at a federal level, whereas this bill creates new offences in South Australia. We are creating an offence, punishable by jail, of creating this material with the purpose of distributing it without a person's consent. Indeed, where it involves the depiction of someone under the age of 17, there is no issue of consent at all—it just should not happen.

With those words, I look forward to the swift passage of this bill. I say again that I have used those references because they illustrate just how quickly something can spread. It is extraordinary: one person's image viewed 47 million times, a sexually explicit image that I am sure they, just like the rest of us, did not want shared with the whole world.

Debate adjourned on motion of Hon. M. El Dannawi.