CathNews New Zealand

How deepfakes, nudes and teen misogyny have changed growing up


Gendered misconduct on the rise, female teachers scared of being “deepfaked” and parents protecting badly behaved boys: this is high school in 2024.

“But what if she sent them to him first?”

Devastating words from a teenager that made me shudder.

I used to visit schools and give presentations about consent and respectful relationships. At one school I learned a boy had shared nude images of a girl with his friends.

The young people wanted to know how to feel about the girl’s culpability in this scenario. They were thinking out loud, working through the millennia-old problem of misogyny.

Did it make a difference if they’d been in a relationship at the time? Did it make a difference if she sent him the nudes without him even asking? In other words: did it make a difference if she was a slut?

I explained, of course, that the question was about consent and respect. He knew she didn’t want the images passed on, and he did it anyway. His actions were the problem to focus on here, not hers.

These young people are products of our society, and our society has taught them that when we hear about the misconduct of a boy or man, we ricochet our focus and blame away from him and onto someone or something else.

A girl or woman, an outfit, alcohol and drugs, et cetera. What was she doing? What provoked him? What was she wearing?

It’s not breaking news that schools and parents are still playing catch-up with young people’s access to the internet and smartphones.

The gendered misconduct that has always existed in real life is also built into the digital realm, and the heads-in-the-sand or abstinence-style approach isn’t working.

In the past six months alone I’ve seen countless stories of schoolboys targeting both peers and teachers with varying degrees of tech-assisted gendered bullying, harassment, and abuse.

In April, a young teacher decided to leave the profession after enduring too much gendered misconduct, culminating in 15- and 16-year-old boys harassing her and professing their adoration of Andrew Tate in class.

In May, students at Yarra Valley Grammar were caught compiling a highly offensive spreadsheet rating the attractiveness of their female peers, using terms like “wifey” and “unrapeable”.

Just weeks later, an Instagram post ranking girls from a Gold Coast school used categories such as “abduction material”, “one night stand”, “average” and “unrapeable”.

In July, former and current staff at Warrnambool College reportedly faced up to 20 violent and sexist attacks a day.

That same month, a substitute teacher went public about the sexist behaviour she faces from students around the country, including Year 9 students projecting pornography onto a whiteboard behind her while she took the roll.

And in August, members of a Pembroke School football team devised a ratings scheme using sexist and racist references to female students.

Every one of these cases that actually hits the news is, from what I see, only the tip of an iceberg.

Gendered misconduct is grossly under-reported; when it is reported, it often isn’t dealt with properly; and even when it is dealt with, schools will seek to avoid negative press coverage where possible.

A new fear unlocked

Image-based abuse in particular is pernicious – and prevents victims from coming forward – because of the potential for slut-shaming. But deepfakes involve no “choice” by the victim at all, and that lack of victim involvement is why I’ve been thinking about them for a long time.

The deepfakes I’m talking about are created when someone’s non-sexual photos and videos are merged with explicit photos and videos of someone else, or manipulated into explicit material with AI. The results can range from clunky and obvious to absolutely seamless and convincingly genuine.

In January this year, deepfakes of Taylor Swift went viral and were viewed more than 47 million times over a 17-hour period before the material was removed from social media.

Whereas a woman or girl complaining of image-based abuse would normally be grilled on why she took or sent nudes in the first place, now, with deepfakes, the only people we have to grill are the ones we always should have: the people who create or share this material without consent.

There’s nowhere for the ricochet to go. Could we find a way forward, free of victim-blaming narratives, in dealing with this latest frontier?

I spoke to a few teachers about these trends and issues. One of them, Ellen*, has been a teacher for 18 years and is currently teaching high school students.

She recently watched a colleague leave the profession after “three separate sexual assault incidents across the state, Catholic, and independent sectors”, and says the level and frequency of disrespect and objectification of women in school environments is definitely getting worse.

While she has not personally dealt with any instances of deepfakes being created or shared at her school, she is afraid of when they will hit. “I genuinely think that it is a matter of when, not if. The expression ‘new fear unlocked’ was the very first reaction I had when deepfakes appeared in tech.”

She said it “feels like ratemyteacher on steroids” and that the attitudes are either being reinforced or ignored at home.

Deepfakes are becoming a much bigger problem as the industry that facilitates their creation becomes ever more lucrative.

A report by social media analytics firm Graphika found a 2,000 per cent increase since the beginning of 2023 in links on Reddit and X promoting websites that use AI to create non-consensual intimate images.

These services have moved “from niche pornography discussion forums to a scaled and monetised online business”.

In July, eSafety Commissioner Julie Inman Grant addressed the Senate Standing Committee on Legal and Constitutional Affairs and quoted from the description of a popular open-source AI “nudifying app”: “Nudify any girl with the power of AI. Just choose a body type and get a result in a few seconds.” And another: “Undress anyone instantly. Just upload a photo and the undress AI will remove the clothes within seconds. We are the best deepnude service.”

Ellen described to me a state of what I would call “hypervigilance”, trying to keep photos and videos of herself private, lest a spiteful or just-plain-reckless student get hold of them.
