Science: How Artificial Intelligence Is Fueling Incel Communities

Link (Archive)

How Artificial Intelligence Is Fueling Incel Communities

In late January 2024, X was flooded with graphic, deepfaked images of Taylor Swift. While celebrities have long been the victims of photo leaks and cyber-attacks, this time it was different because these were generated using artificial intelligence.

The images were quickly reported by the “Shake It Off” singer’s fanbase and taken down after being live on the poster’s profile for less than a day. However, that was enough time for them to go viral, despite the platform having policies against non-consensual nudity. A report from disinformation research firm Graphika later found that the images had been created on 4chan, where users encouraged each other to generate sexually charged deepfakes of famous female celebrities in an attempt to skirt content policies surrounding nudity.

Unfortunately, Swift’s experience isn’t a one-off. Marvel actress Xochitl Gomez, who was only 17 years old at the time of reporting, said on the podcast The Squeeze that she struggled to get deepfakes of her taken down from X and shared the mental toll that had on her. Gomez and Swift are just two of the countless women who’ve recently become victims of deepfakes depicting them in sexual ways.

“People have always used media to try and defame people; that hasn’t changed. What’s changed is how accessible it’s now gotten,” Siwei Lyu, professor of computer science at the University at Buffalo, told The Daily Beast.

Late last year, AI image generation platform CivitAI became popular for its “Bounties” feature, which encouraged users to create deepfakes in exchange for virtual rewards. Almost all the bounties created were of women, according to reporting from 404 Media. Some included women who weren’t celebrities or public figures either, but rather private citizens.

Experts expect it to only get worse—especially as more and more incel communities online use these technologies. Henry Ajder, an AI and deepfake adviser and expert, told The Daily Beast that this has been a growing problem for years now and CivitAI is an example of a platform heavily linked to that kind of evolution.

He said that CivitAI has become a “hotbed for not just artistically created content, but also content that’s erotic. It’s a specific place to find specific knowledge and people have started using it for pornographic content.”

Ajder also describes the technology on the platform as “agnostic or dual use,” saying that once it’s there, it can be used in any way, “while others are explicitly designed for creating pornographic content without consent.” The tools have grown popular within incel culture via platforms like Reddit and 4chan.

“There’s such a low threshold,” Hera Husain, founder of Chayn, a nonprofit supporting victims of gender-based violence and trauma, told The Daily Beast. “It’s an easy-to-access method which allows people to fulfill the darkest fantasies they may have. [...]They may feel it is victimless, but it has huge consequences for those people.”

It’s not just deepfakes that have penetrated incel culture, either. There’s even research suggesting that AI girlfriends could make incels more dangerous. With this tech allowing them to form and control their perception of a so-called “ideal woman,” there’s a danger that they may push those perceptions onto real women. When they find themselves unable to do so, or when a woman seems unattainable, as in the case of Swift or Gomez, incels begin deepfake campaigns. At least then, incels can make these women do what they like.

“Governments are simply trying to play catch-up; the technology has gone faster than their ability to regulate,” Belinda Barnet, senior lecturer in media at Swinburne University, told The Daily Beast.

This gets even more dangerous in global contexts. Patriarchal norms in different nations often further endanger women who become victims of such campaigns. In many more conservative countries, even a deepfake of a woman can be enough for her family to ostracize her or, in extreme cases, use violence against her. For example, in late 2023, an 18-year-old was killed by her father over an image of her with a man that police suspected was doctored.

It doesn’t matter that the image is fake. The fact that a woman’s image is associated with such a depiction is enough for society to ostracize her. “It’s not so much about people believing the images are real as it is about pure spite. It’s a different kind of trauma to revenge porn,” Ajder explained.

With AI generation becoming more accessible, the barrier to entry is also lower for incels worldwide who may have struggled with language barriers. In South Asia, where Husain focuses much of her work, it also becomes harder to counter incel radicalization, both socially and at a policy level. “They don’t have as strong a counter to the radicalization they’re seeing in the incel community,” she explained.

Lyu says that policies regarding free speech and tech access across the world vary so there can be different impacts. “In the U.S., using AI generation tools to create content... is freedom of speech—but people can take advantage of that as well. Drawing that line becomes very hard. Whereas in China, there’s very strong limitations on the use of this technology, so that is possible but prevents positive uses of the same line of technology.”

Incel culture existed long before AI generation tools became popular. Now that they’re mainstream, these communities will be quick to adopt them to further cause harm and trauma. The issue is sure to get worse before it gets better.

“In terms of incel culture, this is another weapon in their twisted arsenal to abuse women, perpetuate stereotypes, and further make visceral the twisted ideas they have about women,” Ajder said.
 
What I expected: AI is rising, Incels most affected....in a positive way...which is bad somehow?
What it was: Trolls are using AI to make images of me, I bet the incels did this!
 
What do deepfakes have to do with being an incel?
Nothing. The powers that be want to ban/gimp AI because it threatens their power so they're having their buttbuddies in the media write propaganda trying to tie AI in with the most reviled and disliked demographic.

You ever notice how anything the feds want banned or censored is always coincidentally associated with either incels or white supremacists?

:thinking:
 
Disappointing. I thought the thread would be about incels using AI to automatically generate posts full of seething and copium, thereby saving themselves time to go outside, enjoy hobbies and spend time with their girlfriends
 
Disappointing. I thought the thread would be about incels using AI to automatically generate posts full of seething and copium, thereby saving themselves time to go outside, enjoy hobbies and spend time with their girlfriends
How do you know they aren't? For all we know, r9k (like the rest of 4chan) could be over half automated at this point.
 
What do deepfakes have to do with being an incel?
If you are making them, you are more likely to be an incel.

Anmol is a Muslim Pakistani freelance journalist and editor. Her work focuses on global gender justice, climate, tech and media with a focus on marginalised narratives. She tweets @anmolirfan22
AI-powered bionic incels will rape her.
 
Late last year, AI image generation platform CivitAI became popular for its “Bounties” feature, which encouraged users to create deepfakes in exchange for virtual rewards. Almost all the bounties created were of women, according to reporting from 404 Media. Some included women who weren’t celebrities or public figures either, but rather private citizens

This gets even more dangerous in global contexts. Patriarchal norms in different nations often further endanger women who become victims of such campaigns. In many more conservative countries, even a deepfake of a woman can be enough for her family to ostracize her or, in extreme cases, use violence against her. For example, in late 2023, an 18-year-old was killed by her father over an image of her with a man that police suspected was doctored.

If this isn't incel behavior, I don't know what is.

I gotta say, you have to be a special level of scum to do this to people; some of these cases mentioned are obvious revenge.
 
If this isn't incel behavior, I don't know what is.

I gotta say, you have to be a special level of scum to do this to people; some of these cases mentioned are obvious revenge.
First one's... lol, whatever.

Second one's pure islam. Any inceldom involved is inceldental incidental, or a product of - downstream from - islamic culture. Provokes some amusing/mischievously evil ideas for psywar on the invaders, though, if one was into gayops.
 
If you are making them, you are more likely to be an incel.


AI-powered bionic incels will rape her.

Ugly streetshitting zogbot muslim. Israel Man, we need you to do the needful!

Don't have your nudes online, it is that easy. And if your daddy acks you over a digital romp with a real true and honest person like Elmo, his and thus your genes are best left out of the genepool anyway!
 
Don't have your nudes online, it is that easy. And if your daddy acks you over a digital romp with a real true and honest person like Elmo, his and thus your genes are best left out of the genepool anyway!
You don't need nudes online to get deepfaked. Regular photographs can be used. So obviously this is already being done to children by other children and adults:


It will take a while before people in "more conservative countries" like Pakistan stop believing their lyin' eyes and stoning their daughters.
 
I am sure nude photos floating around and being sent to your boss and family is totally whatever as long as it's not you.
Yes. Obviously.

I just find it funny when foid passive-aggression tactics (gossip, reputational destruction, false accusations, etc) are used against foids. Not so funny when the boot's on the other foot, is it?

Anyway, who gives a fuck if it's FAKE lewds being sent? Real 'revenge porn' is a different matter, I'll concede. And no, it's not something I've done, either. Though I could.
 
Yes. Obviously.

I just find it funny when foid passive-aggression tactics (gossip, reputational destruction, false accusations, etc) are used against foids. Not so funny when the boot's on the other foot, is it?

Anyway, who gives a fuck if it's FAKE lewds being sent? Real 'revenge porn' is a different matter, I'll concede. And no, it's not something I've done, either. Though I could.
well, hypothetically, you might.

Scenario: I know you IRL and have decided to ruin your life.

I deepfake nudes of you.

I then use a burner to send them to a teenager you know's phone, having first secured the phone from them etc etc. I add some suitably groomy texts over the course of a few weeks.

(Note: this will be a lot easier and also way more effective if you are, say, the father of this child and in a custody dispute with me.)

I take my child/niece/nephew's phone to the cops, crying and yelling about these texts I found and THESE DISGUSTING NUDES.

Are the cops going to handwave this OR are they going to investigate, and effectively leave the burden of proof on you to prove those aren't your real nudes? (Note: in many jurisdictions, the way sexual offences against children are framed, this effective reversal of the burden of proof happens.)

What are the likely consequences IRL for you if it gets out - and it will - that the cops are investigating you, sometimes for many months or over a year, for sending wanking pictures to an eleven year old relative?

You know you'd give a fuck then. You would be right to.
 
well, hypothetically, you might.

Scenario: I know you IRL and have decided to ruin your life.

I deepfake nudes of you.

I then use a burner to send them to a teenager you know's phone, having first secured the phone from them etc etc. I add some suitably groomy texts over the course of a few weeks.

(Note: this will be a lot easier and also way more effective if you are, say, the father of this child and in a custody dispute with me.)

I take my child/niece/nephew's phone to the cops, crying and yelling about these texts I found and THESE DISGUSTING NUDES.

Are the cops going to handwave this OR are they going to investigate, and effectively leave the burden of proof on you to prove those aren't your real nudes? (Note: in many jurisdictions, the way sexual offences against children are framed, this effective reversal of the burden of proof happens.)

What are the likely consequences IRL for you if it gets out - and it will - that the cops are investigating you, sometimes for many months or over a year, for sending wanking pictures to an eleven year old relative?

You know you'd give a fuck then. You would be right to.
Totally whatever, as long as it's not me.

Nightmare scenario you described, though - in disturbing detail. If it was a custody dispute, you needn't go to all that trouble, though. Guilt will automatically be assumed, just on the woman's word, and the man's reputation will be just as (permanently) tarnished. And if you did frame me like that, and it was proved to be a malicious fraud, I doubt you, as a wahmens, would face much in the way of consequences either.

Shit, just being a man, you're automatically assumed to be a predator, rapist and pedo, wherever you go anyway. One gets used to it.
 
Totally whatever, as long as it's not me.

Nightmare scenario you described, though - in disturbing detail. If it was a custody dispute, you needn't go to all that trouble, though. Guilt will automatically be assumed, just on the woman's word, and the man's reputation will be just as (permanently) tarnished. And if you did frame me like that, and it was proved to be a malicious fraud, I doubt you, as a wahmens, would face much in the way of consequences either.

Shit, just being a man, you're automatically assumed to be a predator, rapist and pedo, wherever you go anyway. One gets used to it.
It's a pretty powerful genie that's come out of the bottle here, when you consider the amount of random malice in the world. I suspect most of the victims of this kind of shit barely know the people who are doing it to them. In the case of celebrities, I'm sure they don't know them at all.

This video doc/interview is actually really good. Definitely makes you think about the impact of this stuff. (linking because I don't think I can post it right into the thread)


Technology always runs far ahead of social regulation of its use, but I feel that more and more it's starting to run ahead of social convention around its use. I bet most people aren't even aware you can do most of the scenarios we're discussing in this thread with it already. I think most normal people would be utterly horrified. But I don't know. That interface where harmful content meets rights of free expression is probably the single hardest problem post-industrial societies are grappling with currently, in terms of social norms.
 
Marvel actress Xochitl Gomez, who was only 17 years old at the time of reporting, said on the podcast The Squeeze that she struggled to get deepfakes of her taken down from X and shared the mental toll that had on her.

She was only 17 years, 11 months, and 30 days old you Incel perverts! If you only waited 1 more day until she reaches 18, then she can legally star in degrading facial abuse porn and lick a redneck's hairy asshole like Mayli!

 