Should AI-generated CP be illegal?


$80 Nude Haircut

Big Daddy Bitch
Since the Fishtank thread on /tv/ is being spammed with it the question arose.

The obvious answer seems like Yes because it's nasty, but at the same time it doesn't seem as obvious to me when I sit down & actually consider it. The whole point of outlawing CP is to protect real children from abuse. If an AI generates a pornographic image based on 3D models or medical textbooks, then a real child wasn't & isn't being harmed. It's effectively a more realistic-looking lolicon drawing.
 
No victim, no crime. No exceptions. Going beyond that is literal thought crime.

... which is not to say it's a positive thing for a society or an individual, or that you should let somebody who uses AI cheese pizza babysit your kids. Just that it can't reasonably be criminal.
 
Uhhh.... no. This argument has been made by lolicon and shotacon consumers. The main counter-argument to that perspective is that it eventually leads to the harm of children, since the material enables nonce behaviour and lets pedophiles live out fantasies that may eventually become reality. It can be argued that those are just "fantasies" and we can't exactly arrest or prosecute people for crimes they haven't committed yet. That may be another reason why we prosecute and arrest people for having CP on their hard drives even if they did not produce it themselves. But I personally prefer not to take any chances as far as the safety of children is concerned: if you consume any sort of that material, be it AI-generated or real, you should go to the wood-chipper.

I also find it disgusting and I wouldn't like to live in a reality where we let nonces masturbate to AI-generated CP. That just sounds like a dystopian hellhole that I wouldn't want to become reality. The only solution, as stated in a post above, is Total Pedo Death.
Ok, that isn't a counter argument, that's just restating the original point.
Someone could say that letting people say nigger leads to real life violence against black people, they could even cite examples of actual shootings done by retards who scroll through /pol/. So how is that different?

And just a reminder: this website Kiwi Farms, which documents all kinds of predators and creeps, would be shut down and Josh would almost definitely be thrown in prison in New Zealand because of extreme pro-censorship laws they passed under the pretense of fighting child porn.

Yes, I'd prefer not to live in that reality either, but unless you can think of some way to enforce that which isn't giving vague powers to the federal government that will without a doubt be turned around and used against us, probably benefitting the pedophiles, then idk what to tell you. Protect yourself, your family, and your community.
 
Unfortunately I cannot find any non-emotional argument for criminalizing any computer-generated drawing, which is a victimless act.

Yes drawn-CP is absolutely disgusting to the overwhelming majority of the population, but it's harmless nonetheless.
You cannot have an argument for wanting to ban it that you cannot also apply to violent movies, drawings, or video games.

Killing innocent people is a terrible thing to do as well, yet you are fine with violent video games, or any other media that portrays the killing of people, including children. You understand that it's fiction, that no one is being harmed, and that it doesn't appear to encourage people to commit those crimes.
For the same reason, you cannot (while remaining consistent) call for the ban of a media where committing another crime (child abuse) is displayed.
 
Degenerate faggots that are worried about the legality of drawn child porn get the rope. I feel you're only trying to legitimize your perversions.

I agree with @Scourge Muffet that we need to bring back shaming and shaming culture. Maybe we bring back branding and brand a big-ass CP on their chest and back to alert others to their retardation.
 
No victim, no crime. No exceptions. Going beyond that is literal thought crime.
To be accurate/realistic AI would need to be trained on a real CP dataset. All generated images would cause harm to victims, there's currently no way around this.

No victim, no crime? No modern legal system on earth agrees with you. There are all kinds of victimless crimes.
 
Yes, I'd prefer not to live in that reality either, but unless you can think of some way to enforce that which isn't giving vague powers to the federal government that will without a doubt be turned around and used against us, probably benefitting the pedophiles, then idk what to tell you. Protect yourself, your family, and your community.
Honestly, you are absolutely right. My first post expressed my gut instinct. But thinking about it a bit more, as much as I do not want that content to exist, banning it would create a foothold for the government to censor other content. At the end of the day the only thing I can actually do in reality is protect my family and community by other means that do not involve the (often incompetent) government.
 
Actual question: when referring to CP, are we and OP referring to the real thing or loli/shota shit? Because this entire time I've assumed the latter, since if it's the former the answer is very obvious and I don't think it would be seriously asked. Plus the way this question is worded is confusing.
 
To be accurate/realistic AI would need to be trained on a real CP dataset. All generated images would cause harm to victims, there's currently no way around this.

No victim, no crime? No modern legal system on earth agrees with you. There are all kinds of victimless crimes.
If it's being trained on real life cheese pizza, then obviously whoever trained it had those images and that's highly fucking illegal, as it should be. But that's a different kind of crime with an obvious victim.

I'm not 100% sure, but from what I know about AI art in general I don't think it's NECESSARILY the case that it would have to be trained on real cheese pizza. A lot of what AI art does is basically compositing. It's the same reason you can tell an AI to draw a cat-dog and get something pretty close, even though it only has pictures of cats and dogs. It's the same reason you can ask for a picture of a car drawn in Italian Renaissance style, even though cars didn't exist back then. So it might be able to, for example, transform drawn loli images into a more realistic style. Somebody with a stronger stomach than me can try it, I'm going to leave it in the realm of speculation. Again, I'm not arguing it's moral or a good idea, only that it might be technologically possible to make something like that without using illegal images directly.
 
How about we just legalize the lynching of people who wanna fuck children? That would sort out a lot more and it would send a message.
I accuse you of wanting to fuck children. Since we are a society that punishes on accusation alone, there will be no investigation into my claim. Let me know how heaven is (I won't be going there, as I am an amoral psychopath who uses the law to kill people she hates.)
 
I always hated the “lolicon is harmless” argument. It may seem harmless because it’s a drawing, but it’s a gateway drug for pedophiles a lot of the time. It’s like starting with cocaine, but when that doesn’t do anything anymore, they move to stronger substances like heroin. Remember that people with an addiction will seek the more extreme when the former doesn’t do anything anymore. With pedophiles, it goes from lolicon to real CP.
 
Again, I'm not arguing it's moral or a good idea, only that it might be technologically possible to make something like that without using illegal images directly.
At some point, you would have to use images of real children, explicit or not, to create an accurate composition. So my questions would be: 1) Where will these images be sourced? 2) How does one create safeguards to prevent real CSAM from being scraped? It's no secret that Instagram, Twitter, Facebook, etc. have all hosted (or currently host) CSAM. There would need to be some sort of public repository where all of the images are vetted by a human at some point.
 
That's a good question with no clear cut answer.
The only solution is TPD.
Kill pedophiles. Behead pedophiles. Roundhouse kick a pedophile into the concrete. Slam dunk a pedophile into the trashcan. Crucify filthy pedophiles. Defecate in a pedophiles food. Launch pedophiles into the sun. Stir fry pedophiles in a wok. Toss pedophiles into active volcanoes. Urinate into a pedophiles gas tank. Judo throw pedophiles into a wood chipper. Twist pedophiles heads off. Report pedophiles to the IRS. Karate chop pedophiles in half. Curb stomp pedophiles. Trap pedophiles in quicksand. Crush pedophiles in the trash compactor. Liquefy pedophiles in a vat of acid. Eat pedophiles. Dissect pedophiles. Exterminate pedophiles in the gas chamber. Stomp pedophile skulls with steel toed boots. Cremate pedophiles in the oven. Lobotomize pedophiles. Grind pedophiles in the garbage disposal. Drown pedophiles in fried chicken grease. Vaporize pedophiles with a ray gun. Kick old pedophiles down the stairs. Feed pedophiles to alligators. Slice pedophiles with a katana.
 
At some point, you would have to use images of real children, explicit or not, to create an accurate composition.
Do you though? This is the part I'm not sure about. To me this is like saying "the only way the AI can generate a Rembrandt painting of a car is to train on a Rembrandt painting of a car". But that's just not true. It's demonstrably not true. One of the things that I do think is genuinely interesting about AI art (which I hope is used for good) is the ability to combine style and subject matter in ways that have never existed before.

To use an analogy, let's say you feed an AI model HD photos of adult dogs, and good-quality drawings of puppies, but no photos of puppies. Using that data, could it produce a reasonably photorealistic image of a puppy? I bet it could. I think the idea that you need an exact reference is based on a misunderstanding of how AI art works. I'm not going to test it for this scenario just to prove a point though. Just having a KF account has me on enough watch lists already.

If it turns out I'm wrong about this part, I would absolutely change my position on a ban. But let's not ban a technology without fully understanding what it's actually doing.

1) Where will these images be sourced?
This is a very real problem with AI dataset training in general. How can it be done ethically? How can we be reasonably sure a model isn't being trained on any illegal material? Not just in terms of CP, but also copyrighted material or whatever else. I don't think anybody has an answer for this yet.

2) How does one create safeguards to prevent real CSAM from being scraped? It's no secret that Instagram, Twitter, Facebook, etc. have all hosted (or host currently) CSAM.
I'm not informed enough about how datasets are normally selected to answer that question very well. My instinct is to say "don't blindly scrape all of Twitter" — I don't think such a model would be much good for ANYTHING anyway. And even if your model is for something completely mundane, you could end up including illegal material if you're especially careless about your data sources.

This isn't an "AI is scraping CP" problem, it's a "CP exists at all" problem. Shut it the fuck down.
 