> Is it really child abuse if no children were involved?
I think there is an empirical question one step beyond this. Does a pedophile who sees AI child porn get inured to it and then go on to try to act out fantasies with real victims? Or does this give pedophiles a way to satiate their desires indefinitely with only AI-based content, and lead to a lower portion abusing actual kids?
A second order issue is that distributing child porn is claimed to create demand for child porn which leads to more abuse. If there were no criminal penalties for purely AI-generated CSAM and the normal criminal penalties remained for CSAM that in any way derived from images of actual kids, would the cost-benefit difference push most consumers to demand only AI-generated stuff?
I'm not saying this is definitely the case, but I think it's at least plausible that more AI-generated CSAM would reduce actual sexual abuse of children, and from a harm-reduction standpoint, it should be beneficial to have more of it ... and it's also plausible that pedophiles being able to generate more and more extreme material at will would make things worse ... and it's also likely legally and institutionally impossible to do the studies to determine which of these is actually true.
Pedophilia seems, to a large degree, to be under-researched. Mostly because of the taboo, but even questions like "what portion of the general population are sexually attracted to pre-pubescent children" do not have good data.
I’m not sure about other countries, but in the USA, if you even broach the subject that the guilty individuals are often themselves extremely mentally disturbed, were probably victims of sexual abuse at a young age, and maybe should get some help while serving out whatever punishment society deigns “the proper amount of time,” you will be ostracized from just about any group of people.
At the risk of whatever, I believe we do have good data on this sort of thing when it comes to adults.
It's often noted, if not utterly established, that "internet access" (aka porn access) strongly correlates with (and thus, perhaps causes) lower rates of sexual assault.
Allowing AI generated realistic CSAM while prohibiting real CSAM creates a real enforcement problem. Do prosecutors now need to find and identify depicted victims to prove a CSAM charge? Does believing material was AI generated serve as a defense?
We already have severe limitations on fictional depictions of this type of content so prosecuting AI depictions isn't anything particularly new.
> Do prosecutors now need to find and identify depicted victims to prove a CSAM charge?
Surely that would be a good thing if it incentivizes prosecutors to track down the purveyors and distributors instead of just stopping with easily targeted consumers. With nothing to establish beyond mere possession, by some metrics their performance is optimized to the contrary, much like the war on drugs.
> Does believing material was AI generated serve as a defense?
Believing that stolen property was legitimately acquired is a defense against a charge of possession of stolen property, as is plausibly claiming to have been set up. The alternative enables anyone with physical access to cause anyone else to be guilty of a crime, surely a net negative for society.
> We already have severe limitations on fictional depictions of this type of content so prosecuting AI depictions isn't anything particularly new.
Are you advocating its expansion to a general principle? Maybe Agatha Christie should have faced charges for the crimes committed by her characters.
> Surely that would be a good thing if it incentivizes prosecutors to track down the purveyors and distributors instead of just stopping with easily targeted consumers
It would make it harder to prosecute purveyors and distributors. Appealing to some amorphous 'incentive' under some unnamed "metric" is silly, since we have much better ways of creating incentives if you think investigatory or prosecutorial priorities need to shift.
> The alternative enables anyone with physical access to cause anyone else to be guilty of a crime, surely a net negative for society.
No, it doesn't, as these defenses are unchanged.
> Are you advocating its expansion to a general principle? Maybe Agatha Christie should have faced charges for the crimes committed by her characters.
Work on your reading comprehension instead of making ridiculous claims. I'm not advocating anything; I am describing the current legal state in our country. If you are unfamiliar with the laws about creating fictional pornographic material with underage characters, then Google is your friend.
If watching this stuff makes you "hungry for more," then we just have to show gay people straight porn for long enough and they will become straight. Right?
You don't choose who or what you like. Unfortunately.
This comment is great, but I feel we're also leaving out the "human element" that is necessarily part of these sorts of questions. Ultimately, we, as a people, don't want pedophiles, we don't want children abused, etc. These, I feel, are not controversial statements. However, allowing AI-generated CSAM as a way to placate pedophiles does a few things:
Firstly, it normalizes this material, even to the tiniest degree you'd like to claim. If you bring something that was formerly illegal into the state of being legal, it becomes a part of our society. Weed is a fantastic example. It's on a great track to become a very casually used drug in our society, and with each step forward along that path, it becomes less remarkable. When I was in high school, I was taught, by teachers and hired professionals, about the "dangers" of weed and other drugs. Now, I drive down the street and pass a couple of dispensaries selling that product and many derivatives of it, completely without drama. It is simply a thing that exists.
[And to not leave it merely implied, that's a GOOD thing.]
So, that being the case, are we prepared to live in a society where something like AI-generated CSAM is an accepted thing, to whatever degree, to have, sell, and create? Are we okay with that if it satiates pedophiles? Are we prepared to reckon with the consequences if it doesn't, and more children are harmed?
Secondly, I think we have to contend with the fact that now that this technology exists, it will continue to exist regardless of legality. This is one of the reasons I was so incredibly opposed to widespread, open-source AI in the first place, and at the risk of sounding like "I told you so," one of the concerns I outlined many times is that this technology enables people to create... anything, at a near industrial scale, be that disinformation, be that spam, be that non-consensual pornography, be that CSAM. I don't think a black-box program that can run on damn near any consumer PC and create photorealistic renderings of anything you can describe in text is inherently, by virtue of its being, a bad thing, but I do think it's something that we as a society are not ready for, and I was not alone in that thinking. But now it's here, and now we have to deal with it.
I'd have to look up the data, but the Netherlands has a long history of decriminalizing weed. However, decriminalization did not increase usage in the Dutch population.
I don't think you'd find nearly as unanimous agreement with that statement as with mine. I personally love violent media, both the action-oriented John Wick type stuff and shooter games, and I'm also an avid fan of gruesome horror.
This is a false question because neither option is valid. Pedophiles are born pedophiles, and they see and feel attraction to kids every single day. They are just as likely to act on these desires with or without CP.
Way to dismiss a solid argument based on an anecdote. I wouldn't be so quick to conclude that porn has nothing to do with seeking out real-life sex, especially given that studies have repeatedly shown the amount of sex young adults are having has been steadily declining since the end of the '90s.