The Fourth Amendment didn't help here, unfortunately. Or, perhaps fortunately.

Still, 25 years for possessing kiddie porn, damn.

The harshness of the sentence is not for the act of keeping the photos in itself, but for the individual suffering and social damage caused by the actions he incentivizes when he consumes such content.

Consumption per se does not incentivize it, though; procurement does. It's not unreasonable to causally connect one to the other, but I still think that it needs to be done explicitly. Strict liability for possession in particular is nonsense.

There's also an interesting question wrt simulated (drawn, rendered, etc.) CSAM, especially now that AI image generators can produce it in bulk. There's no individual suffering or social damage involved in that at any point, yet it's equally illegal in most jurisdictions, and the penalties aren't any lighter. I've yet to see any sensible argument in favor of this arrangement - it appears to be purely a "crime against nature" kind of moral panic over the extreme ickiness of the act, as opposed to any actual harm caused by it.


> Consumption per se does not incentivize it,

It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.

> Strict liability for possession in particular is nonsense.

I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.

> nor social damage involved in that at any point,

That's a bold claim. Is it based on any facts or study?

> over the extreme ickiness of the act as opposed to any actual harm caused by it.

It's about the potential class of victims and the outrageous lifelong damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI-generated material to sate their desires. It's to get them into therapy immediately.


> It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.

So long as it's all drawn or generated, I don't see why we should care.

> I entirely disagree. Offenders tend to increase their level of offense.

This claim reminds me of similar ones about how video games are an "on-ramp" to actual violent crime. It needs very strong evidence to back it, especially when it's used to justify harsh laws. Evidence which we don't really have, because most studies of pedophiles are, by necessity, focused on the ones known to the system, which disproportionately means ones that have been caught doing some really nasty stuff to real kids.

> I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.

Strict liability for possession means that you can imprison people who don't even know that they have offending material. This is patent nonsense in general, regardless of the nature of what exactly is banned.

> That's a bold claim. Is it based on any facts or study?

It is based on the lack of studies showing a clear causal link. Which is not definitive for the reasons I outlined earlier, but I feel like the onus is on those who want to make it a crime with such harsh penalties to prove said causal link, not the other way around.

Note also that, even if such a clear causal link can be established, surely there is still a difference wrt imputed harm - and thus, culpability - for those who seek out recordings of genuine sexual abuse vs simulated? As things stand, in many jurisdictions, this is not reflected in the penalties at all. Justice aside, it creates a perverse incentive for pedophiles to prefer non-simulated CSAM.

> It's about the potential class of victims and the outrageous lifelong damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI-generated material to sate their desires. It's to get them into therapy immediately.

Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse? Should we extend this principle to fictional representations of other crimes?

As far as getting them into therapy, this is a great idea, but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing. Even if you fully buy into the whole "gateway drug" theory where consumption of simulated CSAM inevitably leads to actual abuse in the long run, that also means that at any given moment there are pedophiles who are still at the "simulated" stage, and such laws are a very potent deterrent against self-reporting and seeking therapy.

With respect to "handing them AI-generated material", this is already a fait accompli given local models like SD. In fact, at this point, it doesn't even require any technical expertise, since image generator apps will happily run on consumer hardware like iPhones, with UI that is basically "type what you want and tap Generate". And unless generated CSAM is then distributed, it's pretty much impossible to restrict this without severe limitations on local image generation in general (basically prohibiting any model that knows what naked humans look like).


> Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse?

No, it's because it will likely lead to those consuming it turning to real-life sexual abuse. The behavior says "I'm attracted to children." We have a lot of good data on precisely where this leads when entirely unsupervised or unchecked.

> but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing.

You've repeatedly constructed this strawman, but it appears nowhere in my actual argument. To be clear, it should be like DUIs, with small penalties for first-time offenses increasing to much larger ones upon repetition.

> it's pretty much impossible to restrict this

Right. It's impossible to stop people from committing murder as well. It's also impossible to catch every perpetrator. Yet we still keep the laws on the books, and it's quite possible the laws, and the penalties themselves, have a "chilling effect" when it comes to criminality.

Or, if the diminished rights so offend your sensibilities, then it can be a trade. If you want to consume AI child pornography, you have to voluntarily add your name to a public list. Those on this list will obviously be restricted from certain careers and certain public settings, and will be monitored when entering certain areas.

Which sounds more appropriate to you?


Laws don't need to be absolutely enforceable to still work. You probably will not go to jail for running that stop sign at the end of your street (but please don't run it).

> > Strict liability for possession in particular is nonsense.

> I entirely disagree. Offenders tend to increase their level of offense.

For an example of the unintended consequences of strict liability for possession, look at Germany, where the legal advice for what to do if you come across CSAM is to delete it and say nothing: reporting it to the police would incriminate you for possession, and if you deleted it first, a prosecutor could charge you with evidence tampering on top of that.

Also, as I understand it, in the US there have also been cases of minors deliberately taking "nudes" or sexting with other minors, leading to charges of production and distribution of CSAM over pictures they took of themselves.

The production and distribution of CSAM should 100% be criminalized, and going after possession seems reasonable to me. But clearly the laws are lacking if they also criminalize horny teenagers being stupid, or people trying to do the right thing by reporting CSAM they come across.

> The appropriate response to recognizing these feelings is [..] to get them into therapy immediately.

Also 100% agree with this. In Germany there was a widespread media campaign, "Kein Täter werden" (roughly "don't become an offender"), targeting adults who find themselves sexually attracted to children. They anonymized the actors for obvious reasons, but I like that they portrayed the pedophiles with a wide range of characters from different walks of life and different age groups. The message was to seek therapy. They provided a hotline as well as ways of getting additional information anonymously.

Loudly yelling "kill all pedophiles" doesn't help prevent child abuse (in fact, there is a tendency for abusers to join in, because it provides cover and they often don't see themselves as the problem), but feeding into pedophilia certainly isn't helpful either. The correct answer is therapy, but also moving away from a culture (and it is cultural, not innate) that fetishizes youth, especially in women (sorry, "girls"). This also means fighting all child abuse, not just sexual.

> It's effectively the same reason we harshly prosecute people who torture animals.

Alas, harsh sentencing isn't therapy. Arguably, incarceration merely acts as a pause button at best. You don't get rid of Nazis by making them hang out on Stormfront or 8kun.


> and it is cultural not innate

> he/they

I sometimes see people make this assertion, and interestingly enough it's usually trans people. What exactly makes you say this?


Icky things were historically made illegal all the time, but most of those historical examples have not fared well in retrospect. Modern justice systems are generally predicated on some quantifiable harm for good reasons.

Given the extremely harsh penalties at play, I am not at all comfortable with punishing someone with a multi-year prison sentence for possession of a drawn or computer-generated image. What exactly is the point, other than people getting off on making someone suffer for reasons they consider morally justifiable?


There's no room for sensible discussion like this in these matters. Not demanding draconian sentences for morally outrageous crimes is itself morally outrageous.

I think their point was that the law should be based on harms, not necessarily "morals" (since no one can seem to agree on those).

GP is saying that people who want this to be a crime are morally outraged that someone else might disagree, and so it's impossible to have a reasonable debate with them about it. They're probably correct, but it never hurts to try.

Oof, I fell victim to Poe's law

Assuming the person is a passive consumer, with no messages or money exchanged with anyone, it is very hard to prove social harm or damage. Sentences should be proportional to the crime. Treating possession of CP as equivalent to literally raping a child just seems absurd to me. IMO, just for the legal protection of the average citizen, simple possession should never warrant jail time.

CP is better described as "images of child abuse", and the argument is that the viewing is revictimising the child.

You appear to be suggesting that you shouldn't go to prison for possessing images of babies being raped?


You don't go to prison for possessing images of adults being raped, last I checked. Or adults being murdered. Or children being murdered.

I don't think making the images illegal is a good way to handle things.


You do in some countries. For instance, knowingly possessing video of the Christchurch massacre is illegal in New Zealand, due to a ruling by NZ’s Chief Censor (yes, that’s the actual title), and punishable by up to 14 years in prison.

Personally, I prefer the American way.


For the record, I'm against any kind of child abuse, and 25 years for an actual abuser would not be a problem.

But...

Should you go to prison for possessing images of an adult being raped? What if you don't even know it's rape? What if the person is underage, but you don't know (they look adult to you)? What about a murder video instead of rape? What if the child porn is digitally created (AI, Photoshop, whatever)? What if a murder scene is digitally created (fake bullets, holes and blood added in video editing software)? What if you go to a mainstream porno store, buy a mainstream professional porno video, and later find out that the actress was a 15-year-old Traci Lords?


> the individual suffering and social damage caused by the actions that he incentivizes

That's some convoluted way to say he deserves 25 years because he may (or may not) at some point in his life molest a kid.

Personally, I think that the idea of convicting a man for his thoughts is borderline crazy.

Users of child pornography need to be arrested, treated, flagged, and given psychological follow-up throughout their lives, but sending them away for 25 years is lazy and dangerous, because when they get out they will be even worse than before and won't have much to lose.


Respectfully, it's not pornography, it's child sexual abuse material.

Porn of/between consenting adults is fine. CSAM and sexual abuse of minors is not pornography.

EDIT: I intended to reply to the grandparent comment


Pornography is any multimedia content intended for (someone's) sexual arousal. CSAM is obviously a subset of that.

That is out of date

The language has changed as we (in civilised countries) stop punishing sex work; "porn" is different from CSAM

In the bad old days pornographers were treated the same as sadists


The language is defined by how people actually use it, not by how a handful of activists try to prescribe its use. Ask any random person on the street, and most of them will have no idea what CSAM is, but they know full well what "child porn" is. Dictionaries, encyclopedias, etc. also reflect this common-sense usage.

The justification for this attempt to change the definition doesn't make any sense, either. Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad. In fact, I would posit that making this argument in the first place is detrimental to sex-positive outlook on porn.


> Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad.

I think people who want others to stop using the term "child porn" are actually arguing the opposite of this. Porn is good, so calling it "child porn" creates a euphemism, or otherwise diminishes the severity of "CSAM" by using the positive term "porn" to describe it.


I don't think the established consensus on the meaning of the word "porn" itself includes some kind of inherent implied positivity, either; not even among people who have a generally positive attitude towards porn.

"Legitimate" is probably a better word. I think you can get the point though. Those I have seen preferring the term CSAM are more concerned about CSAM being perceived less negatively when it is called child porn than they are about consensual porn being perceived more negatively.

> The language is defined by how people actually use it,

Precisely

Which is how it is used today

A few die-hard conservatives cannot change that


Stop doing this. You are confusing the perfectly noble aim of calling it abuse material, to make it victim-centric, with denying the basic purpose of the material. The people who worked hard to get it called CSAM do not deny that it's pornography for its users.

The distinction you went on to make was necessary specifically for this reason.


In that case, we should all get 25 years for buying products made with slave labour.

Who do such harsh punishments benefit?

It's a reasonable argument, but a concerning one, because it hinges on a couple of layers of indirection between the person consuming the content and the person doing the harm (or the person who is harmed).

That's not outside the purview of US law (especially in the world post-reinterpretation of the Commerce Clause), but it is perhaps worth observing how close to the cliff of "For the good of Society, you must behave optimally, Citizen" such reasoning treads.

For example: AI-generated CP (or hand-drawn illustrations) is viscerally repugnant, but does the same "individual suffering and social damage" reasoning apply to making it illegal? The FBI says yes to both, in spite of the fact that we can name no human who was harmed or was unable to give consent in its fabrication (handwaving the source material for the AI; if one chooses not to handwave that, drop the question and focus instead on what reasoning would make hand-illustrated cartoons illegal to possess that couldn't also be applied to pornography in general).


> The FBI says yes to both in spite of the fact that we can name no

They have two arguments for this (that I am aware of). The first argument is a practical one: AI-generated images would be indistinguishable from the "real thing", but the real thing still being out there would complicate their efforts to investigate and prosecute. While everyone might agree that this is pragmatic, it's not necessarily constitutionally valid. We shouldn't prohibit activities based on whether they make it more difficult for authorities to investigate crimes. Besides, this one's technically moot... those producing the images could do so in such a way (from a technical standpoint) that they were instantly, automatically, and indisputably provable as AI-generated.

All images could be mandated to carry embedded metadata describing the model, seed, and other parameters necessary to regenerate them. Anyone who needed to could push a button, the computer would attempt to regenerate the image from that metadata, and the computer could even indicate whether the two images matched (the person wouldn't even need to personally view the image). If the application indicated they did not match, authorities could investigate more thoroughly.
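To make the idea concrete, here's a minimal sketch of what such a verification step could look like (Python). The generate_image function is a toy deterministic stand-in of my own invention, not any real generator or mandated format; the point is only that a deterministic (model, seed, params) -> bytes pipeline makes the check mechanical:

    import hashlib
    import json

    def generate_image(model: str, seed: int, params: dict) -> bytes:
        # Toy stand-in for a deterministic generator: anything that maps
        # (model, seed, params) to the exact same bytes every time.
        material = json.dumps(
            {"model": model, "seed": seed, "params": params},
            sort_keys=True,
        ).encode()
        return hashlib.sha256(material).digest() * 1024  # fake pixel data

    def verify_provenance(image_pixels: bytes, metadata: dict) -> bool:
        # Regenerate from the embedded metadata and compare hashes, so
        # the verifier never has to view the image itself.
        regenerated = generate_image(
            metadata["model"], metadata["seed"], metadata.get("params", {})
        )
        return (
            hashlib.sha256(regenerated).hexdigest()
            == hashlib.sha256(image_pixels).hexdigest()
        )

    meta = {"model": "toy-model-v1", "seed": 42, "params": {"steps": 20}}
    pixels = generate_image(**meta)
    assert verify_provenance(pixels, meta)             # regenerates exactly
    assert not verify_provenance(pixels + b"x", meta)  # tampered image fails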

The second argument is an economic one: if a person "consumes" such material, they increase the economic demand for it to be created, and even in a post-AI world, some "creation" would be criminal. Thus, the consumer of such imagery (indirectly) causes more child abuse, and the government is justified in prohibiting AI-generated material. This is a weak argument on the best of days... one of the things law enforcement excels at is exactly this: when there are two varieties of a behavior, one objectionable and the other not, but similar enough that they might at a glance be mistaken for one another, enforcement can greatly disincentivize one without infringing on the other. And being an economic argument, one might note that economic actors seek to reduce their risk of doing business, and so would gravitate toward creating the legal variety of material.

While their arguments are dumb, this filth is as reprehensible as anything. The only question worth asking or answering is: were it (the AI-generated variety) legal, would it result in fewer children being harmed or not? It's commonly claimed that the easy availability of mainstream pornography has reduced the rate of rape since the mid-20th century.



