I think the judge chose to relax the usual rules a lot on this one due to the circumstances. Releasing back into society a man found with 4,000 child porn photos on his computer would be a shame.
But yeah, this sets a precedent that opens the gates far too wide for tyranny, unfortunately...
The harshness of the sentence is not for the act of keeping the photos in itself, but for the individual suffering and social damage caused by the actions that he incentivizes when he consumes such content.
Consumption per se does not incentivize it, though; procurement does. It's not unreasonable to causally connect one to the other, but I still think that it needs to be done explicitly. Strict liability for possession in particular is nonsense.
There's also an interesting question wrt simulated (drawn, rendered, etc.) CSAM, especially now that AI image generators can produce it in bulk. There's no individual suffering or social damage involved in that at any point, yet it's equally illegal in most jurisdictions, and the penalties aren't any lighter. I've yet to see any sensible arguments in favor of this arrangement - it appears to be purely a "crime against nature" kind of moral panic over the extreme ickiness of the act as opposed to any actual harm caused by it.
It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.
> Strict liability for possession in particular is nonsense.
I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.
> or social damage involved in that at any point,
That's a bold claim. Is it based on any facts or study?
> over the extreme ickiness of the act as opposed to any actual harm caused by it.
It's about the potential class of victims and the outrageous lifelong damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI generated material to sate their desires. It's to get them into therapy immediately.
> It can. In several public cases it seems fairly clear that there is a "community" aspect to these productions and many of these sites highlight the number of downloads or views of an image. It creates an environment where creators are incentivized to go out of their way to produce "popular" material.
So long as it's all drawn or generated, I don't see why we should care.
> I entirely disagree. Offenders tend to increase their level of offense.
This claim reminds me of similar ones about how video games are an "on-ramp" to actual violent crime. It needs very strong evidence to back it up, especially when it's used to justify harsh laws. Evidence which we don't really have, because most studies of pedophiles are, by necessity, focused on the ones known to the system, which disproportionately means ones that have been caught doing some really nasty stuff to real kids.
> I entirely disagree. Offenders tend to increase their level of offense. This is about preventing the problem from becoming worse and new victims being created. It's effectively the same reason we harshly prosecute people who torture animals.
Strict liability for possession means that you can imprison people who don't even know that they have offending material. This is patent nonsense in general, regardless of the nature of what exactly is banned.
> That's a bold claim. Is it based on any facts or study?
It is based on the lack of studies showing a clear causal link. Which is not definitive for the reasons I outlined earlier, but I feel like the onus is on those who want to make it a crime with such harsh penalties to prove said causal link, not the other way around.
Note also that, even if such a clear causal link can be established, surely there is still a difference wrt imputed harm - and thus, culpability - for those who seek out recordings of genuine sexual abuse vs simulated? As things stand, in many jurisdictions, this is not reflected in the penalties at all. Justice aside, it creates a perverse incentive for pedophiles to prefer non-simulated CSAM.
> It's about the potential class of victims and the outrageous lifelong damage that can be done to them. The appropriate response to recognizing these feelings isn't to hand them AI generated material to sate their desires. It's to get them into therapy immediately.
Are you basically saying that simulated CSAM should be illegal because not banning it would be offensive to real victims of actual abuse? Should we extend this principle to fictional representations of other crimes?
As far as getting them into therapy, this is a great idea, but kinda orthogonal to the whole "and also you get 20+ years in the locker" thing. Even if you fully buy into the whole "gateway drug" theory where consumption of simulated CSAM inevitably leads to actual abuse in the long run, that also means that there are pedophiles at any given moment that are still at the "simulated" stage, and such laws are a very potent deterrent for them to self-report and seek therapy.
With respect to "handing them AI-generated material", this is already a fait accompli given local models like SD. In fact, at this point, it doesn't even require any technical expertise, since image generator apps will happily run on consumer hardware like iPhones, with UI that is basically "type what you want and tap Generate". And unless generated CSAM is then distributed, it's pretty much impossible to restrict this without severe limitations on local image generation in general (basically prohibiting any model that knows what naked humans look like).
Laws don't need to be absolutely enforceable to still work. You probably will not go to jail for running that stop sign at the end of your street (but please don't run it).
> > Strict liability for possession in particular is nonsense.
> I entirely disagree. Offenders tend to increase their level of offense.
For an example of the unintended consequences of strict liability for possession, look at Germany, where the legal advice for what to do if you come across CSAM is to delete it and say nothing: reporting it to the police would incriminate you for possession, and if you had deleted it first, a prosecutor could charge you with evidence tampering on top of that.
Also, as I understand it, in the US there have been cases of minors deliberately taking "nudes" or sexting with other minors and being charged with production and distribution of CSAM over pictures they took of themselves.
The production and distribution of CSAM should 100% be criminalized and going after possession seems reasonable to me. But clearly the laws are lacking if they also criminalize horny teenagers being stupid or people trying to do the right thing and report CSAM they come across.
> The appropriate response to recognizing these feelings is [..] to get them into therapy immediately.
Also 100% agree with this. In Germany there was a widespread media campaign, "Kein Täter werden" (roughly "not becoming a predator"), targeting adults who find themselves sexually attracted to children. The actors were anonymized for obvious reasons, but I like that they portrayed the pedophiles with a wide range of characters from different walks of life and different age groups. The message was to seek therapy. They provided a hotline as well as ways of getting additional information anonymously.
Loudly yelling "kill all pedophiles" doesn't help prevent child abuse (in fact, there is a tendency for abusers to join in, because it provides cover and they often don't see themselves as the problem), but feeding into pedophilia certainly isn't helpful either. The correct answer is therapy, but also moving away from a culture (and it is cultural, not innate) that fetishizes youth, especially in women (sorry, "girls"). This also means fighting all child abuse, not just sexual.
> It's effectively the same reason we harshly prosecute people who torture animals.
Alas, harsh sentencing isn't therapy. Arguably incarceration merely acts as a pause button at best. You don't get rid of Nazis by making them hang out on Stormfront or 8kun.
Icky things were historically made illegal all the time, but most of those historical examples have not fared well in retrospect. Modern justice systems are generally predicated on some quantifiable harm for good reasons.
Given the extremely harsh penalties at play, I am not at all comfortable about punishing someone with a multi-year prison sentence for possession of a drawn or computer generated image. What exactly is the point, other than people getting off from making someone suffer for reasons they consider morally justifiable?
There's no room for sensible discussion like this in these matters. Not demanding draconian sentences for morally outrageous crimes is itself morally outrageous.
GP is saying that people who want this to be a crime are morally outraged that someone else might disagree, and so it's impossible to have a reasonable debate with them about it. They're probably correct, but it never hurts to try.
Assuming the person is a passive consumer, with no messages or money exchanged with anyone, it is very hard to prove social harm or damage. Sentences should be proportional to the crime. Treating possession of CP as equivalent to literally raping a child just seems absurd to me. IMO, just for the legal protection of the average citizen, simple possession should never warrant jail time.
You do in some countries. For instance, knowingly possessing video of the Christchurch massacre is illegal in New Zealand, due to a ruling by NZ’s Chief Censor (yes, that’s the actual title), and punishable by up to 14 years in prison.
For the record, I'm against any kind of child abuse, and 25 years for an actual abuser would not be a problem.
But...
Should you go to prison for possessing images of an adult being raped? What if you don't even know it's rape? What if the person is underage, but you don't know (they look adult to you)? What about a murder video instead of rape? What if the child porn is digitally created (AI, Photoshop, whatever)? What if a murder scene is digitally created (fake bullets, holes and blood added in video editing software)? What if you go to a mainstream porn store, buy a mainstream professional porn video, and later find out that the actress was a 15-year-old Traci Lords?
> the individual suffering and social damage caused by the actions that he incentivizes
That's some convoluted way to say he deserves 25 years because he may (or may not) at some point in his life molest a kid.
Personally, I think that the idea of convicting a man for his thoughts is borderline crazy.
Users of child pornography need to be arrested, treated, flagged, and receive psychological follow-up all their lives, but sending them away for 25 years is lazy and dangerous, because when they get out they will be even worse than before and won't have much to lose.
The language is defined by how people actually use it, not by how a handful of activists try to prescribe its use. Ask any random person on the street, and most of them have no idea what CSAM is, but they know full well what "child porn" is. Dictionaries, encyclopedias etc also reflect this common sense usage.
The justification for this attempt to change the definition doesn't make any sense, either. Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad. In fact, I would posit that making this argument in the first place is detrimental to a sex-positive outlook on porn.
> Just because some porn is child porn, which is bad, doesn't in any way imply that all porn is bad.
I think people who want others to stop using the term "child porn" are actually arguing the opposite of this. Porn is good, so calling it "child porn" is making a euphemism or otherwise diminishing the severity of "CSAM" by using the positive term "porn" to describe it.
I don't think the established consensus on the meaning of the word "porn" itself includes some kind of inherent implied positivity, either; not even among people who have a generally positive attitude towards porn.
"Legitimate" is probably a better word. I think you can get the point though. Those I have seen preferring the term CSAM are more concerned about CSAM being perceived less negatively when it is called child porn than they are about consensual porn being perceived more negatively.
Stop doing this. You are confusing the perfectly noble aim of calling it abuse material to make it victim-centric with denying the basic purpose of the material. The people who worked hard to get it called CSAM do not deny that it's pornography for its users.
The distinction you went on to make was necessary specifically for this reason.
It's a reasonable argument, but a concerning one because it hinges on a couple of layers of indirection between the person engaging in consuming the content and the person doing the harm / person who is harmed.
That's not outside the purview of US law (especially in the world post-reinterpretation of the Commerce Clause), but it is perhaps worth observing how close to the cliff of "For the good of Society, you must behave optimally, Citizen" such reasoning treads.
For example: AI-generated CP (or hand-drawn illustrations) is viscerally repugnant, but does the same "individual suffering and social damage" reasoning apply to making it illegal? The FBI says yes to both in spite of the fact that we can name no human who was harmed, or was unable to give consent, in its fabrication (handwaving the source material for the AI; if one chooses not to handwave it, drop that question on the floor and focus on what reasoning we could use to make hand-illustrated cartoons illegal to possess that couldn't also be applied to pornography in general).
> The FBI says yes to both in spite of the fact that we can name no
They have two arguments for this (that I am aware of). The first argument is a practical one, that AI-generated images would be indistinguishable from the "real thing", but that the real thing still being out there would complicate their efforts to investigate and prosecute. While everyone might agree that this is pragmatic, it's not necessarily constitutionally valid. We shouldn't prohibit activities based on whether these activities make it more difficult for authorities to investigate crimes. Besides, this one's technically moot... those producing the images could do so in such a way (from a technical standpoint) that they were instantly, automatically, and indisputably provable as being AI-generated.
All images could be mandated to carry embedded metadata describing the model, seed, and other parameters necessary to regenerate them. Anyone who needed to could push a button: the computer would attempt to regenerate the image from that metadata and indicate whether the two images matched (the person wouldn't even need to personally view the image). If they did not match, the authorities could investigate more thoroughly.
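As a rough sketch of that push-button check (assuming, hypothetically, that the metadata lives in PNG text chunks under "model"/"seed"/"prompt" keys; `regenerate_image` is a stand-in for the actual generation pipeline, which is not implemented here):

```python
# Hypothetical verification flow: read embedded provenance metadata,
# regenerate the image from it, and compare pixels.
import hashlib
from PIL import Image  # pip install pillow

REQUIRED_KEYS = ("model", "seed", "prompt")

def load_provenance(path: str) -> dict:
    """Read embedded generation metadata from a PNG's text chunks."""
    img = Image.open(path)
    meta = getattr(img, "text", {})  # tEXt/iTXt chunks; PNG only
    if not all(k in meta for k in REQUIRED_KEYS):
        raise ValueError("no complete provenance metadata embedded")
    return {k: meta[k] for k in REQUIRED_KEYS}

def regenerate_image(model: str, seed: int, prompt: str) -> Image.Image:
    """Hypothetical: rerun the named model deterministically with the
    given seed and prompt, returning the resulting image."""
    raise NotImplementedError

def verify(path: str) -> bool:
    """True iff regeneration reproduces the image pixel-for-pixel.
    Pixels are compared rather than file bytes, since the original
    file also carries the metadata chunks themselves."""
    meta = load_provenance(path)
    redo = regenerate_image(meta["model"], int(meta["seed"]), meta["prompt"])
    orig = Image.open(path)
    return (hashlib.sha256(orig.tobytes()).digest()
            == hashlib.sha256(redo.tobytes()).digest())
```

The weak point is the determinism assumption: in practice, generation pipelines only reproduce identical pixels given identical weights, software versions, and often identical hardware.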
The second argument is an economic one. That is, if a person "consumes" such material, they increase the economic demand for it to be created. Even in a post-AI world, some "creation" would be criminal. Thus, the consumer of such imagery does (indirectly) cause more child abuse, and the government is justified in prohibiting AI-generated material. This is a weak argument on the best of days... one of the things law enforcement excels at is exactly this: when there are two varieties of a behavior, one objectionable and the other not, but similar enough that they might at a glance be mistaken for one another, enforcement can greatly disincentivize one without infringing on the other. And since this is an economic argument, one might add that economic actors seek to reduce their risk of doing business, and so would gravitate to producing the legal variety of material.
While their arguments are dumb, this filth is as reprehensible as anything. The only question worth asking or answering is: were the (AI-generated) material legal, would it result in fewer children being harmed or not? It's commonly claimed that the easy availability of mainstream pornography has reduced the rate of rape since the mid-20th century.
Is it value, though, that they perceive? Or is it the necessity of being in that location that drives prices up? If that's the case, that necessity might not have positive causes either...
What could be more valuable than satisfying a necessity? Your question presupposes some kind of opposition that makes me think you are using words with definitions at odds with the ones I know.
I think 100 GB of GPU memory will always cost a multiple of what a CPU plus regular memory costs.
Using LLMs and computer vision for these kinds of tasks only makes sense at small scale. If the task is extensive and repeated frequently, you're better off using an LLM to generate a script with Selenium or whatever, then running that script almost for free (compared to the LLM). o1 is very good at this, by the way. For the $0.10 Skyvern charges for one page interaction, I can create several scripts using o1.
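For concreteness, this is the sort of throwaway script I mean; the URL and CSS selectors are made up for illustration, but the Selenium calls are the standard ones:

```python
# One-shot scraper of the kind an LLM can generate for you: visit a
# listing page and print each product's name and price.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # Selenium 4.6+ fetches the driver itself
try:
    driver.get("https://example.com/products")
    for row in driver.find_elements(By.CSS_SELECTOR, ".product-row"):
        name = row.find_element(By.CSS_SELECTOR, ".name").text
        price = row.find_element(By.CSS_SELECTOR, ".price").text
        print(name, price)
finally:
    driver.quit()
```

Once it exists, rerunning it costs essentially nothing compared to routing every page interaction through an LLM.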
Genuinely interested in your thinking: superficially, your anti-bot ideas seem a bit contradictory to your Stealth browser, which enables bots. Why did you choose to make your browser useful for bot activity?
Just because web browsers like Firefox, Safari, and Chrome are trackable by default and generally don't care about the anonymity of their users doesn't mean that a web browser should behave that way.
Being able to use the web anonymously is a value that should be weighed against other moral values. Malicious scraping of websites should be, too. Both ideas can simultaneously be true; they don't have to be mutually exclusive. I also support initiatives like the web archive, which I consider "good behavior" when it comes to web scraping.
If a person asks me for the dataset of my website and they don't run a competing business, I'm even happy to open source the dataset generation part. Contributors get more rights, abusers get fewer rights. That's how it should be, in my opinion.
Sure, that's why I said "superficially": in the back of my head I figured you obviously have a consistent overall intention.
Would you mind elaborating more on where you draw the line between "good" and "bad" behavior when it comes to scraping?
Is it obeying robots.txt? The monetary purpose (profit vs. non-profit)?
I personally have a hard time accepting that a website can allow Google to scrape all of its content but prevent others. In my view, this is anti-competitive behavior favoring a large and evil corporation. Arguing that Google delivers value back in the form of traffic is unacceptable to me, because it goes against the basic principles of net neutrality, which I support.
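The pattern I have in mind is a robots.txt that whitelists Googlebot and shuts everyone else out. A quick sketch with Python's stdlib parser (the robots.txt content is made up, but representative):

```python
# Demonstrates the Google-only crawling policy using the standard
# library's robots.txt parser.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

print(rp.can_fetch("Googlebot", "https://example.com/page"))     # True
print(rp.can_fetch("SomeOtherBot", "https://example.com/page"))  # False
```

Whether or not a crawler honors this file is another matter, but as a stated policy it is exactly the asymmetry described above.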
The analogy would be more like a library forcing everyone to swim through a pool in order to read a particular book. And if you complain you have a disability, someone says that the world shouldn't cater to an individual.
The analogy is ridiculous, yes. As it is ridiculous to build a website that disabled people cannot possibly read. You don't have to make it perfect for them; just don't make it impossible.
The analogy would be to stop making documentary films because some people cannot see or hear.
Should businesses and academia strive to make information accessible? Yes. Should every piece of information be put into accessible formats, damn the art? I don't think so.