
> I think "Think of the kids" applies very well to the CREATORS of pornography. Per wikipedia, there isn't any conclusive causal relationship between viewing CP and assaulting children.

“Think of the Kids” damn well applies to the consumers of this content - by definition, there is a kid (or in some instances a baby) involved in the CP. As a society, the United States draws the line at age 18 as the age of consent [the line has to be drawn somewhere, and this is a fairly settled argument]. So by definition, in the United States, the people in these pictures are victims who could not have consented.

Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.

What I haven’t seen from the tech community is any acknowledgment that this will be shut down if it goes too far or beyond this limited scope. And I think it would be: people would get rid of their iPhones if some of the scenarios privacy advocates describe actually occurred. At that point Apple would have to scrap the program, so it is motivated to keep the scope limited to something everyone can agree is abhorrent.




>Demand drives creation. Getting rid of it on one of the largest potential viewing and sharing platforms is a move in the right direction in addressing the problem.

Yeah, that focus has worked really well in the "war on some drugs," hasn't it?

I don't pretend to have all the answers (or any good ones, for that matter), but we know interdiction doesn't work.

Those who are going to engage in non-consensual behavior (with anyone, not just children) are going to do so whether or not they can view and share records of their abuse.

The current legal regime (in the US at least) creates a gaping hole: even if you don't know what you have (e.g., if someone sends you a child abuse photo without your knowledge or consent), you are guilty, because possession of child abuse images is a felony.

That's wrong. I don't know what the right way is, but adding software to millions of devices that searches locally for such material creates an environment where literally anyone can be thrown in jail for receiving an unsolicited email or text message. That's not the kind of world in which I want to live.

Many years ago, I was visiting my brother and was taking photos of his two sons, at that time aged ~4.5 and ~2.

I took an entire roll of my brother, his wife and their kids. In one photo, the two boys are sitting on a staircase, and the younger one (none of us noticed, as he wasn't potty trained and hated pants) wasn't wearing any pants.

I took the film to a processor and got my prints in a couple of days. We all had a good laugh looking at the photos and realizing that my nephew wasn't wearing any pants.

There was no abuse or sexual motive involved, either when the photos were taken or when they were viewed.

Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence. And when done "repaying my debt to society" I'd be forced to register as a sex offender for the rest of my life.

Which is ridiculous on its face.

Unless and until we reform these insane and inane laws, I can't support such programs.

N.B.: I strongly believe that consent is never optional and that those under the age of consent cannot give it. As such, there should absolutely be accountability and consequences for those who abuse others, including children.


> Were that to happen today, I would be sitting in a jail cell, looking at a lengthy prison sentence.

No, you would not. I was ready to somewhat agree with you, but this is just false and has nothing to do with what you were talking about before. The law does not say that naked photos of (your or anyone else's) kids are inherently illegal; they have to actually be sexual in nature. And while the line is certainly not all that clear-cut, a simple picture like the one you're describing would never cross it.

I mean, let's be clear here: do you believe the law considers too much stuff to be CSAM, and if so, why? How would you prefer we redefine it?


> The law does not say that naked photos of (your or anyone else's) kids are inherently illegal; they have to actually be sexual in nature.

But that depends on who looks at it.

People have been arrested and (at least temporarily) lost custody of their children because someone called the police over perfectly normal family photos. I remember one case a few years ago where someone got into trouble because one photo showed a toddler wearing nothing (even facing away from the camera, if memory serves) playing at the beach. When the police realized this wasn't an offense, instead of apologizing they got hung up on another photo where kids were playing with an empty beer can.

Recently this was also linked: https://jonathanturley.org/2009/09/18/arizona-couple-sues-wa... which further links to a couple of previous cases.

I'd say we should have police or health care talk to the people who think perfectly normal images are sexual in nature, but until we get the laws changed, at least keep us safe in the meantime.

> I mean, let's be clear here: do you believe the law considers too much stuff to be CSAM, and if so, why? How would you prefer we redefine it?

Another thing that comes up is that a lot of material that is perfectly legal might be in that database, because criminals can have a somewhat varied history.

I am a practicing conservative Christian, so this doesn't bother me personally at the moment, since for obvious reasons I don't collect these kinds of images.

The reason I care is that every such capability will be abused, and below I present, in two easy steps, how it will go from today's well-intended system to a powerful tool for oppression:

1. Today it is pictures, but if getting around the scanning is as simple as putting an image inside a PDF, then obviously PDFs must be screened too. Same with zip files. Otherwise the system is so simple to circumvent that it is worthless.

2. Once you have such a system in place, it would be a shame not to use it for every other evil thing. Depending on where you live this might be anything: Muslim scriptures, atheist books or videos, Christian scriptures, Winnie the Pooh drawings - you name it, and someone wants to censor it.


As soon as it is used in a negative way beyond CSAM scanning, it will cause people to sell their phones and stop using Apple products.

If Apple starts using the tech to scan for religious material, there will be significant market and legal backlash. I think the fact that the program would be shut down if they pushed it too far will keep them in check, limiting it to CSAM scanning only.

Everyone can agree on using the tech for CSAM, but beyond that I don’t see Apple doing it. The tech community is reacting as if they already have.


Problem one is that Apple doesn't know what it is scanning for.

This is by design and actually a good thing.

It becomes a problem because of problem number two:

No one is accountable if someone gets their life ruined over a mistake in this database.

I'd actually be somewhat less hostile to this idea if there were more regulatory oversight:

- laws that punish police/officials if innocent people are harmed in any way

- mandatory technical audits, as well as verification of what the system is actually used for: Apple keeps logs of all signatures that "matched"/triggered, along with the raw files, and these are provided to the court as part of any case that comes up (a rough sketch of what such a log entry might contain is below). This way we could hopefully prevent most fishing expeditions - both broad and personalized ones - and also head off any follow-up parallel construction.
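As a minimal sketch, assuming completely invented field names (nothing Apple has actually described), such an audit log entry might look like this:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical record for a single triggered signature, kept for later audit.
    # Every field here is an assumption for illustration, not any real system's schema.
    @dataclass(frozen=True)
    class MatchAuditEntry:
        device_id: str          # pseudonymous identifier of the reporting device
        matched_signature: str  # the database signature that triggered
        file_digest: str        # digest of the raw file retained for the court
        database_version: str   # which revision of the hash database was in use
        matched_at: datetime    # when the match was recorded

    entry = MatchAuditEntry(
        device_id="device-0001",
        matched_signature="sig-placeholder",
        file_digest="sha256-placeholder",
        database_version="2021-08-05",
        matched_at=datetime.now(timezone.utc),
    )

The point of such a record would simply be that every trigger is tied to a verifiable file and a specific database revision, so it can be audited after the fact.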

I'm not saying I'd necessarily be OK with it but at that point there would be something to discuss.


It may be worth taking very seriously that you might be overestimating both how quickly regular people become aware of such events and how emphatically people will react.


> I'd say we should have police or health care talk to the people who think perfectly normal images are sexual in nature, but until we get the laws changed, at least keep us safe in the meantime.

Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists and circulates in the wild, but I do get your point. That said, I don't think changing the laws would really achieve what you want anyway: I don't think a random Walmart employee is up to date on the legal definition of CSAM; they're going to potentially report it regardless of what the law is (and whether this is a wider trend is debatable - again, this is an anecdote).

That said, they were eventually found innocent, so the law already agrees that what they did was perfectly fine, which was my original point. No, it should not have taken that long, but then again we don't know much about the background of those who took the photos, so I'm not sure we can easily determine how appropriate the response was. I'm certainly not trying to claim our system is perfect, but I'm also not convinced rolling back protections for abused children is a great idea without solid evidence that it really isn't working.

> Another thing that comes up is that a lot of things that are legal might be in that database because criminal might have a somewhat varied history.

That didn't really answer my question :P

I agree the database is suspect, but I don't see how that has anything to do with the definition of CSAM. The legal definition of CSAM is not "anything in that database", and if we're already suggesting that there's stuff in there that's known not to be CSAM, then how would changing the definition of CSAM help?


> Personally I don't find anecdotes convincing compared to the very real amount of CSAM (and actual child abuse) we already know exists

First: this is not hearsay or anecdotal evidence; this is multiple real, innocent people getting their lives trashed to some degree before being acquitted.

> I don't think a random Walmart employee is up to date on the legal definition of CSAM; they're going to potentially report it regardless of what the law is (and whether this is a wider trend is debatable - again, this is an anecdote).

Fine; I too report a number of things to the police that might or might not be crimes. (An Eastern European car towing a Norwegian luxury car towards the border is one. Perfectly legal in itself, but definitely something the police were happy to be told about so they could verify.)

> That said, they were eventually found innocent, so the law already agrees that what they did was perfectly fine, which was my original point.

Remember, the job of the police is more to keep law-abiding citizens safe than to lock up offenders. If we could magically keep kids safe forever without catching would-be offenders, I'd be happy with that.

Making innocent people's lives less safe for a marginally bigger chance to catch small fry (i.e. not producers) - is it worth it?

The problem here and elsewhere is that the police in many places don't have a good track record of throwing such cases out. Once you've been dragged through court for the most heinous crimes, you don't get your life completely back.

If we knew the police would always throw out such cases, I'd still be against this, but then it wouldn't be so obviously bad.


> First: this is not hearsay or anecdotal evidence; this is multiple real, innocent people getting their lives trashed to some degree before being acquitted.

"multiple" is still anecdotal, unless we have actual numbers on the issue. The question is how many of these cases actually happen vs. the number of times these types of investigations actually reveal something bad. Unless you never want kids saved from abuse there has to be some acceptable number of investigations that eventually get dropped.

> Remember, the job of the police is more to keep law-abiding citizens safe than to lock up offenders.

Maybe that should be their purpose, but in reality they're law enforcement; their job has nothing to do with keeping people safe. SCOTUS has confirmed as much: the police have no duty to protect people, only to enforce the law. However, I think we agree that's pretty problematic...

> Making innocent people's lives less safe for a marginally bigger chance to catch small fry (i.e. not producers) - is it worth it?

I would point out that the children in this situation are law-abiding citizens as well, and they also deserve protection. Whether their lives were made more or less safe in this situation is debatable, but the decision was made with their safety in mind. For the few cases of a mistake like the one you presented, I could easily find similar cases where the kids were taken away and it turned out they were actually being abused. That's also why I pointed out that your examples are only anecdotes: the big question is whether this is a one-off or a wider trend.

If reducing the police's ability to investigate these potential crimes would actually result in harm to more children, then you're really not achieving your goal of keeping people safer.

> The problem here and elsewhere is that the police in many places don't have a good track record of throwing such cases out. Once you've been dragged through court for the most heinous crimes, you don't get your life completely back.

Now this I agree with. I'm a little iffy on the "not having a good record of throwing it out" part but generally agree, and I definitely agree that public knowledge of being investigated for such a thing is damaging even if it turns out you're innocent, which isn't right. I can't really say I have much of a solution for that in a situation like this, though; I don't think there's much of a way to take the kids away non-publicly. Maybe that should have a higher threshold, but I really don't know - as I mentioned earlier, we'd need to look at the numbers to know that. For cases that don't involve a public component like that, though, I think there should be a lot more anonymity involved.


A large(?) portion of “sexual predators” are men peeing on the side of the interstate [1]. So it's not far-fetched to think that a random pic could also land you in jail.

[1]: I looked up the cases of sex offenders living around me several years ago.


Random pictures aren’t going to be in the CSAM database, so they won’t trigger review. And getting multiple CSAM hash matches on your phone by accident is incredibly unlikely.
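As a toy sketch of that threshold argument (this is not Apple's actual NeuralHash / private-set-intersection protocol; the hashes and the threshold below are invented for illustration):

    # Toy model of threshold-based matching against a database of known hashes.
    # KNOWN_HASHES and REVIEW_THRESHOLD are made-up placeholders.
    KNOWN_HASHES = {"hash_a", "hash_b", "hash_c"}
    REVIEW_THRESHOLD = 30  # matches required before anything is escalated for review

    def should_flag_account(photo_hashes: list[str]) -> bool:
        """Flag only when many photos match entries in the known database."""
        matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
        return matches >= REVIEW_THRESHOLD

    # A random photo whose hash is not in the database contributes nothing,
    # so even thousands of ordinary pictures never reach the threshold.
    assert should_flag_account(["random_photo_hash"] * 10_000) is False

Under that model a single stray or planted match does nothing by itself; it takes a whole collection to cross the threshold.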

An unsolicited email, or having photos planted on your phone or equipment, is as much a problem today as it will be then; I think people overestimate the probability of this and ignore the fact that it could already happen today, followed by an “anonymous tip” called into the authorities.

If they are scanning it through iMessage, they will have logs of when it arrived and where it came from as well - so in that case it might actually protect someone who is being framed.


That is such a tortured, backwards argument, but it's the only one that has even a semblance of logic, so it gets repeated ad nauseam. Why be so coy about the real reasons?




