> As I understand it, human reviewers for this program do not see the photo itself, but instead see the hash and make a determination from that.

No, this is not correct. Human reviewers see a visual derivative, which is separate from the hash. It’s basically a blurred thumbnail - enough to visually confirm that the image is not a false positive, but not enough that the reviewers are constantly exposed to child porn.

Also remember that multiple matches are required to even get to the human review.
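To make the "multiple matches" point concrete, here is a minimal sketch of that gating, assuming a hypothetical threshold - the value and every name below are placeholders for illustration, not anything Apple has published as code:

    # Illustrative sketch only: the threshold value and all names here are
    # placeholders, not Apple's actual implementation.
    MATCH_THRESHOLD = 30  # hypothetical number of positive hash matches

    def derivatives_for_human_review(matched_vouchers):
        # matched_vouchers: list of (hash_match_info, visual_derivative) pairs
        if len(matched_vouchers) < MATCH_THRESHOLD:
            # Below the threshold nothing is decryptable or shown to anyone.
            return []
        # Only past the threshold do reviewers see the visual derivatives -
        # never the original photos.
        return [derivative for _, derivative in matched_vouchers]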

The rest of your comment really doesn’t seem to match the system being described. It’s not predictive policing or anything like it, and it is obviously very much against Apple’s interest for it to generate false positives.


> It’s not predictive policing or anything like it, and it is obviously very much against Apple’s interest for it to generate false positives.

It is not predictive policing. However, it's a pretty close cousin: automated policing. I'm also skeptical that a blurred image will be enough to confirm/deny CP. I'm pretty sure it's a system similar to YouTube's Content ID and will work out in a very similar fashion. Also, they have a very good incentive to err on the side of false positives in order to reduce their liability for hosting CP on their servers.

It worked for the DMCA; now law enforcement is trying something similar for CP.


> It is not predictive policing. However, it's a pretty close cousin: automated policing.

It’s not policing. This is a mechanism to detect if people are uploading known child pornography images to Apple’s servers without giving Apple access to your photos. That is the only use case for this system.

Yes, if you try to upload such a collection, a police report may be filed, but only if you do this specific thing and it is verified by humans.

> I'm also skeptical that a blurred image will be enough to confirm/deny CP

It doesn’t have to confirm CP - it only has to confirm that the image matches the known CP from the database.
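For the flavor of what "matches the known CP from the database" means technically: it is a perceptual-hash lookup against a fixed set of known-image hashes, not an attempt to classify the picture. Apple's NeuralHash and its blinded matching protocol are more involved than this, so treat the toy average-hash sketch below as a loose analogy only:

    # Loose analogy only. Apple's NeuralHash is a learned perceptual hash and
    # the matching runs under a blinded/private protocol; this toy average
    # hash just shows "match against a list of known hashes", not
    # "decide whether an image is CP".
    from PIL import Image  # pip install Pillow

    def average_hash(path, size=8):
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        avg = sum(pixels) / len(pixels)
        return [1 if p > avg else 0 for p in pixels]

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def matches_known_database(photo_path, known_hashes, max_distance=5):
        h = average_hash(photo_path)
        return any(hamming(h, k) <= max_distance for k in known_hashes)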


I'm not buying it. If you can recognize what it is, enough to be able to be sure, you're exposed to it. If not, you're just guessing - and your guess is heavily biased towards confirming what the machine said, because Apple spent $MASSIVE_AMOUNT on this technology, and who are you to question it based on a blurry picture? Also, you'd be saving children, and if you're wrong (which you're likely not - remember, industry-leading AI!) - well, the police will surely find that out very quickly and everything will be fine.

It's not like this is the first time such systems have been built. We have cases of Big Social banning people for random stuff and then saying "it was a technical error" when the noise in the media is strong enough. We have chess channels banned for racist hate speech. Only this time the question is not whether you will be denied the opportunity to post cat pictures on facewitter for a week. It's pretty much the most shameful accusation a person can be subjected to in our society. Once the press gets hold of it - and it would get hold of it the minute the police does - there's no coming back from it for the person affected (well, maybe if they are Hunter Biden, but not otherwise). And all of that will hinge on an anonymous drone looking at a blurry picture?


> I'm not buying it. If you can recognize what it is, enough to be able to be sure, you're exposed to it. If not, you're just guessing

That’s a misunderstanding of what is happening. The reviewer doesn’t have to judge whether the visual derivative looks like child porn. They only have to check whether it matches the visual derivative of the known child porn image that the hash matched.


So they'd just compare two small, very blurry thumbnails? Not sure that makes it any better.


It’s not literal blurring - it’s a transform that means you don’t see the actual image, but you do see features and detail that let you easily tell two source images apart.

The point is that you aren’t looking to see if there is porn in the image. You are looking to see if the image matches the known porn image.
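To illustrate why that can work even on something heavily degraded: a drastically reduced derivative still keeps enough structure to tell whether two images came from the same source. Apple hasn't published the actual derivative transform, so the downscale-and-diff below (with made-up file names) is just a stand-in for the idea:

    # Stand-in only: the real "visual derivative" transform is not public.
    # A crude low-res grayscale thumbnail still shows the principle - the
    # reviewer compares structure against the database image's derivative,
    # they don't judge the content of the photo.
    from PIL import Image

    def toy_derivative(path, size=(16, 16)):
        return list(Image.open(path).convert("L").resize(size).getdata())

    def mean_abs_diff(a, b):
        return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

    # Low score: plausibly the same source image as the database entry.
    # High score: clearly a different picture, i.e. a false hash match.
    score = mean_abs_diff(toy_derivative("uploaded.jpg"),
                          toy_derivative("known_db_image.jpg"))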

It’s a good mechanism.
