
Thanks for pointing that out; I have updated my comment to include the links from the quote, which were hyperlinks in the EFF post.

> it doesn't seem to explain what the actual link is to NCMEC

The problem I see is the focus being placed on the CSAM database. I quote from Apple's FAQ on the topic [1]:

> Apple only learns about accounts that are storing collections of known CSAM images, and only the images that match to known CSAM

and

> Our process is designed to prevent that from happening. CSAM detection for iCloud Photos is built so that the system only works with CSAM image hashes provided by NCMEC and other child safety organizations. This set of image hashes is based on images acquired and validated to be CSAM by child safety organizations.

Which is already dishonest in my opinion. Here are the main problems, and the reasons I find the implementation highly problematic, based on how I personally understand things:

- Nothing would actually prevent Apple from adding different database sources. The only thing the "it's only CSAM" part hinges on is Apple choosing to use only image hashes provided by NCMEC. It's not a system built "specifically for CSAM images provided by NCMEC". It's ultimately a system to scan for arbitrary image hashes, and Apple chooses to limit those to one specific source with the promise to keep the usage limited to that (see the sketch after this list).

- The second large attack vector comes from outside: what if countries decide to expand their official CSAM databases to cover additional uses? Let's say Apple does actually mean it and upholds its promise: there isn't anything Apple can do to guarantee that the scope stays limited to child abuse material, since it doesn't control the sources. I find it hard to believe that certain figures are not already rubbing their hands about this in a "just think about all the possibilities" kind of way.
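To make that first point concrete, here is a deliberately oversimplified sketch (not Apple's actual implementation, which uses NeuralHash and private set intersection on-device; the type names and hash values below are made up) of what matching against a provided hash set looks like. Nothing in the mechanism knows or cares what the hashes represent; the contents of the set are purely a policy decision by whoever supplies it.

    import Foundation

    // Hypothetical stand-in for a perceptual image hash. Apple's real system
    // uses NeuralHash; this is just an illustrative placeholder.
    typealias ImageHash = String

    struct HashDatabase {
        // Could be NCMEC-provided CSAM hashes -- or any other hash set.
        let knownHashes: Set<ImageHash>

        func matches(_ hash: ImageHash) -> Bool {
            knownHashes.contains(hash)
        }
    }

    // Usage: the same matching code runs identically no matter who supplied
    // the hashes or what kind of images they correspond to.
    let database = HashDatabase(knownHashes: ["a1b2c3", "d4e5f6"]) // placeholder values
    let uploadedImageHash: ImageHash = "a1b2c3"                    // placeholder value

    if database.matches(uploadedImageHash) {
        print("Match reported")
    }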

In short: the limited scope rests only on two promises: that Apple won't expand it, and that the sources won't be merged with other topics (like terrorism) in the future.

The red flag for me here is that Apple acts as if there were some technological factor that ties this system to CSAM material alone.

Oh, and of course the fact that the fine people at Apple think (or at least agree) that the Electronic Frontier Foundation, the Center for Democracy and Technology, the Open Privacy Research Center, Johns Hopkins, Harvard's Cyberlaw Clinic and more are "the screeching voices of the minority". Doesn't quite inspire confidence.

[1]: https://www.apple.com/child-safety/pdf/Expanded_Protections_...



