
It's not quite the same though. These images were uploaded to Google; Google is required by law not to host child porn, so of course they are going to classify the images.

The Apple thing was them proposing to run the classification on your phone, on personal images that were never uploaded anywhere. People felt that was crossing a line.




Google isn’t obligated to scan private photos on their cloud for CSAM. Apple, for example, doesn’t.

The Apple thing was only attaching a hash code to each image and doing nothing else unless you uploaded it to Apple’s servers.
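
Roughly, the on-device check sat in the upload path rather than running over the whole library. Here is a toy sketch of that flow, not Apple's actual implementation (the real system reportedly used a perceptual hash called NeuralHash plus a private set intersection protocol; the function names and the placeholder hash set below are made up for illustration):

    # Toy sketch: client-side hash check gated on upload.
    # Apple's real design used a perceptual hash (NeuralHash) and a
    # private set intersection protocol; this hypothetical version
    # uses an exact SHA-256 lookup just to show where the check sits.

    import hashlib
    from pathlib import Path

    # Assumed: a device-local set of hashes of known images (placeholder value).
    KNOWN_BAD_HASHES = {
        "0" * 64,
    }

    def image_hash(path: Path) -> str:
        return hashlib.sha256(path.read_bytes()).hexdigest()

    def upload_to_icloud(path: Path) -> None:
        print(f"uploading {path.name} ...")

    def upload_with_scan(path: Path) -> None:
        # Hashing happens only for photos the user is uploading;
        # photos that never leave the device never hit this code path.
        if image_hash(path) in KNOWN_BAD_HASHES:
            print(f"{path.name}: hash matched the local database, flagging instead of uploading")
            return
        upload_to_icloud(path)
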


Apple was only going to scan images that were about to be uploaded to iCloud. That would have been the same amount of coverage as what Google does, and, if you believe Apple's claims, "more private" because it happens locally; it would also have let them later roll out end-to-end encrypted photos in iCloud while still scanning for CSAM to satisfy the government.

Of course, you would have to trust that the policy of scanning only iCloud-bound images would stay in place after they had implemented a tool capable of scanning everything on your phone, which seems like a terrible bet.


You’re remembering it wrong. Go look it up. Apple also already does this for photos in iCloud.


What am I remembering wrong? I never said anything about whether they scan images in iCloud. The proposal was to scan photos client side before upload and send a hash, and it was killed after backlash.





