
The 'visual derivatives' of photos on your device are sent to Apple.



No, they aren't. You're conflating two completely separate features, and this thread is specifically about the iMessage one.

I'd suggest you read up a bit on this - https://www.apple.com/child-safety/

What we're discussing is this feature:

"The Messages app will add new tools to warn children and their parents when receiving or sending sexually explicit photos.

When receiving this type of content, the photo will be blurred and the child will be warned, presented with helpful resources, and reassured it is okay if they do not want to view this photo. As an additional precaution, the child can also be told that, to make sure they are safe, their parents will get a message if they do view it. Similar protections are available if a child attempts to send sexually explicit photos. The child will be warned before the photo is sent, and the parents can receive a message if the child chooses to send it.

Messages uses on-device machine learning to analyze image attachments and determine if a photo is sexually explicit. The feature is designed so that Apple does not get access to the messages."

There are no visual derivatives, no neural hashes, and nothing is sent to Apple.
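For what it's worth, this kind of check can run entirely on-device with no network call at all. A minimal sketch using Vision/Core ML, where "ExplicitContentClassifier" is a hypothetical model class and the label and threshold are made up (Apple hasn't published its actual model or thresholds):

    import Vision
    import CoreML
    import UIKit

    // Hypothetical sketch: classify an incoming attachment entirely on-device
    // and decide whether to blur it. "ExplicitContentClassifier" is a stand-in
    // Core ML model; Apple's real model is not public.
    func shouldBlur(attachment image: UIImage, completion: @escaping (Bool) -> Void) {
        guard let cgImage = image.cgImage,
              let model = try? VNCoreMLModel(for: ExplicitContentClassifier().model) else {
            completion(false)
            return
        }

        let request = VNCoreMLRequest(model: model) { request, _ in
            // Top label from the on-device classifier; nothing is uploaded here.
            let top = (request.results as? [VNClassificationObservation])?.first
            let explicit = top?.identifier == "explicit" && (top?.confidence ?? 0) > 0.9
            completion(explicit)
        }

        try? VNImageRequestHandler(cgImage: cgImage, options: [:]).perform([request])
    }

The result only drives local behaviour (blurring and the warning sheet); the image itself isn't uploaded for the check to happen, which is what Apple means by "Apple does not get access to the messages."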


That is the CSAM iCloud photo stuff; iMessage has nothing to do with that.


The subheadline:

> More than 90 policy and rights groups ask company to abandon plans for scanning phones of adults for images of child sex abuse.


This thread is about iMessage, and the subheadline you quoted is about the CSAM scanning, which only applies to iCloud Photo Library.
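And that distinction matters mechanically: the iCloud Photos feature doesn't classify content at all, it matches photos queued for upload against a database of hashes of known CSAM. Very roughly, in the shape of a lookup rather than a classifier (SHA-256 here is purely for illustration; Apple's NeuralHash is a perceptual hash, and the real system layers additional cryptographic steps, including the "visual derivative" mentioned upthread, on top):

    import Foundation
    import CryptoKit

    // Illustration only: matching against a fixed set of known-image hashes.
    // A real perceptual hash tolerates resizing/re-encoding; an ordinary
    // cryptographic hash is used here just to show the "lookup, not
    // classification" structure.
    func matchesKnownDatabase(_ imageData: Data, knownHashes: Set<String>) -> Bool {
        let digest = SHA256.hash(data: imageData)
        let hex = digest.map { String(format: "%02x", $0) }.joined()
        return knownHashes.contains(hex)
    }

One feature is a classifier that warns the user locally; the other is a match against a specific database, tied to iCloud Photo uploads.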



