Hacker News

When the algorithm is known, adversaries can figure out how to minimally modify images so that they reliably avoid detection. Uncertainty about detection capabilities may have been a deterrent.
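The brittleness described above can be sketched with a toy perceptual hash. This is an illustrative "average hash", not Apple's NeuralHash, and every name here is hypothetical; but the principle is the same: once the algorithm is public, an adversary can nudge exactly the pixels that sit near the decision boundary and flip hash bits with barely visible changes.

```python
# Toy sketch, NOT Apple's NeuralHash: a simple 64-value "average hash"
# shows why a known algorithm makes minimal evasion easy.

def average_hash(pixels):
    """One bit per pixel: 1 if the pixel is above the image mean."""
    mean = sum(pixels) / len(pixels)
    return [1 if p > mean else 0 for p in pixels]

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 8x8 grayscale image (64 values in 0..255).
image = [(i * 37) % 256 for i in range(64)]
original = average_hash(image)

# Minimal modification: nudge only the pixels closest to the mean,
# flipping their bits while barely changing the image visually.
mean = sum(image) / len(image)
tweaked = list(image)
for idx in sorted(range(64), key=lambda i: abs(image[i] - mean))[:8]:
    tweaked[idx] = int(mean + 1) if image[idx] <= mean else int(mean - 1)

evaded = average_hash(tweaked)
print(hamming(original, evaded))  # -> 8: the hash no longer matches
```

Real perceptual hashes are far more robust than this, but the same attack surface appears once the model weights are public, which is why secrecy about capabilities can itself act as a deterrent.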



One could argue that since the algorithm is known to exist for all cloud photo services (scanning isn't just an Apple iCloud thing), these people will simply not upload to cloud services and won't have to worry about any of this. For the Apple implementation, avoiding detection is as simple as switching off iCloud sync in the settings. If they instead modified the images to keep using iCloud sharing, that would be a risky game: the tweaked images would most likely make their way back into the database, and any change to the algorithm would mean they're all detected. My naive assumption is that the latest model would be required to interact with the iCloud service (which seems like a good idea, at least).
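The last suggestion, that the service should require the latest model, can be sketched as a simple server-side gate. This is purely hypothetical (the function name, version constant, and return strings are all made up, not Apple's API); it just illustrates why evasions crafted against an old model stop working after an update forces a re-hash.

```python
# Hypothetical sketch: the service only accepts uploads hashed with the
# current model version, so an image tweaked to evade an old model must
# be re-hashed with the new one before it can be uploaded at all.
CURRENT_MODEL_VERSION = 2  # hypothetical server-side constant

def accept_upload(client_model_version, hash_matches_database):
    if client_model_version != CURRENT_MODEL_VERSION:
        # Stale client: its hashes (and any evasions against them) are void.
        return "rejected: update required"
    if hash_matches_database:
        return "flagged"
    return "accepted"

print(accept_upload(1, False))  # -> rejected: update required
print(accept_upload(2, True))   # -> flagged
```

The design point is that version-gating moves the cat-and-mouse game onto the server's schedule: every algorithm update invalidates all evasions crafted against the previous model at once.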



