It's not quite the same, though. Those images were uploaded to Google, which is required by law not to host child porn, so of course it's going to classify them.
The Apple thing was them proposing to run the classification on your phone, on personal images that hadn't been uploaded anywhere. People felt that was crossing a line.
Apple was only going to scan images that were about to be uploaded to iCloud. That's the same amount of coverage as what Google does and, if you believe Apple's claims, "more private" because it happens locally. It would also have let them later roll out encrypted photos in iCloud while still scanning for CSAM to satisfy the government.
Of course, you'd have to trust that the "only iCloud uploads" policy would stay in place once they'd built a tool capable of scanning everything on your phone, which seems like a terrible bet.
What am I remembering wrong? I never said anything about whether they scan images in iCloud. The proposal was to scan photos client-side before upload and send a hash, and it was killed after the backlash.
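For anyone unfamiliar with what "scan client-side and send a hash" means in practice, here's a minimal sketch of the general idea, not Apple's actual NeuralHash/PSI protocol: hash each photo on-device before it's queued for upload and compare against a database of known-bad hashes. The directory path, the blocklist contents, and the use of SHA-256 (a stand-in for a perceptual hash) are all illustrative assumptions.

```python
# Illustrative sketch only. The real system used a perceptual hash (NeuralHash)
# and a blinded/private set intersection, not plain SHA-256 against a local list.
import hashlib
from pathlib import Path

# Hypothetical blocklist of known-bad image digests (placeholder values).
KNOWN_BAD_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def hash_image(path: Path) -> str:
    """Return a hex digest of the file contents (stand-in for a perceptual hash)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def check_before_upload(path: Path) -> bool:
    """Run the client-side check only on photos queued for cloud upload."""
    return hash_image(path) in KNOWN_BAD_HASHES

# Hypothetical upload queue directory.
for photo in Path("~/Pictures/upload-queue").expanduser().glob("*.jpg"):
    if check_before_upload(photo):
        print(f"match flagged before upload: {photo.name}")
```

The point of contention was never the matching math, it was that the matching code runs on your device and could in principle be pointed at anything on it.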