
> I'm not buying it. If you can recognize what it is, enough to be able to be sure, you're exposed to it. If not, you're just guessing

That’s a misunderstanding of what is happening. The reviewer doesn’t have to see whether the visual derivative looks like child porn. They only have to see whether it matches the visual derivative of the known child porn image whose hash it matched.
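In other words, it’s a two-stage flow: a perceptual hash match flags the image, and only then does a reviewer compare derivatives. Apple hasn’t published NeuralHash, so purely as a sketch of the shape of stage one, here’s a toy average hash in Python (the filenames and the 5-bit threshold are made up):

    from PIL import Image

    def average_hash(path, size=8):
        # Downscale to an 8x8 grayscale grid and threshold each pixel at
        # the mean brightness, giving a 64-bit fingerprint that survives
        # resizing and mild recompression.
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        return sum((p > mean) << i for i, p in enumerate(pixels))

    def hamming(a, b):
        # Number of bits where the two fingerprints disagree.
        return bin(a ^ b).count("1")

    # Two images are a candidate match when their fingerprints differ in
    # only a few bits; only then would human review be triggered.
    if hamming(average_hash("upload.jpg"), average_hash("known.jpg")) <= 5:
        print("hash match -> escalate to visual-derivative review")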

So they'd just compare two small, very blurry thumbnails? I'm not sure that makes it any better.

It’s not literal blurring; it’s a transform that means you don’t see the actual image, but you do see features and detail that let you easily tell two source images apart.

The point is that you aren’t looking to see whether there is porn in the image. You are looking to see whether the image matches the known porn image.

It’s a good mechanism.
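To make that concrete: the reviewer gets two derivatives and answers one question, "same source image or not?" Apple hasn’t described the actual transform, so this is a hypothetical stand-in (a low-res, posterized thumbnail) just to show why that judgment doesn’t require seeing the original content:

    from PIL import Image

    def visual_derivative(path, size=32):
        # Hypothetical stand-in for the real transform: a 32x32 grayscale
        # thumbnail quantized to 8 brightness levels. Enough structure to
        # tell two source images apart, far too little to reconstruct the
        # original.
        img = Image.open(path).convert("L").resize((size, size))
        return [p // 32 for p in img.getdata()]

    def derivatives_match(a, b, tolerance=1):
        # The reviewer's question: do these derive from the same source
        # image? Not: what does the image depict?
        agree = sum(abs(x - y) <= tolerance for x, y in zip(a, b))
        return agree / len(a) > 0.9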
