'The other one is plain hash matching, which does not "save the children" but rather "catch the viewer", and exclusively so. This has no impact on the abuse of the CSAM subjects because it only matches publicly known content.'
Are you claiming victims of child sexual abuse wouldn't care if images of the abuse are circulating freely and being viewed with impunity by pedophiles?
So stop the circulating. This is like trying to stop people from using drugs by searching EVERYONE’S homes and reporting suspicious-looking substances.
"So stop the circulating." Good idea. Any ideas how we could do that? Perhaps one thing that would help is to implement a system that hashes circulating images to ensure they aren't known images of child sexual abuse. Just a thought.
There are a lot of unwanted kids out there who end up in broken homes or on the street. There are also a lot of kids born into abusive families.
Good ideas to stop the circulating would be to increase birth control education and access, increase foster care funding and access, implement common-sense policies for adoption, and increase funding for Child Protective Services.
That approach prevents CSAM from existing in the first place. Much as it would be ridiculous to search everyone's houses to stop drug use when it is far more effective to address the causes of drug dependency, the same logic applies here.