
Image filtering is now trivial even with simple transfer learning, reaching >99% accuracy. It's just that the training datasets can be super disgusting and make you question your will to live when confronted with the savagery going on around the world.
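A minimal sketch of what "simple transfer learning" means here: a frozen, pretrained feature extractor with only a small classification head trained on top. Everything below is illustrative (a fixed random projection stands in for real pretrained weights, and the toy labels are an assumption), not the commenter's actual setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_features(images, W_frozen):
    # Frozen "backbone": flatten and project; these weights are never updated.
    flat = images.reshape(len(images), -1)
    return np.tanh(flat @ W_frozen)

# Toy data: 8x8 "images"; class 1 is brighter on average (an assumption
# chosen so the toy problem is learnable, not real moderation data).
X = rng.normal(size=(200, 8, 8))
y = (X.mean(axis=(1, 2)) > 0).astype(float)

W_frozen = rng.normal(size=(64, 16)) * 0.1  # stand-in for pretrained weights
feats = extract_features(X, W_frozen)

# Trainable head: plain logistic regression via gradient descent.
w = np.zeros(16)
b = 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(feats @ w + b)))
    grad = p - y
    w -= 0.1 * feats.T @ grad / len(y)
    b -= 0.1 * grad.mean()

acc = ((feats @ w + b > 0) == (y == 1)).mean()
print(f"training accuracy: {acc:.2f}")
```

In practice the frozen backbone would be a real pretrained network (e.g. a ResNet from torchvision) and only its final layer would be retrained, but the structure is the same: fixed features, small trainable head.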



Wasn't there some issue a while ago that the UK police were having trouble distinguishing pictures of sandy deserts from nude bodies?


How is it not a crime to possess the training set?


AFAIK large companies have deals with the government and have special units that deal with that type of content (nobody survives longer than 12 months there).

There are legal datasets for sick pr0n or warzone material you'd want to keep off-platform, and they can have the same psychological effect on you.


Should be simple enough to have the government/LEAs possess the training data and provide a standard harness/workflow you integrate your training program with. Then you send your training program and get the resulting trained model files back.
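The proposed workflow could look something like this sketch: the LEA runs a submitted training entry point against data only it holds, and only the trained model comes back out. All names here (`lea_harness`, `train_fn`) are hypothetical, and the "model" is a trivial stand-in for real training.

```python
from typing import Any, Callable, List, Tuple

Example = Tuple[str, int]  # (image id, label) -- illustrative shape

def lea_harness(train_fn: Callable[[List[Example]], Any],
                sealed_dataset: List[Example]) -> Any:
    """Run an externally submitted training function inside the LEA's
    environment; the dataset never leaves, only the trained model does."""
    return train_fn(sealed_dataset)

# Submitter's side: they author train_fn but never see the data.
def train_fn(dataset: List[Example]) -> Any:
    # Toy "model": count of positive examples (stand-in for real training).
    return {"positives": sum(1 for _, label in dataset if label == 1)}

sealed = [("img0", 0), ("img1", 1), ("img2", 1)]  # held by the LEA only
model = lea_harness(train_fn, sealed)
print(model)
```

The interesting design questions are elsewhere: sandboxing the submitted program so it can't exfiltrate the data, and deciding what the model itself may leak.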



