The problem is that that’s essentially what most “facial recognition” systems used for law enforcement and adjacent purposes do: they effectively filter by skin color, and the false positives are overwhelmingly black.



In that case, they can use human review before accusing anyone.


This could be a reasonable policy if the police in the US didn't have racial bias in enforcement. It's not, even historically.


Photographic equipment “accidentally” being bad at taking pictures of dark-skinned people: the more things change, the more they stay the same.


Why would the false positives be overwhelmingly black unless the fare jumpers are also overwhelmingly black?
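A back-of-the-envelope sketch (all numbers below are assumed, purely for illustration) shows why the two don't have to be linked: if the per-comparison false match rate is higher for one group, which is roughly what NIST's 2019 FRVT demographic evaluation reported for many algorithms, then the false positives skew toward that group even when fare evasion itself is evenly distributed.

    # Illustration only: every number here is assumed, not measured.
    # The point: if the false match rate differs by group, the false
    # positives skew toward the group with the higher rate, even when
    # the underlying behavior is evenly distributed.

    def expected_false_positives(riders_scanned: int, false_match_rate: float) -> float:
        """Expected number of innocent riders incorrectly matched."""
        return riders_scanned * false_match_rate

    riders_a = 500_000   # assumed daily scans, group A
    riders_b = 500_000   # assumed daily scans, group B (same size)

    fmr_a = 1e-5         # assumed false match rate for group A
    fmr_b = 1e-4         # assumed 10x higher rate for group B

    fp_a = expected_false_positives(riders_a, fmr_a)   # ~5 per day
    fp_b = expected_false_positives(riders_b, fmr_b)   # ~50 per day

    share_b = fp_b / (fp_a + fp_b)
    print(f"Share of false positives from group B: {share_b:.0%}")   # ~91%

So "false positives are overwhelmingly from one group" is consistent with fare evasion being evenly distributed; it only requires the error rate to differ by group.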



