Hacker News
olliej | 9 months ago | on: MTA banned from using facial recognition to enforc...
The problem is that that’s essentially what most “facial recognition” systems used for law enforcement and adjacent purposes do: they filter by skin color, and the false positives are overwhelmingly black.
cute_boi | 9 months ago
In that case, they can use human review before accusing anyone.
Larrikin | 9 months ago
This could be reasonable policy if the police in the US didn't have racial bias in enforcement. It's not even historically
Shawnj2 | 9 months ago
Photographic equipment “accidentally” being bad at taking pictures of dark-skinned people: the more things change, the more they stay the same.
RecycledEle | 9 months ago
Why would the false positives be overwhelmingly black unless the fare jumpers are also overwhelmingly black?