But the algorithm enshrines and possibly amplifies it. Or as the old saying goes:

> To err is human, to really foul things up requires a computer.
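
As a rough illustration of that "enshrining" effect, here is a minimal sketch (assuming scikit-learn; the feature names, the group indicator, and the 0.8 penalty are made up for the example) of a model trained on historically biased labels reproducing that bias in its own predictions:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    n = 10_000

    # Hypothetical data: one legitimate signal plus a protected-group indicator.
    signal = rng.normal(size=n)
    group = rng.integers(0, 2, size=n)

    # Historical labels: driven by the signal, but with an extra penalty that the
    # (biased) human decision makers applied to group 1.
    labels = (signal - 0.8 * group + rng.normal(scale=0.5, size=n)) > 0

    X = np.column_stack([signal, group])
    model = LogisticRegression().fit(X, labels)

    # The learned weight on the group indicator comes out negative: the model has
    # absorbed the historical penalty and will keep applying it to new cases.
    print(model.coef_)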

Humans have much more potential for racism when making predictions about the future than when judging things that have already happened. So while I agree that racism introduced through biased input data is a problem, I think that, even with that problem, machines should be substantially less racist in their judgement than the humans they're replacing, even if they're not perfect.
