> So the algorithm wasn't tested on dark-skinned people. Is this really something new?
No. Racism in the legal system, and in our society overall, is not "new." It is, however, still a real problem. As such, race and gender bias that's "accidentally" hard-coded into software sold to police is a serious issue, and every new product containing this flaw is news.
> Why should it come as a shock that facial recognition product developers suffer from the same bias?
At the very least, it should come as a shock that one of the biggest software development firms in the world is attacking critics in defense of buggy behavior, especially since, as you note, this phenomenon is common knowledge and there's a simple and well-known workaround.
Well, here's the question, though: should she be naming and shaming the development company, or the organization that uses the technology? I think the responsibility lies solely with law enforcement, as long as Amazon didn't lie about the product. In fact, I would expect law enforcement to test this software thoroughly themselves, because that's their job.
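And "testing thoroughly" wouldn't require anything exotic: run the model against a labeled benchmark and disaggregate the error rates by demographic group, instead of looking at one aggregate accuracy number. A rough sketch of the idea (the records below are made up for illustration; a real audit would use the vendor's actual output on a properly annotated dataset):

```python
from collections import defaultdict

# Hypothetical records: (predicted_label, true_label, demographic_group).
# A real audit would generate these by running the face-analysis model
# against a benchmark with demographic annotations.
results = [
    ("match", "match", "darker-skinned female"),
    ("match", "no-match", "darker-skinned female"),
    ("no-match", "no-match", "lighter-skinned male"),
    ("match", "match", "lighter-skinned male"),
    # ... many more evaluation records ...
]

# Tally correct and total predictions per demographic group.
tally = defaultdict(lambda: {"correct": 0, "total": 0})
for predicted, actual, group in results:
    tally[group]["total"] += 1
    if predicted == actual:
        tally[group]["correct"] += 1

# Report accuracy disaggregated by group; a large gap between groups
# is exactly the kind of bias a single aggregate number would hide.
for group, counts in sorted(tally.items()):
    accuracy = counts["correct"] / counts["total"]
    print(f"{group}: {accuracy:.1%} ({counts['correct']}/{counts['total']})")
```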
> ...should she be naming and shaming the development company...
Yes, if you're offering a product for sale, it's fair for people to review that product in public. And that's not "naming and shaming"; it's a critical review of an irresponsibly developed product being sold to law enforcement agencies.
> In fact, I would expect law enforcement to test this software thoroughly themselves, because that's their job.
Yes, that is also their responsibility. But where is the recourse? Law enforcement agencies have abysmal records of self-investigation, and the judicial system is unreliable at best when it comes to holding them accountable.
The public has a right to know what technology is being used to police them. If you want to call investigative journalism "naming and shaming," then yes, absolutely, she made the ethical choice in speaking out.