My question was unironic: to date there have been a large number of issues with black people using, e.g., Zoom's automatic background feature and being detected as part of the background, because black people were not represented in the training sets or considered when the AI was being built.
Likewise, many cameras do not properly capture the skin tones of black, indigenous, and other people of colour. This is partly a technological limit: commercial cameras only offer so many stops of dynamic range. But it is also because human skin tones vary widely, and camera manufacturers do not test against the majority of them.
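To make the stops point concrete (a back-of-envelope sketch with made-up numbers, not any real camera's spec): dynamic range in stops is just log2 of the luminance ratio a sensor can capture in a single exposure, so any scene whose contrast exceeds the sensor's stops forces clipped shadows or blown highlights.

    # Back-of-envelope: "stops" of dynamic range = log2 of the
    # luminance ratio captured in a single exposure.
    import math

    def stops(max_luminance: float, min_luminance: float) -> float:
        return math.log2(max_luminance / min_luminance)

    # A scene with a 4000:1 contrast ratio needs ~12 stops; a sensor
    # with fewer must clip shadows or highlights, which is where
    # exposing very dark and very light skin in the same frame
    # simultaneously gets hard.
    print(stops(4000.0, 1.0))  # ~11.97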
Perhaps these papers speak louder for me, given that Hacker News only accepts neoliberal anecdotes :)
"Until recently, due to a light-skin bias embedded in colour film stock emulsions and digital camera design, the rendering of non-Caucasian skin tones was highly deficient and required the development of compensatory practices and technology improvements to redress its shortcomings"
" The Gender Shades project revealed discrepancies in the classification accuracy of face recognition technologies for different skin tones and sexes. These algorithms consistently demonstrated the poorest accuracy for darker-skinned females and the highest for lighter-skinned males."
Ah, I see. Well diagnosed, by the way - I did indeed think you were being ironic, and was going along with the joke, as opposed to being antagonistic in the knowledge you were being sincere.
To respond in sincere mode: while I don’t think it’s terribly important whether black people are rendered correctly by some Zoom feature, including a reasonable number of black people in one’s training data shouldn’t require any extra effort, so I think it’s a reasonable enough point to make.