Hacker News

I bit my tongue off at one point in my life (jumped from a high height and my knee hit my chin as I was screaming). The fact that it captures most of the details of the scar where it was reattached is phenomenal. Majorly impressed.

There’s some weird banding/rainbow effects around my glasses and the background (not on my face), but that’s the only major artifact that stood out to me.




Thanks!

Glasses are sometimes a little bit of a problem, I don't have enough of those in my training sets.


I mean the obvious question here is... how many BIPOC (Black, Indigenous, People of Colour) do you have in your training sets?


Nah, it's "how many Black, Indigenous, People of Colour do you have who wear glasses and have facial scars from having fallen from a great height while screaming?". If you can't find enough preëxisting BIPOCWWGAHFSFHFFAGHWS people, I suppose you're limited to finding other BIPOC people, giving them glasses, and throwing them from a great height. (Manufacturing them the other way around might be too offensive.)


My question was unironic, because to date there have been a large number of issues with, for example, Zoom's virtual-background feature detecting Black users as part of the background, because Black people were not well represented in the training sets or considered when the model was being built.

Likewise, many cameras do not properly render the skin tones of Black, Indigenous, and other people of colour. This is partly a technological limit: commercial cameras capture only a limited dynamic range (a limited number of stops). But it is also because there is a wide variety of human skin tones in the world, and camera manufacturers do not test against most of them.

Perhaps these papers speak louder for me, given that Hacker News only accepts neoliberal anecdotes :)

"Until recently, due to a light-skin bias embedded in colour film stock emulsions and digital camera design, the rendering of non-Caucasian skin tones was highly deficient and required the development of compensatory practices and technology improvements to redress its shortcomings"

https://cjc-online.ca/index.php/journal/article/view/2196/30...

"For a fixed decision threshold, the African-American image cohort has a higher false match rate and a lower false non-match rate. "

https://arxiv.org/abs/1904.07325
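The fixed-threshold effect that quote describes can be shown with a toy sketch. All scores and the threshold below are made-up numbers, not data from the paper; the point is only that one operating threshold applied to cohorts with different similarity-score distributions yields different error rates per cohort.

```python
# FMR (false match rate): fraction of impostor pairs scored at or above the threshold.
# FNMR (false non-match rate): fraction of genuine pairs scored below it.
def error_rates(genuine_scores, impostor_scores, threshold):
    fmr = sum(s >= threshold for s in impostor_scores) / len(impostor_scores)
    fnmr = sum(s < threshold for s in genuine_scores) / len(genuine_scores)
    return fmr, fnmr

# Hypothetical similarity scores in [0, 1] for two cohorts.
cohort_a = {
    "genuine":  [0.92, 0.88, 0.55, 0.90, 0.85],
    "impostor": [0.30, 0.40, 0.35, 0.25, 0.45],
}
# Cohort B's scores skew higher overall, so at the same threshold it sees
# more false matches and fewer false non-matches -- the pattern in the quote.
cohort_b = {
    "genuine":  [0.93, 0.91, 0.96, 0.89, 0.94],
    "impostor": [0.55, 0.62, 0.48, 0.70, 0.58],
}

THRESHOLD = 0.6  # one fixed operating point applied to everyone

for name, scores in (("A", cohort_a), ("B", cohort_b)):
    fmr, fnmr = error_rates(scores["genuine"], scores["impostor"], THRESHOLD)
    print(f"cohort {name}: FMR={fmr:.2f}, FNMR={fnmr:.2f}")
# cohort A: FMR=0.00, FNMR=0.20
# cohort B: FMR=0.40, FNMR=0.00
```

Vendors typically publish a single global threshold, which is exactly why per-cohort evaluation matters: the same setting can be conservative for one group and permissive for another.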

" The Gender Shades project revealed discrepancies in the classification accuracy of face recognition technologies for different skin tones and sexes. These algorithms consistently demonstrated the poorest accuracy for darker-skinned females and the highest for lighter-skinned males."

https://sitn.hms.harvard.edu/flash/2020/racial-discriminatio...


Ah, I see. Well diagnosed, by the way - I did indeed think you were being ironic, and was going along with the joke, as opposed to being antagonistic in the knowledge you were being sincere.

To respond in sincere mode: while I don’t think it’s terribly important whether black people are rendered correctly by some Zoom feature, nevertheless including a reasonable number of black people in one’s training data sounds like it shouldn’t require any extra effort, and so I think it’s a reasonable enough point to make.


