
This is creepy. First, there is no serious scientific evidence that your facial expression can reveal whether you are homosexual; the correlations here are total garbage. Sexuality is a really complex thing to discuss and, most importantly, a private one. Second, software that detects whether someone is gay seems to me quite similar to Jews being forced to wear the Star of David in Nazi Germany, so everyone could spot them and act against them, including people who were wrongly labeled. Seriously guys, stop this. We have to be really careful about the potential uses of our software. There can be serious impacts on society and on people's lives.



If the paper does what it says, it's predicting user-reported labels.

The problem isn't the existence of software that produces likelihoods of someone being gay. The problem is people interpreting the results in a reductionist way and collapsing people into labels. Another problem is that you just don't like seeing what you don't like. There are companies out there profiting from the same stuff; they just don't talk about it this way.


The ethical and moral implications are enormous. But machine learning and AI are not going away.

We must make our best effort to understand how this works, not bury our heads in the sand while hostile actors use it for ill.


So what do you suggest, for example, in this case?


I can easily imagine this being used by an anti-gay government to target gays. Doesn't matter to them if it is not 100% accurate.



