FYI, when orchestras started doing blind auditions, the number of women hired skyrocketed, even though you'd think music would be the one area where what a person looked like & whether they were male or female would be irrelevant.
Yeah, blind hiring won't fix most biases. People's careers reflect the opportunities they have had, and blind hiring seems like it has the potential to just reinforce existing societal biases instead of eliminating them.
They explored a lot of gender-related factors in a data set of several years' worth of public pull requests on GitHub. One of the more interesting findings was that women's pull requests get accepted more often, but that a couple of factors can strongly affect that, including whether the person reviewing the PR knows the author is a woman:
For outsiders, we see evidence for gender bias: women’s acceptance rates drop by 12.0% when their gender is identifiable, compared to when it is not (χ2(df = 1, n = 16,258) = 158, p < .001). There is a smaller 3.8% drop for men (χ2(df = 1, n = 608,764) = 39, p < .001). Women have a higher acceptance rate of pull requests overall (as we reported earlier), but when they are outsiders and their gender is identifiable, they have a lower acceptance rate than men.
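For anyone unfamiliar with the χ2(df = 1, ...) notation in that quote, it's a chi-squared test of independence on a 2x2 table (accepted/rejected vs. gender-identifiable/not). Here's a minimal stdlib-only sketch of that test; the counts below are made up for illustration and are NOT the study's data:

```python
from math import erfc, sqrt

def chi2_2x2(a, b, c, d):
    """Pearson chi-squared statistic and p-value for the 2x2 table
    [[a, b], [c, d]] (df = 1), without continuity correction."""
    n = a + b + c + d
    stat = 0.0
    # Expected count for each cell under independence: row_total * col_total / n
    for obs, row, col in ((a, a + b, a + c), (b, a + b, b + d),
                          (c, c + d, a + c), (d, c + d, b + d)):
        exp = row * col / n
        stat += (obs - exp) ** 2 / exp
    # For df = 1, the chi-squared survival function reduces to erfc(sqrt(x/2)).
    p = erfc(sqrt(stat / 2))
    return stat, p

# Hypothetical counts: accepted/rejected when gender is identifiable vs. not.
stat, p = chi2_2x2(580, 420, 660, 340)
```

A tiny p-value here only says the acceptance rates differ between the two columns more than chance would explain; as the thread notes, it says nothing about *why* they differ.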
This doesn't necessarily imply causation, i.e. that being a woman causes people to accept or reject your pull requests.
Identification online is totally voluntary, so perhaps there is something about people who feel the need to specify their gender online when it isn't necessary (such as when coding) that causes their pull requests to be rejected more.
Yeah, but I mean it's quite plausible. There aren't many women in IT, so I guess those who are try to be especially careful, efficient, etc. On the other hand, there are these biases...
In fact this matches an observation an acquaintance of mine, who is a postdoc in the physics department, made. He teaches small-group courses at the university that accompany the lecture course, where students get their corrected exercises back and present them. He said that the women in those courses are more or less the only ones who are motivated and actually interested in the material, and that their exercises are also better.
I guess this doesn't have much to do with biology, Google echo chambers, etc., but rather with people being aware of biases and counteracting them.
That is possible, but without some data, any data, to back that hypothesis up, it becomes fairly useless. Science doesn't really work by dismissing papers with unsupported alternative explanations for the data.
I had meant to say "many" people---as in many people in my experience.
To me, it's not very surprising as in many countries your picture is customarily included in your curriculum vitae, and as Github is seen as a professional site akin to LinkedIn, people simply put up their pictures there as well.
This is anecdotal, but it's very rare for me to interact with someone on Github and have no idea as to their real name, gender, and face.
---
I'll also add that there does seem to be a cultural divide between us---you believe that people have to "feel a need" to add any information (photo/real name) that would show their gender when it isn't "necessary".
Whereas I feel that "real life" identification is the default, and pseudo-anonymity is a considered choice.
This seems like a problem that needs a cultural solution, not a technical one?
Because knowing who submitted the code lets me manage my time better. E.g. if the submitter is known for producing high-quality code, I only need to spend minimal time browsing through it.
But if the submitter is known to cut corners, I can prepare to spend more time and review each line carefully, because I will almost certainly find issues.
I think it could be helpful for screening first-time contributions at the very least. You're right that a cultural solution is necessary, but in the meantime one way to deal with bias when you know you are biased is to remove the ability for it to operate (when practicable).
Otherwise, you're stuck in this zone where you know you're liable to make bad decisions, but you rationalise to yourself that you can correct for them. No matter how you slice it, it's never going to be possible to exactly correct for your biases; you're going to undershoot or overshoot. Better to minimize their scope of operation.
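As a concrete (and entirely hypothetical) sketch of "removing the ability for bias to operate": you could redact author-identifying headers from a `git format-patch` file before handing it to a reviewer. The header names are the standard ones git emits; the redaction policy itself is just an illustration, not an existing tool:

```python
import re

# Mbox-style headers in a `git format-patch` output that identify the author.
IDENTIFYING = re.compile(r"^(From|Signed-off-by|Co-authored-by):", re.I)

def anonymize_patch(text: str) -> str:
    """Replace author-identifying header lines with a redacted marker,
    leaving the subject and the diff itself untouched."""
    out = []
    for line in text.splitlines():
        if IDENTIFYING.match(line):
            key = line.split(":", 1)[0]
            out.append(f"{key}: <redacted>")
        else:
            out.append(line)
    return "\n".join(out)
```

Of course this only blinds the metadata; as noted below, coding style itself can still give an author away.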
I really like this line of thinking, because it crashes into a realm of philosophy where some people think you can feel your way to a good result, whereas the other approach is to initially intuit the just result and then rationally design a system that achieves it regardless of your instantaneous emotions. Of course, such systems thinking is very sensitive to getting the right intuition up front, so it's important to train it and build in feedback mechanisms in case you start getting results you didn't expect.
But isn’t that sort of the issue? If you make assumptions about overall code quality, you’ll probably let smaller stylistic things slide because “this is a good coder.” I think about this in terms of some of the more contentious projects (think Bitcoin; this would be great for removing some of the personality clashes there).
Interesting idea, but everyone I've worked with has their own distinct flavor of coding, so this wouldn't really hide identity much, unless you didn't know the coder in the first place.
I think this is a great idea.