> I find it necessary to invest substantial effort in calibrating a new question. Before I ask a candidate a new question, I try to use it to mock-interview at least 10 people I have worked closely with.
You could do that same process with the interviewee.
It would be a great way to see how collaborative they are.
i.e. you say "I've never actually worked through this problem myself, so I don't know how deep the rabbit hole is, but let's give it a crack and see what we come up with together."
Sounds a lot more like real life to me than a manufactured question you've already worked through to the nth degree.
I don't think this would give me sufficient data to compare candidates in an unbiased way.
Without establishing objective criteria with which to evaluate candidates, it's much easier to fall back on decision-making processes that are prone to unconscious bias.
A manufactured question may seem impersonal, but that's the point.