
I think (but cannot prove) that along the way, it was decided to explicitly measure ability to 'study to the test'. My theory goes that certain trendsetting companies decided that ability to 'grind at arbitrary technical thing' measures on-job adaptability. And then many other companies followed suit as a cargo cult thing.

If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation? Surely programming ability a) varies over an employee's tenure at a firm and b) is a strong predictor of employee impact over the near term. So I surmise that such companies don't believe this, and that LeetCode therefore serves some other purpose, in some semi-deliberate way.




I do code interviews because most candidates cannot declare a class or variable in a programming language of their choice.

I give a very basic business problem with no connection to any useful algorithm, and explicitly state that there are no gotchas: we know all inputs and here's what they are.

Almost everyone fails this interview, because somehow there are a lot of smooth tech talkers who couldn’t program to save their lives.
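To give a sense of the level, a hypothetical problem of that kind (not my actual question) might be: given a list of line items with known prices and quantities, return the total per customer. A minimal Python sketch of a passing answer:

    # Hypothetical "no gotchas" screening problem: all inputs are known.
    from collections import defaultdict

    class LineItem:
        def __init__(self, customer, price, quantity):
            self.customer = customer
            self.price = price
            self.quantity = quantity

    def totals_by_customer(items):
        totals = defaultdict(int)
        for item in items:
            totals[item.customer] += item.price * item.quantity
        return dict(totals)

    orders = [
        LineItem("acme", 10, 3),
        LineItem("acme", 5, 1),
        LineItem("globex", 3, 10),
    ]
    print(totals_by_customer(orders))  # {'acme': 35, 'globex': 30}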


I think I have a much lazier explanation. LeetCode-style questions were a good way to test expertise in the past. But by the time everyone starts to follow suit, the test becomes ineffective. What's the saying? When everyone is talking about a stock, it's time to sell. Same thing.


> If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation?

Probably because recent job performance is a stronger predictor of near-future job performance.



