
> I get paid absurd sums because of supply and demand, not because of my skills;

There's limited supply because it's a rare skill. Not sure what you mean by "not because of my skills", since your skills are rare and hence valuable to the market.

> Thankfully for my mortgage, that hasn't happened yet and probably won't happen soon.

The key word is "hopefully". This has already happened to physicists with nuclear power, if I recall my history lectures correctly.




> There's limited offer because it's a rare skill, not sure what you mean by "not because of my skills" since your skills are rare hence valuable to the market.

Leetcode doesn't make companies money. They can't tie it to their revenue. Being paid because of my skills means I can walk into an interview and say, "I'm a domain expert in X, and my last client saved $Y thanks to my expertise when I optimized their Z."

The leetcode analogue is "I'm a fungible engineer, and since your company basically prints money as long as you have enough fungible engineers to keep the ship running, you should hire me."


> "I'm a domain expert in X and my last client saved $Y thanks to my expertise when I optimized their Z".

And I verify that... how?

LC is basically a proxy for being able to pick up random shit quickly, learn how it works, and demonstrate/use that knowledge. The particular knowledge isn't that important. It probably won't be that important at the big tech co - what's more important is how fast you can pick up their novel processes and stack and start getting effective work done.


>> LC is basically a proxy for being able to pick up random shit quickly

This is the exact opposite of LeetCode. It proves they've crammed the front end of the long tail of problems. It shows absolutely nothing about recognizing general problem domains or being able to identify long-term solutions.

You're right that it has no long-term importance. It's just a different, still artificial, signaling value.


I do think the fact that people cram is a problem with LC. Cramming is still a pretty big barrier.

Personally, and for many people I know, cramming was not necessary because we had a conceptual understanding - maybe a week or two of doing 2-3 a day and you would be set.

Anything you do in an interview is going to be an artificial signaling value. I think LC is a much better signal than the talk-out-your-ass approach you seem to favor, and I've done both sorts of interviews.

The difference is also pretty obvious in the quality of your peers based on what sort of interviews your firm does.


> Personally, and for many people I know, cramming was not necessary because we had a conceptual understanding - maybe a week or two of doing 2-3 a day and you would be set.

Apologies for being pedantic about word use, but that time commitment is longer than what's normally meant by "cram". Traditionally a cram would be up to a day or two of hard work before exams. What you outlined would just be study (an even larger barrier).

> Anything you do in an interview is going to be an artificial signaling value. I think LC is a much better signal than the talk-out-your-ass approach you seem to favor, and I've done both sorts of interviews.

I prefer the talk-out-your-ass approach personally, but maybe with a detour into asking the candidate to describe something technically challenging they worked on. If the interviewer is quick on their feet and the candidate is skilled at explaining, then you can go reasonably deep. After this the interviewer can usually weigh the talk-out-your-ass claims accordingly.

> The difference is also pretty obvious in the quality of your peers based on what sort of interviews your firm does.

Our firm (a fast-growing Series A startup) struggles with this, so take all of the above with a grain of salt. Our problems seem to be more about not being able to generate a large enough talent pool than about inaccuracy in picking candidates. We hire about 25% of genuine applicants, and find our main problems post-hiring are whether people are willing to take initiative and be self-directed.


Recognizing the problem archetype is the most important skill for leetcoding. Is that the same as recognizing general problem domains? No, but it's similar enough to be correlated.

The point of leetcode isn't to find the best engineer, since that's not possible in one day of interviews. It's to find someone who has the thinking skills and is willing to put in the time required to be a great engineer. Cramming being helpful is largely the point, companies want people who are committed to being the best.


I'd rather have a dev who can put a DB in third normal form than one with the highest scores on leetcode


If you take someone with a high score on leetcode and tell them to google third normal form then I guarantee they will be able to do it.

If instead you surprise them with it in an interview, they can't do it, and you don't hire them because of it, well, they're the ones who dodged a bullet, not you.

I know about third normal form because I learned it from my girlfriend's open-on-the-ground textbook in college. It's just not that hard.


I think the idea with that is not that it's hard, but that you know it. You won't get a product person telling you to implement a DB in 3NF, so it doesn't matter how quickly you'd learn via Google, because you wouldn't know to Google it.

Same goes for Leetcode. Anyone can Google how to implement some algorithm, but you won't be told what algorithm to use in your project requirements.

Of course, an interview could just involve getting you to give general descriptions of things instead of implementing them, which would satisfy this to some degree. But then everyone with a bachelor's would pass the interviews, so they had to make them more intensive in order to limit the pool further.
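For concreteness, here's a toy sketch (hypothetical tables, not anyone's real schema) of what "put a DB in third normal form" amounts to: in the flat table below, customer_city depends on customer_id rather than on the key order_id — a transitive dependency that 3NF removes by giving customers their own table.

```python
# Toy 3NF example using sqlite3 (stdlib). Schema and data are made up.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized: the city is repeated on every order and can drift out of sync.
cur.execute("""CREATE TABLE orders_flat (
    order_id      INTEGER PRIMARY KEY,
    customer_id   INTEGER,
    customer_city TEXT,
    amount        REAL)""")
cur.executemany("INSERT INTO orders_flat VALUES (?,?,?,?)", [
    (1, 10, "Lisbon", 9.5),
    (2, 10, "Lisbon", 12.0),
    (3, 11, "Porto", 3.25),
])

# 3NF: every non-key column depends on the key, the whole key,
# and nothing but the key -- so city moves to a customers table.
cur.execute("""CREATE TABLE customers (
    customer_id INTEGER PRIMARY KEY,
    city        TEXT)""")
cur.execute("""CREATE TABLE orders (
    order_id    INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(customer_id),
    amount      REAL)""")
cur.execute("""INSERT INTO customers
    SELECT DISTINCT customer_id, customer_city FROM orders_flat""")
cur.execute("""INSERT INTO orders
    SELECT order_id, customer_id, amount FROM orders_flat""")

# Each city now lives in exactly one row; a join reconstructs the flat view.
rows = cur.execute("""SELECT o.order_id, c.city
    FROM orders o JOIN customers c USING (customer_id)
    ORDER BY o.order_id""").fetchall()
print(rows)  # → [(1, 'Lisbon'), (2, 'Lisbon'), (3, 'Porto')]
```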


In 1995 I "invented" Navigation Meshes. But I didn't call them Navigation meshes because I didn't learn the term until I read about them in AI Game Programming Wisdom in 2004 (a book published in 2002).

Likewise there are plenty of people who have learned database design through a combination of optimizing execution plans and dealing with the practicalities of schema changes. Such people may never have heard the term "Third Normal Form", but will run absolute rings around a college newbie who has.

Indeed, in my career I have met at least two people responsible for database design who did know normal forms but did not know about execution plans, or covering indexes for that matter. Or had a nice normal set of tables, and then proceeded to build a reservation system (hit by 100,000+ users when registration opens) that takes a transactional lock on a single row using an ORM. They understood database theory from 1971, but they didn't know what this specific database implementation actually does.


Perhaps, yeah. Likewise there will be people who've never heard of Big-O notation, and when asked to explain the efficiency of their algorithm and justify why it's optimal they won't be able to, even though they're masterful algorithm builders and did get the optimal solution. Or people who know to build a solution that isn't optimal in theory, but harnesses caches, parallelism, and other real-world features of a computer to be the most effective. And they'll still get docked points on interviews, where a bored engineer is just sitting there looking for someone to regurgitate the right answers. If we're going to take a stand against database questions, I feel like it'd be disingenuous to not do the same against algorithms questions.
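To make the Big-O point concrete, here's a toy instrumented comparison (a made-up example, not from the thread): linear search does O(n) comparisons in the worst case while binary search does O(log n), which is the kind of justification an interviewer expects even from someone who has never seen the notation.

```python
# Count comparisons made by linear vs. binary search on a sorted list.
def linear_search(xs, target):
    comparisons = 0
    for i, x in enumerate(xs):
        comparisons += 1
        if x == target:
            return i, comparisons      # worst case: O(n) comparisons
    return -1, comparisons

def binary_search(xs, target):
    comparisons = 0
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if xs[mid] == target:
            return mid, comparisons    # worst case: O(log n) comparisons
        elif xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

xs = list(range(1024))                 # n = 1024, sorted
_, lin = linear_search(xs, 1023)       # worst case for linear search
_, bin_ = binary_search(xs, 1023)
print(lin, bin_)  # → 1024 11
```

The counts (1024 vs. 11, roughly log2(1024) + 1) are what "explain the efficiency of your algorithm" boils down to, whether or not you call it Big-O.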


I think it is better to conduct interviews hinging on concepts (i.e. LC) rather than "gotcha" questions over whether you have heard a particular term of art.


IMO there's not much difference. You aren't going to figure out how to do dynamic programming from first principles, or re-invent the Floyd-Warshall algorithm, during a 45-minute interview. Or even come up with the idea of trees on your own. Almost all LC questions are predicated on having heard of and learned about various algorithms and data structures, which really isn't any different from questions predicated on having heard of and learned about various database concepts. We're just used to it because LeetCode is the standard, so everyone grinds it.
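For what it's worth, Floyd-Warshall itself is only a few lines once you know the trick (try every vertex k as an intermediate hop), which is rather the point: the hard part is having heard of it. A minimal sketch on a made-up four-node graph:

```python
# All-pairs shortest paths via Floyd-Warshall on an adjacency matrix.
# INF marks "no direct edge"; the graph below is a toy example.
INF = float("inf")

def floyd_warshall(dist):
    """Return the all-pairs shortest-path matrix for an n x n matrix."""
    n = len(dist)
    d = [row[:] for row in dist]      # copy so the input isn't mutated
    for k in range(n):                # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

graph = [
    [0,   3,   INF, 7],
    [8,   0,   2,   INF],
    [5,   INF, 0,   1],
    [2,   INF, INF, 0],
]
print(floyd_warshall(graph))  # e.g. 0→2 shortens from INF to 5 (via 1)
```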


> I can walk into an interview, say "I'm a domain expert in X and my last client saved $Y thanks to my expertise when I optimized their Z".

Billions of people have the skills to say that. Being able to say something doesn't mean that you have the skills to do it. And being able to talk about something for a few minutes after that first sentence also doesn't necessarily mean that you have the skills to do it.

If you had an airtight system for assigning credit for performance at companies, and for showing evidence of what you did to other companies, then what you are talking about would work. But right now nobody even knows how to properly assess the value of employees they work directly with; being able to assess the value of employees you've just heard a few sentences from would be magic.



