A framework for grading your performance on programming interview problems (docs.google.com)
251 points by danielgh7 on Oct 19, 2021 | 232 comments



I do find it a little ironic that as an industry we started doing these interviews as a "least common denominator" - from senior engineers to fresh college grads, everyone occasionally needs to do a search or a sort or a lookup table, and people should recognize when "loop in a loop" n^2 solutions aren't ideal.
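To make the "loop in a loop" point concrete, here's a hypothetical pair-sum check in Python (the function names and example are mine, purely for illustration):

```python
def has_pair_quadratic(nums, target):
    # Naive "loop in a loop": compares every pair, O(n^2) time.
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return True
    return False

def has_pair_linear(nums, target):
    # Same question with a lookup table: one pass, O(n) time, O(n) space.
    seen = set()
    for x in nums:
        if target - x in seen:
            return True
        seen.add(x)
    return False
```

Recognizing when the second version is the right trade (and when the input is small enough that the first is fine) is roughly the bar these interviews were originally meant to check.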

And we tuned it from there into its own sort of mini-SAT with an accompanying preparatory industry and culture.

I find it as a reminder of my fungibility as an engineer: I get paid absurd sums because of supply and demand, not because of my skills; anybody who can game the leetcode exam (and the architectural diagram quiz and the personality quiz) can walk in and take the same 300k income I have. If enough people do so, wages could be pushed down by employers.

Thankfully for my mortgage, that hasn't happened yet and probably won't happen soon.


I'm honestly completely fine with this, and my worry is that the industry wants to move towards a more "accredited" role with organizations similar to doctors, lawyers, or other professionals to keep others out.

Why am I worried? Because if you're a competent programmer you can spend 3-months to a year studying for these interviews and make $200k+ in TC. You don't need a degree or even much work experience to achieve this.

You don't have to come from an ivy league school or have powerful connections, you simply need to understand a few common leetcode patterns and system design.

If you can understand for loops, variable reassignment, objects & arrays, and some other unique operators like modulo or booleans you can likely answer 70% of LC type questions. That's an extremely low bar, and all it takes is dedicated practice.
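As a sketch of how far those primitives go, here's a hypothetical LC-easy-style problem (rotate an array right by k positions) solved with nothing but a loop, array indexing, and modulo:

```python
def rotate(nums, k):
    # Place each element k positions to the right, wrapping around with modulo.
    n = len(nums)
    out = [0] * n
    for i in range(n):
        out[(i + k) % n] = nums[i]
    return out
```

No recursion, no exotic data structures; just the handful of constructs listed above, plus practice spotting the pattern.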

Even if you don't want to leetcode, there are still plenty of well paying jobs that pay more than the average household income whose barrier to entry is even lower.

What other job or industry honestly has this option? I honestly feel like software development is the last truly "middle class" industry (in the sense other jobs would be, had their salaries kept pace over the last 40 years).


> You don't need a degree

The low barrier to entry is one the greatest things about programming. I don't have a CS degree, but for a while I spent some time actively studying data structures and algorithms on my own (and solving hundreds of programming problems at "online judge"-type sites like SPOJ; it's fun!). I don't feel in any way "inferior" to people with a degree.


> I don't feel in any way "inferior" to people with a degree.

I did the same thing, but I do feel inferior to degreed folks. Not skill-wise; I know people are all over the board with their skill level. It comes down to this statement: "I have two candidates who are equal in every way except candidate A never finished college. Candidate B gets hired!"

Going this route you have to be better than everyone else with a degree: If you're not, the person with the degree will win, every time. I did this with personal projects/passion/odd jobs, and now with a decade of work experience as an engineer.

Sometimes I want to finish my degree (I got an associate's, so halfway there!) but the ROI doesn't make sense with school being so expensive. My big fear is that I'll never get hired after this job because there are lots of qualified candidates who went to school, and I just don't stack up.

EDIT: It's also affected my willingness to change jobs, my confidence in myself, and as a result my salary. I make $120K/yr in a high cost-of-living area. If you're reading this as a youngster: GO TO COLLEGE AND FINISH IT!


I've sat in on probably a couple thousand hiring decisions by now. (It's not an achievement, it's a battle scar ;)

I've literally never seen a decision being made on the basis of having a degree. This might still happen in very small companies, but the industry as a whole doesn't care much about degrees when it comes to the hiring stage.

The thing the degree buys you is getting in front of the hiring people in the first place, if you don't have any work experience.

So, if you don't have any, yes: GO TO COLLEGE AND FINISH IT. (Don't worry too much if it's not Stanford, or Berkeley, or MIT, or whatnot - but get that degree. It's a shibboleth)

But if you've gotten past that point (like OP has) - you now have experience. That's incredibly valuable, especially right now. I'd suggest you interview. (The market is extremely generous right now). Don't sweat the degree or lack thereof.


I have never interviewed two candidates who I wanted to hire who were remotely close to equal. To the extent that two or more candidates are equal, it's that they are equally bad. Among competent developers the amount of variability among candidates is absolutely enormous.


> [snip] It comes down to this statement: "I have two candidates who are equal in every way except candidate A never finished college. Candidate B gets hired!"

> Going this route you have to be better than everyone else with a degree: If you're not, the person with the degree will win, every time.

> [snip] My big fear is that I'll never get hired after this job because there are lots of qualified candidates who went to school, and I just don't stack up.

Oh boy, this resonates so strongly with me. I even have actual proof that the opposite is true (when I asked about other candidates at my first "real" job while I was still in college, my boss said they hired me over people with a degree because of my github), and yet I still felt this way! In the end, I finished my degree not because it made me more intrinsically valuable or a 10x rockstar code god, but because of the exact fear you're talking about. I wish I were the kind of person who didn't have to do that.


That's basically why I finished my degree. My dad had the same experience as you, and told me about it many times. I finished my degree out of respect for his experience, knowing that I wouldn't want to live through the same thing.


>I have two candidates who are equal in every way

This never, ever, ever, ever happens. It's a complete non-story.


This happens with graduate degrees also (not having one hindered a number of prospects)


> I don't feel in any way "inferior" to people with a degree.

While I don't feel inferior to them either, I think people with degrees have two major advantages.

1. They are pre-selected.

The ability and means to get into good schools and finish courses are powerful signals. It may or may not translate into competence but it certainly means something. People generally have positive perceptions.

2. They network.

Students are surrounded by like-minded peers, professors. Knowing these people personally can and often does make all the difference in the world. Sometimes people start companies together. Sometimes people recommend others they know personally and they get a job. Sometimes companies will recruit only from top schools.


Uh, coding is just a small part of software development. That job interviews are exclusively focusing on that aspect is pretty frustrating (and somewhat concerning).

I got the impression that the lack of formal education (in CS) left me with a gap in background information which I only partly closed over the years.


While I agree, as someone who is effectively self-taught, it's really troubling to me what we get paid for the quality and quantity we produce collectively.

I bet a lot of folks here won't want to hear this, but we really are vastly overpaid for what we do.

Everywhere I've worked, and I mean everywhere, there's a lot of frankly bad practices that are bad for users and create bullshit work for everyone involved. Just because you've installed a bunch of linters, have 80%+ coverage, and use Code Climate does not mean that your code isn't fundamentally flawed and a liability.

I do think this stems largely from there being effectively no credentials required to be a software engineer. Whether or not your code is "good" mostly depends on your title, seniority, and social clout. If the software breaks, you just keep fixing it and you look like a hero because you're constantly opening and closing Jira tickets. Junior developers learn to code without bothering to consider the philosophy, history, and discipline of code.

It's all just to "get shit done", and there's consequences to this. Things are written to not be accessible, and if they are written poorly then it can be difficult to implement accessibility, thus many are cut out of participation. People's account data and credentials get exposed. Countless millions of $ are spent deciphering code written by people who believe their shit doesn't stink (the code documents itself bruh), and countless hours of lifetime are depleted. People are outright ripped off when companies release software, namely games, that are written atrociously.

Do we really need this many engineers to write GUIs around databases? I seriously question this. Should we really get paid this much for a job where we can be inefficient with almost no consequence?

I argue that we should have standardized accreditation.

How we would implement it, I don't really know. The obvious counterargument is how bass-ackwards many college computer science and programming courses are.


>I bet a lot of folks here won't want to hear this, but we really are vastly overpaid for what we do.

Yeah, coz this view is driven largely by the just world fallacy. Either that or the perplexing idea that quality of life is driven by iphone ownership and cheap clothes rather than availability of medical care, education and a roof over your head.

The fact that education, home ownership and medical care are still reasonably priced compared to our incomes but almost nobody else's while wealth inequality has taken off like a rocket suggests that it is most other professions that are underpaid - not us who are overpaid.

The fact that our incomes kept up with productivity while median incomes decoupled in 1981 suggests likewise.

I'm shopping for houses not quite as good as what my teacher parents were 40 years ago and I'm in the top 2% of incomes while they were a bit above average.


> Yeah, this view is horribly toxic.

Yeah, how dare I have a viewpoint. Goodness gracious.

> The fact that education, home ownership and medical care are reasonably priced compared to our incomes but nobody else's while wealth inequality has taken off like a rocket suggests that it is most other professions that are underpaid - not us who are overpaid.

Both can be true at the same time. In my personal opinion, if a lot of common practices seen in the software industry went away, the cushiness of our jobs would be glaringly obvious. Did you not see that my point is that I witness us creating busywork for ourselves? If it could be measured and validated that my experience matches reality, what would be the argument for being paid this much?

And yes, I know that different parts of the world pay engineers differently. I can only really comment on how engineers in much of the United States are paid.

If I can't call out work for being shoddy and excessive, then that kind of proves my point. Why should I get paid more than other jobs that require more skill, even when my work is garbage?

> The fact that education, home ownership and medical care are reasonably priced compared to our incomes

Come again? I wouldn't call any of those things reasonably priced even at my income.


>Come again? I wouldn't call any of those things reasonably priced even at my income.

To me that says underpaid.

Personally, I can afford them. I'd say that makes me middle class. Most of my friends cannot. They are assuredly not.

In the 1950s these things were available on a non-college degree wage.


Whether something is reasonably priced doesn't really have a 1-to-1 relationship with whether you can afford it.

I could certainly afford a home. Would it provide the same value that a home did to my ancestors? Probably not. It wouldn't be a good use of my wealth at this time.

> In the 1950s these things were available on a non-college degree wage.

Yet I am being underpaid? Sure, technically speaking, we are all underpaid if you compare things like workforce participation to wages and CPI. None of that really undermines my point that I don't think we should be paid as much as we are for the quality of work we are producing.

You don't have to like it or agree, but no, I refuse to believe that I'm a bad guy for concluding that there's a racket-like quality to the software industry. In politics there are clubs where political performance has little bearing on the distribution of wealth, so why would any other industry necessarily be different?


>Yet I am being underpaid? Sure, technically speaking, we are all underpaid if you compare things like workforce participation to wages and CPI. None of that really undermines my point that I don't think we should be paid as much as we are for the quality of work we are producing.

Except it does...?

Just saying it doesn't doesn't make it true.

>You don't have to like it or agree, but no, I refuse to believe that I'm a bad guy for concluding that there's a racket-like quality to the software industry.

Software engineering isn't a racket.

The reason we're paid middle class wages (unlike, say, teachers) is because:

A) capital controls the stocks and flows of most society's wealth and because

B) we can potentially trigger large increases to productivity so the ceiling on our value add is very very high.

C) there are not enough experienced programmers to satisfy capital's insatiable demand for automation.


> Except it does...?

These things are relative to one another. The way you present your case that we are all underpaid is very absolutist and lacks the nuance that someone can be underpaid in the grand scheme but overpaid at a smaller scale.

> The reason we're paid middle class wages (unlike, say, teachers) is because:

I don't really have the time to adequately address every one of those reasons, so all I'll say is that if economics were that straightforward and honest, we'd have a lot more millionaires and billionaires. At best, reason B is true absent any sort of incompetence or meaningful quality control around production. When you consider how the revenue of many companies is not tightly coupled with productivity and profits, your perspective doesn't seem to add up.


>These things are relative to one another. The way you present your case that we are all underpaid is very absolutist

I didn't really disagree that "underpaid/overpaid is relative", I just used a different default frame of reference to you. One that made you angry I guess?

> and lacks the nuance that someone can be underpaid in the grand scheme but overpaid at a smaller scale.

This point sounds horribly confused to me. Earlier you said we are all vastly overpaid. Now we are both underpaid and overpaid? Depending on the scale of... something?

>I don't really have the time to adequately address every one of those reasons, so all I have to say is that if economics were that straight forward and honest then we'd have a lot more millionaires and billionaires.

I really have no idea what you're getting at here. It doesn't make any sense to me.


Nah, I wasn't angry. In any case, I may have just misunderstood you.


> Why should I get paid more than other jobs that require more skill despite if my work is garbage?

it's a mistake to think that degree of skill is or should be the only input to compensation. whether the skill is actually useful to other people is at least as important. there is still a lot of low-hanging fruit in software. even a mediocre engineer who messes up a lot of stuff can still provide a lot of value. I think that's likely to end given enough time, but for now, it's reality.

also as an aside, I think it is easy to forget that software is actually kind of challenging when you're surrounded by other people that do it as a full-time job. it's not rocket science or open-heart surgery, but when I help my friends who are trying to break into software without much tech background, I realize there is a lot of knowledge I take for granted.


This is the typical “just world” rant of the junior to mid-range devs. Did this exact rant back in my day.

I’ve been in those shops that “consider the philosophy, history and discipline of code”. Your counterparts in those organizations? They’re ranting against the waste they see going on.

The reality is more nuanced. If you were handed a multi-thousand developer organization, business goals, and a fixed budget by the BoD via the CEO and CIO, your theories of “how it should be” won’t make it past the first fiscal quarter.

This is the next level awareness of what Brooks meant by “no silver bullet”. He wasn’t just talking about coding tooling, he also meant these kinds of diatribes we all evolve through in our career.

We aren’t paid to only cut code. We’re paid to help deliver an outcome. Outcomes at the business level are not evaluated by the same criteria you personally use, despite what your immediate business liaisons tell you; you need the C-to-upper-management perspective to start to understand the real decision factors. Before you make assumptions about companies not knowing at industry aggregates level how to pay developers, first go manage a multi-hundred to multi-thousand person engineering organization with balance sheet and P/L LOB insight into their real impact as crunched by the finance wonks. Then you’d know the real cost benefit you currently assume the finance people in aggregate don’t know.

We absolutely can and must do better as an industry, and I see us every month making incremental progress (this is much more difficult to see in the first half of your career because you struggle to get a feel for what is and isn’t progress that business cares about). That doesn’t preclude us from obtaining useful outcomes given prevailing conditions (including pay as a very small factor).


> we are vastly overpaid for what we do.

I highly disagree. I think everyone else is underpaid.

I won't explain further because I don't think discussions are allowed to get more political than that here.


I agree. It's hard to argue software developers are overpaid when the executive leadership are making hundreds of millions (some even billions).

Seems absolutely terrible to attack the smaller fish compared to the truly 0.01%. Especially when the billionaires are directly influencing the country.

Let me know when a staff engineer at Twilio or Square is able to influence the media people consume, and I might feign worry.


The "high tech illusion" concept spoken about by the book "Peopleware" relates to this. It's the idea that software engineering is viewed as a high-tech profession on par with something like rocket engineering. In reality, software engineering is closer to an assembly line process wherein we simply arrange and connect pre-made components (e.g. APIs).


I don't think it's true that your educational background doesn't matter at all.

At least at the tech co where I work in ML, every single one of my peers has a PhD and while I don't, I went to Harvard.


How do you tell if someone went to Harvard? Don't worry, they'll tell you in the first 30 seconds.


Honestly, I never mention it in real life unless directly asked because people (especially in tech, I've noticed) seem to be very sensitive around "name-brand" schools, ie. with comments like this.

But I think it has direct relevance to the GP comment.


I've read this here before but I honestly think there should be more regulations about who can be hired. The name of the school should not matter in the slightest if one has studied the same thing.

Making it impossible for the employer to know that detail would also incentivize all the schools to provide a good education. Also, many top universities are mostly about making connections. And while their curriculum is usually very hard, that difficulty is also reflected in the grades: people may still pass, but their average drops.

In the US it's even worse, because tuition has to be paid. This already filters out many people from going to a "more elite" school.


Or more likely, they vaguely say they "went to school in Boston"


If you answer that when asked point-blank what school you went to, I think you are an asshole.

That said, I can understand the impulse unless you want to be called "Mr. Harvard" every time you do some dumb shit in the office.


"went to school somewhere northeast of NYC"


If they are Vegetarian and they went to Harvard which one do they tell first?


Better yet - I actually am vegan and went to Harvard :)

So I guess Harvard wins out! In real life, veganism definitely does though :)


There may be a difference between data science or ML work and software engineering. It's commonly stated, accurately IMO, that you don't really need an advanced math background to be a software engineer or developer. I tend to agree, as long as I'm also allowed to say that I believe it's a very difficult, complicated, and intricate job. It just doesn't require calculus, linear algebra, or higher.

For ML? Well, I do know people who have never taken a formal course in anything above calculus who absolutely do apply these techniques properly and are well versed in how they work. But to understand how k-means clustering works, or how to construct a gradient vector... yeah, this requires multivariable calculus and non-linear optimization across systems of equations. And while you don't need to know how the underlying math works to use kmeans or neural nets, the mathematics is a lot closer to your work.
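For a sense of how close the math sits to the code in ML work, here's a toy finite-difference gradient (an illustrative sketch only; real ML code would lean on an autodiff library rather than hand-rolling this):

```python
def numerical_gradient(f, x, h=1e-6):
    # Approximate the gradient of f: R^n -> R at point x using
    # central differences: df/dx_i ~ (f(x + h*e_i) - f(x - h*e_i)) / (2h).
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

# Toy objective: f(x, y) = x^2 + 3y, whose gradient is (2x, 3).
f = lambda v: v[0] ** 2 + 3 * v[1]
```

Knowing why this approximation works, and when it breaks down numerically, is exactly the multivariable-calculus background being described here.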

Under these circumstances, I can see why a lot of companies might just start looking for STEM degrees from known programs in a way they don't really with SEs.


The Wright brothers never had a pilot's license.


No, but they did own a successful bike shop for the better part of a decade. Don't buy the narrative that the first airplane only cost $28,000 so someone with the same grit as the Wright brothers could have achieved that first flight. Or do, but do be careful not to draw unrealistic conclusions.

If you have a decade of industry experience, no one cares that you went to some no-name college. It's pretty easy to get callbacks as a senior software developer. But don't mistake that for meaning that educational background is irrelevant. It's far harder to get callbacks if your resume has a bootcamp for education, and no experience. Throw in a felony criminal record, and the job you'll get a Lyft is fixing bicycles for $17/hr. Which isn't a bad job, actually, but it's a far cry from $200k/yr. Meanwhile, someone at an internship at Google or Facebook while working on two degrees from MIT will be making the equivalent to $100k before ever graduating.

Experience matters far more than education, and the software industry isn't some egalitarian ideal, no matter how many millionaires were minted in the 90's and 2000's, and continue to get minted.

The argument that someone could go make their own app and sidestep the need to get a pilot's license is a poor one. If you look at all the shovelware, spyware, and straight up scams on both Apple and Google's app stores, it's clear that there's more to being a successful app developer than writing a good app, and the exceptions there (like Christian Selig) are akin to literal rock stars. Good for them and all, but the rock star's backup singers' names and stories get lost to history, nor do they get the same fame or fortune.

It's like when exceptional Black or women candidates are held up as examples of equality. The exceptional candidate isn't the one to ask about; truly exceptional people always bubble up. Where do the average or lower-than-average candidates (with reasonable educations) end up? How many never even make it into the field, in a failure-to-launch scenario, because they never get that first break that you did?

The Wright brothers never had a pilot's license, but who here has heard of Charlie Taylor, their machinist?


The Wright brothers weren't trying to get hired by large, inefficient at hiring, aeronautics firms.


The idea is that you don't need a license or a degree to work in new fields, and software engineering is full of new fields in their infancy.

To say you only need a certain background to work is horribly misguided. You can be productive in many fields with a wide array of knowledge and education.

I'm not saying a frontend developer can write novel ML algorithms, but surely they can contribute in other capacities (like creating interfaces to work with the novel ML algorithms, which is meaningful in itself and requires new UX paradigms surely).


You're completely misinterpreting my point, and perhaps conflating is and ought.

In the industry, your educational background does matter, especially for the type of firm that asks leetcode questions.

I was not saying you need a certain background to be able to be productive in tech, that is a strawman.

But I would defend that your educational background is a signal (very far from a perfect one) at how productive you would be at knowledge work.


If they had to pilot modern aircraft, with much more complicated controls and carrying hundreds of passengers, they would definitely be required to have one.


Is Harvard known for having a particularly strong ML program compared to other schools which aren't household names, like UIUC and Georgia Tech?


Per capita it is probably stronger than many schools, overall I doubt it is better - and certainly not than top CS schools.

That said, in undergrad for ML, I think some of the more important things are giving you a strong basis in linear algebra, vector calculus, and numerical computing - and I certainly got that there.


Ah I think it's overrated signalling. I've worked in R&D tech teams where people doing some of the best work on say functional language compilers and authoring type theory papers haven't had degrees. Lead of that team had a BSc. Hired many PhDs but there wasn't much of a correlation between that and output.


Oh - I wasn't trying to say it's a good signal. PhDs in particular I think are overrated as a signal - but maybe I'm biased because I only have an AB.


there's a lot of wiggle room to differ about what you need to learn. If you managed to prep for this with no degree in 3 months, well dayum. I consider that impressive, and I'd say that's not a low bar at all.

I'm not allowed to discuss my specific google interview questions, but I'd say they're similar to some of the harder questions in "cracking the coding interview". On the line of "find all matching subtrees in a binary tree". Remember, you have to do this in 45 minutes, and the impression I got from my feedback is that you need to make substantial coding progress (again, in 45 min at a whiteboard).

That's hardly impossible, but I think it's notably tougher than "loops, variable reassignment, objects & arrays, and some other unique operators like modulo or booleans " There's no way I'd suggest someone go into a google SWE interview with only that prep. I'd say you need to be very sharp on recursion, tree traversal, sorting, set permutations, and a number of other algorithms - and I disagree that you can memorize solutions. You should have them down cold, but you won't be asked to regurgitate them - instead, you will need to be able to think on your feet to figure how these algorithms can be adapted to a variant of a problem you've seen but not quite in the form they're asking about.
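For readers unfamiliar with that class of question, here's a minimal sketch of what a "matching subtrees" problem might involve (my own reconstruction; the actual interview questions are under NDA, as noted above):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def same_tree(a, b):
    # Two trees match if both are empty, or both roots agree and
    # the left and right subtrees match recursively.
    if a is None or b is None:
        return a is b
    return (a.val == b.val
            and same_tree(a.left, b.left)
            and same_tree(a.right, b.right))

def count_matching_subtrees(root, pattern):
    # Try the pattern rooted at every node: O(n*m) in the worst case.
    if root is None:
        return 0
    here = 1 if same_tree(root, pattern) else 0
    return (here
            + count_matching_subtrees(root.left, pattern)
            + count_matching_subtrees(root.right, pattern))
```

Writing something like this cold in 45 minutes at a whiteboard, then adapting it when the interviewer adds a twist, is the "think on your feet" part.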


Not everyone can think lol.


I've begrudgingly been doing leetcode for the last few weeks in prep for looking for a new job. And a lot of it is stuff that I have never needed in 20+ years of coding.

But I will say reviewing the graph algorithms hasn't been a complete waste of time. It's good to at least know what they are, what they can do, and what their tradeoffs are. I realize I was a bit fuzzy on these.
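A small example of the tradeoff knowledge worth refreshing: BFS gives shortest paths on unweighted graphs in O(V + E), while weighted edges push you toward Dijkstra. A sketch, with the graph as an adjacency dict (my own toy example):

```python
from collections import deque

def shortest_path_len(graph, start, goal):
    # BFS over an unweighted graph: the first time we reach goal is optimal.
    if start == goal:
        return 0
    seen = {start}
    queue = deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt == goal:
                return dist + 1
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, dist + 1))
    return -1  # goal unreachable from start
```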


Same. Some of the questions/solutions are very aesthetically pleasing. While I find it pretty satisfying, I still have a million things I'd rather be doing.


I should also add that of the several interviews I've done so far all but one asked practical implementation questions. And the one that did ask more leetcode style questions only asked easy ones.

It would be different if I were applying at FAANG style companies I'm sure.


If only it were like the SATs. With the SATs, the questions are eventually published, and you get a score report indicating exactly what you missed and why, along with a percentile breakdown.

Google's interview exams are graded in secret, and you have to sign a non-disclosure to take the exam. I suppose scores for my interview exam are stored in some database somewhere internally at google, along with photographs of the whiteboard at the end of my various 45 minute sessions and notes, but I'm not allowed to know what they are (they're secret).

I'm ok with google doing these interview exams (and fully embrace that it isn't up to me). But I do think that a certain bill of rights between examinee and examiner slowly evolved over the years - one of which is that the people who took the test should get some feedback on how or why they failed or passed, along with a score.

Instead, due in part to fear of lawsuits, you get very vague feedback from a recruiter. Mine was: analyzed the problem well, but didn't make sufficient progress on the coding section in the time allotted. I have no idea if this is true, someone being nice, how much more progress I would have needed to make...

I no longer participate in exams under these conditions, but I'm just one person with zero clout, and with $300k on the line, no, I'm not even slightly surprised that many people will do so.


I have changed my mind about a dozen times with how I feel about the whiteboard problems for interviews, and I am somewhat on the side of "they're not good", largely for the reasons that you mentioned; it's not hard to find online tutorials to practice specifically for these kinds of problems, which doesn't necessarily correlate with a good worker. I mean, hell, I took advantage of this very feature when I started out; until very recently I was a college dropout who bought a few books on data structures and algorithms, and managed to find engineering work relatively quickly because I knew enough to do well in whiteboard interviews.

I know several engineers who are several times smarter than me, who could build basically any application asked of them, who always have trouble finding work because they don't do well with these whiteboard problems. Of course they could study and learn to get good at these problems, but at that point, what exactly are we testing?

I also think these problems are kind of bad for the other direction. When I interviewed at Apple, they gave me typical big-O notation whiteboard problems, I did well enough on them, got an offer, and accepted the job. I just assumed that if they were asking me hard CS theory problems, there would be a lot of CS theory work at my position there. I was completely wrong about that; the job ended up being extremely bureaucratic, and I spent substantially more time replying to emails and attending meetings than I ever did coding, let alone solving interesting problems [1]. I managed to hang on for about two years, but after a certain point I felt that I had had enough and left, and I always felt that I wouldn't have wasted my time if the interview process had more accurately reflected the actual work I ended up doing.

[1] I think I had a particularly bad experience at Apple; other people working there seem to have enjoyed it substantially more than I did. It's highly dependent on the team you're on.


There are certainly individuals who refuse to learn how to do whiteboard problems on principle, and if they are having trouble finding a job, I'm fine with that. They'll probably be a pain in the ass to work with.

However, my concern with whiteboard problems is that there are plenty of people who are really good, but get terrified in a whiteboard situation. And some of the absolute smartest people aren't able to think quickly. That is to say, for most questions, everyone else will get the answer first, but for the hardest questions, nobody else will get the answer at all.


> I get paid absurd sums because of supply and demand, not because of my skills;

There's limited supply because it's a rare skill; I'm not sure what you mean by "not because of my skills", since your skills are rare and hence valuable to the market.

> Thankfully for my mortgage, that hasn't happened yet and probably won't happen soon.

The keyword is "hopefully". This already happened to physicists with nuclear power, if I recall my history lectures correctly.


> There's limited supply because it's a rare skill; I'm not sure what you mean by "not because of my skills", since your skills are rare and hence valuable to the market.

Leetcode doesn't make companies money. They can't tie it to their revenue. Being paid because of my skills means I can walk into an interview, say "I'm a domain expert in X and my last client saved $Y thanks to my expertise when I optimized their Z".

The leetcode analogue is "I'm a fungible engineer, and since your company basically prints money as long as you have enough fungible engineers to keep the ship running, you should hire me"


> "I'm a domain expert in X and my last client saved $Y thanks to my expertise when I optimized their Z".

And I verify that... how?

LC is basically a proxy for being able to pick up random shit quickly, learn how it works, and demonstrate/use that knowledge. The particular knowledge isn't that important. It probably won't be that important at the big tech co - what's more important is how fast you can pick up their novel processes and stack and start getting effective work done.


>> LC is basically a proxy for being able to pick up random shit quickly

This is the exact opposite of LeetCode. It proves they've crammed the head of the long tail of problems. It shows absolutely nothing about recognizing general problem domains and being able to identify long-term solutions.

You're right it has no long term importance. It's just a different, still artificial signaling value.


I do think the fact that people cram is a problem with LC. Cramming is still a pretty big barrier.

Personally, and for many people I know, cramming was not necessary because we had a conceptual understanding - maybe a week or two of doing 2-3 a day and you would be set.

Anything you do in an interview is going to be an artificial signaling value. I think LC is a much better signal than the talk-out-your-ass approach you seem to favor, and I've done both sorts of interviews.

The difference is also pretty obvious in the quality of your peers based on what sort of interviews your firm does.


> Personally, and for many people I know, cramming was not necessary because we had a conceptual understanding - maybe a week or two of doing 2-3 a day and you would be set.

Apologies for being pedantic about word use but that time commitment is longer than what's normally meant by cram. Traditionally a cram would be up to a day or two of hard work before exams. What you outlined would just be study (an even larger barrier).

> Anything you do in an interview is going to be an artificial signaling value. I think LC is a much better signal than the talk-out-your-ass approach you seem to favor, and I've done both sorts of interviews.

I prefer the talk-out-your-ass approach personally, but maybe with a detour into asking the candidate to describe something technically challenging they worked on. If the interviewer is fast on their feet and the candidate is skilled at explaining, then you can go reasonably deep. After this the interviewer can usually weigh the talk-out-your-ass claims accordingly.

> The difference is also pretty obvious in the quality of your peers based on what sort of interviews your firm does.

Our firm (a fast-growing series A startup) struggles with this, so take all of the above with a grain of salt. Our problems seem to be more related to not being able to generate a large enough talent pool, not to inaccuracy in picking candidates. We hire about 25% of genuine applicants and find our main problems post-hiring are whether they are willing to take initiative and be self-directed.


Recognizing the problem archetype is the most important skill for leetcoding. Is that the same as recognizing general problem domains? No, but it's similar enough to be correlated.

The point of leetcode isn't to find the best engineer, since that's not possible in one day of interviews. It's to find someone who has the thinking skills and is willing to put in the time required to be a great engineer. Cramming being helpful is largely the point, companies want people who are committed to being the best.


I'd rather have a dev who can put a DB in third normal form than one with the highest scores on leetcode


If you take someone with a high score on leetcode and tell them to google third normal form then I guarantee they will be able to do it.

If instead you surprise them with it in an interview, they can't do it, and you don't hire them because of it, well, they are the ones that dodged a bullet not you.

I know about third normal form because I learned it from my girlfriend's open-on-the-ground textbook in college. It's just not that hard.


I think the idea with that is not that it's hard, but that you know it. You won't get a product person telling you to implement a DB in 3NF - so it doesn't matter how quickly you'd learn via Google, as you wouldn't Google it.

Same goes for Leetcode. Anyone can Google how to implement some algorithm, but you won't be told what algorithm to use in your project requirements.

Of course, an interview could just involve getting you to give general descriptions of things instead of implementing them, which would satisfy this to some degree. But then everyone with a bachelor's would pass the interviews, so they had to make them more intensive in order to limit the pool further.


In 1995 I "invented" Navigation Meshes. But I didn't call them Navigation meshes because I didn't learn the term until I read about them in AI Game Programming Wisdom in 2004 (a book published in 2002).

Likewise there are plenty of people who have learned database design through a combination of optimizing execution plans and dealing with the practicalities of schema changes. Such people may never have heard the term "Third Normal Form", but will run absolute rings around a college newbie who has. Indeed, in my career I have met at least two people responsible for database design who did know normal forms but did not know about execution plans, or covering indexes for that matter. Or they had a nicely normalized set of tables, and then proceeded to build a reservation system (hit by 100,000+ users when registration opens) that takes a transactional lock on a single row using an ORM. They understood database theory from 1971, but they didn't know what this specific database implementation actually does.


Perhaps, yeah. Likewise there will be people who've never heard of Big-O notation, and when asked to explain the efficiency of their algorithm and justify why it's optimal they won't be able to, even though they're masterful algorithm builders and did get the optimal solution. Or people who know to build a solution that isn't optimal in theory, but harnesses caches, parallelism, and other real-world features of a computer to be the most effective. And they'll still get docked points on interviews, where a bored engineer is just sitting there looking for someone to regurgitate the right answers. If we're going to take a stand against database questions, I feel like it'd be disingenuous to not do the same against algorithms questions.


I think it is better to conduct interviews hinging on concepts (i.e. LC) rather than "gotcha" questions over whether you have heard a particular term of art.


IMO there's not much difference. Like you aren't going to figure out how to do dynamic programming from first principles, or re-invent the Floyd-Warshall algorithm during a 45 minute interview. Or even come up with the idea of trees on your own. Almost all LC questions are predicated on having heard of and learned about various algorithms and data structures, which really isn't any different from questions predicated on having heard of and learned about various database concepts. We're just used to it because LeetCode is the standard, so everyone grinds it.
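(To make that concrete, here's a rough Python sketch of Floyd-Warshall. The whole algorithm is a few lines, but the insight that you can relax paths through intermediate vertices in index order isn't something most people would rediscover in a 45 minute interview.)

```python
def floyd_warshall(dist):
    """All-pairs shortest paths.

    dist is an n x n matrix of edge weights, with float('inf')
    for missing edges and 0 on the diagonal. Mutates and returns it.
    """
    n = len(dist)
    for k in range(n):          # allow vertex k as an intermediate hop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist
```

Get the loop order wrong (k innermost, say) and it silently produces wrong answers, which is exactly why it's a "have you seen it before" question.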


> I can walk into an interview, say "I'm a domain expert in X and my last client saved $Y thanks to my expertise when I optimized their Z".

Billions of people have the skills to say that. Being able to say something doesn't mean that you have the skills to do it. And being able to talk about something for a few minutes after that first sentence also doesn't necessarily mean that you have the skills to do it.

If you had an airtight system for assigning credit for performance at companies, and then being able to show evidence of what you did to other companies, then what you are talking about would work. But right now nobody even knows how to properly assess the value of employees they work directly with, being able to assess the value of employees you just heard a few sentences from would be magic.


>> And we tuned it from there to into its own sort of mini-SAT with an accompanying preparatory industry and culture.

Seriously. Interview Kickstart recently quoted $8500 for two months to prep for the FAANG exams. The entire session felt like a high-pressure sales pitch from a timeshare salesperson. Here we are SF/SV 2021!


I work at a FAANG, but I don’t really understand what a “leet code” interview is? I’ve interviewed hundreds of people, and hired a number of them. What makes a great engineer is enthusiasm, curiosity, good communication skills, and a willingness to adapt to the processes of the team.

Initiative & leadership are nice, too, if I think I might need a manager in the wings.

Ambition is ok? It’s hard to temper highly ambitious people.

Coding is a mechanical skill you can assess in 5 minutes at the whiteboard during your interview while you look for the important skills.


> Coding is a mechanical skill you can assess in 5 minutes at the whiteboard during your interview while you look for the important skills.

If this was true, we wouldn't have leetcode style interviews.

> I don’t really understand what a “leet code” interview is?

To me it's the puzzle style questions; or questions you have no way of answering without knowing or being exposed to a certain trick.


It's ironic, given that algorithmic whiteboarding questions were developed as a reaction to '90s Microsoft "how do you move Mount Fuji" brain teasers. I would not call Leetcode questions puzzles in the same sense, yet many questions do require some foreknowledge of having seen the question to get to the optimal solution.

There could be an article or analysis written on that subject. What is the pedagogical gap between common CS education (both academic and not) and being able to intuitively leetcode? Students are learning the data structures and algorithms, but perhaps not in a way that prepares them to approach problems the way leetcode-style platforms force them to really utilize that knowledge. Maybe there are problem-solving approaches not being covered?

A survey of the most commonly asked leetcode problems in industry would be illustrative as well. Some questions rely on advance knowledge that isn't really found in academic computer science. What CS course teaches two pointers - isn't that a competitive programming technique?
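(For anyone unfamiliar with the term: "two pointer" is the competitive-programming pattern of walking an index inward from each end of a sorted array. A minimal sketch - the function name is mine, not from any course:)

```python
def pair_with_sum(nums, target):
    """Find two values in a sorted list that sum to target,
    in O(n) instead of the naive O(n^2) check of every pair."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return nums[lo], nums[hi]
        if s < target:
            lo += 1   # need a bigger sum: move the left pointer right
        else:
            hi -= 1   # need a smaller sum: move the right pointer left
    return None       # no such pair exists
```

The trick is entirely in noticing that sortedness lets you discard one element per step, which is the kind of thing you either know or don't under interview pressure.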


> I would not call Leetcode questions puzzles in the same sense, yet many questions do require some foreknowledge of having seen the question to get to the optimal solution.

Agreed. So far what I've observed is that once you understand dynamic programming, memoization and tree traversal you're set to solve a lot of the leetcode-style questions. However every once in a while you encounter a question such as "add two binary numbers without using mathematical operators", which essentially expects you to XOR the two numbers and then determine the carry using a shift operator. I'd argue that this is similar to the Mount Fuji question.
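(For what it's worth, that trick looks something like this in Python - a sketch for non-negative integers only, since Python's unbounded ints make the negative case messier:)

```python
def add_no_arithmetic(a, b):
    """Add two non-negative ints using only bitwise operators.

    XOR is addition without carries; AND plus a left shift
    produces the carry bits. Loop until no carry remains.
    """
    while b:
        carry = (a & b) << 1  # bits where both operands are 1
        a ^= b                # sum, ignoring carries
        b = carry
    return a
```

If you've never seen the XOR/carry decomposition before, there's essentially no path to deriving it live, which is the Mount Fuji-like part.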


Hm. Okay … I consider things like “dynamic programming, memoization, and [graph] traversal” to be basic skills of a systems programmer. I didn’t realize people didn’t look for those skills to weed out the flerbs.


> If enough people do so, wages could be pushed down by employers. Thankfully for my mortgage, that hasn't happened yet and probably won't happen soon.

You know that already happened? Like 15 years ago was what it looked like before everyone started practicing; today most people are practicing. If they fail to take your job today, it is because they failed to pass the interviews even with extensive practice.


Right now engineering demand is far outstripping supply.

If the tides shift and big tech starts tightening their belt, leetcode practitioners will start competing 100 engineers for 10 positions at big tech, and that's when salaries get pushed down.

Most books and blogs on professional growth say to have some sort of talent or skill that makes an organization need you. That way, you aren't fungible and have leverage. Right now my "skill" that I demonstrate is leetcode and the only leverage I have is that big tech needs more engineers and have to bid against each other for my fungible work. The moment tides shift, myself and every other leetcode practitioner are basically fucked.

When will this happen? Ask a fortune teller.


There is a finite supply of STEM-minded people, and the extreme salaries in programming have already shifted many of them to programming. The job climate today is basically the same as 10 years ago: the interviews look the same, the salaries are a bit higher but around the same, etc. So as far as I know this is the stable point, where the people staying in other STEM jobs prefer those jobs over the money, and you won't attract more of them unless you pay even more than companies do now.


I think this is probably spot-on.

Anyone who has taught introductory CS courses at an average public college/university (so, not a huge R1 state flagship) can tell you that getting into programming is not easy. Plenty of students who I would describe as "smart enough" have an incredibly hard time learning how to program.

The bar is a lot higher than most people here seem to believe.


> If the tides shift and big tech starts tightening their belt, leetcode practitioners will start competing 100 engineers for 10 positions at big tech, and that's when salaries get pushed down.

it's already like this just for positions at random software companies paying $80-120k. the percentage of applicants to google that end up getting offers is estimated to be in the low single digits (if not below 1%). and frankly, I suspect big tech companies already hire more engineers than they "need", just to deny them to rising competition.

I don't think you're wrong about the moment that the tides shift, but it seems we are still quite far from that.


> And we tuned it from there to into its own sort of mini-SAT with an accompanying preparatory industry and culture.

Agreed - interview prep is a cottage industry.

It should be worth reflecting and discussing why that is. If tech interviews are not horribly flawed, why is there a small industry of paid prep resources for passing those interviews?


There are plenty of industries that have cropped up on convincing people they need something despite the fact that they probably don't need it.

For instance: SAT prep schools, private college counselors, etc. - there is little to no evidence these actually have a substantial impact, but people pay for it anyways.

In the adult world, try managed funds. Again, little-to-no evidence that you have any way of knowing ahead of time which of these will beat the market and which will underperform, but people still pay massive fees to get included in an "exclusive" investment fund.

This is a human phenomenon; it doesn't mean the selection criteria aren't working - just like the existence of managed funds doesn't mean the stock market isn't working.

The world is full of lots of people, and some proportion of those people will be marks.


> there is little to no evidence these actually have a substantial impact

Is that true?

https://slate.com/technology/2019/04/sat-prep-courses-do-the...

> Once scholars control for all these factors as best they can, they find that coaching has a positive but small effect: Perhaps 10 or 20 points in total on the SAT, mostly on the math section, according to careful work by Derek Briggs of the University of Colorado Boulder and Ben Domingue of Stanford University.

> A 2010 study led by Ohio State’s Claudia Buchmann and based on the National Education Longitudinal Study found that the use of books, videos, and computer software offers no boost whatsoever, while a private test prep course adds 30 points to your score on average, and the use of a private tutor adds 37.

Also, managed funds are not analogous to interview/test preparation in any way.


That demonstrates my point.

10 to 20 points on a test scored out of 2400 (at the time of the study) is not a substantial impact. I also think there are probably test-oriented lurking variables they are not able to capture.

Managed funds also sell you on the idea that you can buy your way into performance, and there is also little evidence that they actually work.


Read on to the conclusion:

> Let’s assume the effects of short-term coaching are really just a 20- or 30-point jump in students’ scores. That means they ought to be irrelevant to college admissions officers. Briggs found otherwise, however. Analyzing a 2008 survey conducted by the National Association for College Admission Counseling, he noted that one-third of respondents described a jump from 750 to 770 on the math portion of the SAT as having a significant effect on a student’s chances of admissions, and this was true among counselors at more and less selective schools alike. Even a minor score improvement for a high-achieving student, then—and one that falls within the standard measurement error for the test—can make a real difference.

> Compare that with students from families that earn less than $20,000 per year, who have a mean score on the math portion of the SAT around 450. According to the same admissions counselor survey, a 20-point improvement to a score in this range would have no practical meaning for students who are trying to get into more selective schools. Even when it comes to less selective schools, just 20 percent of the counselors said a boost from 430 to 450 would make a difference. In other words, research has debunked the myth that pricey test prep gives a major bump to students’ scores, but it’s also hinted that whatever modest bumps they do provide are more likely to help the people who are already at the top.

======

As for managed funds, they promise to sell performance within an environment composed of many, many people. Test prep and interview prep promise to improve performance for individuals. One is about mastery over an external environment, nature. The other is about internal improvement. I don't think promises of personal improvement through coaching, training, or teaching are comparable to promises of gains within a large, chaotic, unpredictable environment. They're entirely different categories of "buying your way into performance."


> he noted that one-third of respondents described a jump from 750 to 770 on the math portion of the SAT as having a significant effect on a student’s chances of admissions

But did the SAT prep really help people go from 750 to 770? I'd guess that most of that 20 point jump would be for people who start out at 500 or so, and going from 500 to 520 isn't terribly significant. Fixing the last points is much much harder than the middle, so I really doubt that people at that level would see the same absolute gain as people in the middle.


Agreed. Indeed, I would expect the "average" to mostly consist of large boosts for very low initial scorers rather than improvements among high-scorers.


So, in a poll about how their own industry works, 66% vs. 33% said that a 20 point difference (from 750 -> 770, which is not the standard score change from prep and the changes are lowest at the high end of the score range) would not make any significant difference in the chance that a student would have.

I don't follow the distinction between market as external environment and college admissions as internal environment. Could you elaborate?


> would not make any significant difference in the chance that a student would have

So a third is not statistically important?

> I don't follow the distinction between market as external environment and college admissions as internal environment. Could you elaborate?

It's a very simple point: fund managers claim to have mastery over a vast, complex external environment. Interview and test prep courses, like any sort of pedagogy, claim to teach and coach candidates towards self-improvement. The latter is self-evidently more achievable than the former. And even if you were to claim that SAT prep schools make no substantive improvement for students, what makes interview prep most analogous to those, as opposed to other forms of tutoring, or even personal fitness training, which do yield greater benefits?

I'm just saying it's a lot easier to teach someone how to be better at something than to beat the market. They are completely different endeavors.


> I get paid absurd sums because of supply and demand, not because of my skills

This is how everything works. You can get paid a lot for selling a stock that has appreciated (demand for it went up) without having any skills in investing, just luck. You can also be an incredibly talented musician or artist and get no money for it because there's no demand for classical musicians these days (I've seen this happen to every single one of my classical musician friends. They all have to go into teaching or change careers).


Sure, my point is just that (among top corporations), engineers have been largely reduced to commodity goods.

To tie it back into the subject of interviewing: If you are interviewing for a PostgreSQL expert, then it will be both harder to find someone who fits the bill and that person will have more leverage to negotiate a better salary. That person knows how few engineers (relative to the total pool) have worked on the PostgreSQL codebase and can explain the internal workings and implementation details and relate it to your projects.

In a hypothetical future where there's 100 engineers for every 10 commodity engineer positions, that PostgreSQL expert position will still be hard to fill.


> Sure, my point is just that (among top corporations), engineers have been largely reduced to commodity goods.

But that was always the case. Let's say you can save a company millions of dollars by being competent at SQL. Guess what? Any of the millions of people who are competent at SQL could also have saved that company millions of dollars. Why not just hire any one of those? You are part of a sea of programmers with similar skills to yourself; that is what you are selling and what companies are buying. So to set yourself apart from the sea of others you show you are smart enough or hard-working enough to pass coding interviews, since most of your peers who also know SQL can't pass those.


For a programmer or engineer, though, it's now a gamble to invest heavily in being an expert in a single technology. In medicine, experts/specialists can rely on large population pools and steady rates of accidents or genetic defects to supply customers for their specialty. You can specialize in a specific body part and become a dentist or ophthalmologist and be assured there will be sick people who need you and are willing to pay (unless of course we cure/eliminate the problem you solve, which rarely happens).

If you're in software and know the internal workings of Postgres, well, that's fine and dandy iff a business decides to use Postgres. Maybe they chose MariaDB or something else arbitrary instead. That highly specialized knowledge about Postgres is dependent on demand, as it always has been. The difference now is that for any given technology there seems to be a sea of competing options, so the risk of your expert time investment having no demand is quite high. The salaries may be high, but I'd argue they're not high enough for the time, effort, and risk of becoming a specific-technology expert anymore. I have expert-level knowledge in systems that now have almost no demand. It's a waste of my time to maintain them at this point; the risk and opportunity cost is far, far too high.

There was a time when you could become an expert or near-expert in the 1-3 competitors for a specific library or piece of infrastructure, and those skills were incredibly valuable, highly transferable, and in demand because the pool of competing tech was so small. Now I can't even keep track of the number of, say, database or database-like applications teams may choose to use. There's more diversity than there is demand to justify and sustain the level of diversity we now have, IMO.


"Preparatory industry" is a bit much. The only thing people really need or use for interview prep is Leetcode, which costs about as much as Netflix if you buy an annual subscription, and provides hundreds of hours of what should be at least somewhat fun to anyone who actually enjoys programming

If you wanna get fancy you can throw in a $59/month Coursera Plus sub for those big-name-university math and CS courses. There are some good ones that I think would help people who focus exclusively on LC improve faster and/or get a deeper understanding

Ok, so we are now talking about ~$900 over the course of a year. Even throwing in some a la carte courses from edX, like the Programming Languages sequence from Washington that OSSU recommends, you can now give yourself a thorough and marketable CS education in a couple years of hard work for under $3k. That doesn't sound like a predatory industry....that sounds like excellent progress towards equitable opportunity

P.S. And you can even be a cheapskate and just use the Sedgewick booksites and do the UCSD and Stanford discrete math and algos courses on Coursera and get your interview prep done for a grand total of like $200

P.P.S. The reason you don't need algos in your day to day is because you are not good enough at algos to get a funner, better paying, more impactful job where you do


> P.P.S. The reason you don't need algos in your day to day is because you are not good enough at algos to get a funner, better paying, more impactful job where you do

there are a comically high number of jobs at "prestigious" companies where this is absolutely not true


Ok, maybe not the "better paying" part. Should have put an "and/or" in there somewhere


Impactful how? If you want to believe you are not just a cog for some for-profit enterprise, but are doing impactful work, cool. I likely disagree, not as a diss on you. I generally see cogs as cogs, myself included. OTOH, your wording suggests you want to believe that you and others good enough at algos are better than the rest.


> "Preparatory industry" is a bit much.

I ran into a YT channel run by a guy currently employed at a FAANG as a hiring manager, selling courses on how to pass his own hiring interviews. With that, plus bootcamps, I think "preparatory industry" is a fair term.


> P.P.S. The reason you don't need algos in your day to day is because you are not good enough at algos to get a funner, better paying, more impactful job where you do

[citation needed]


indeed.com


I always find it strange the way people talk about the importance of Leetcode on here. I've been in the industry 7.5 years and I've never used or been asked to use leetcode or really any site like it. I've gotten the CS questions in interviews and some coding problems that are perhaps leetcode-like; but the latter are usually not super intense, either in difficulty or in grading harshness. My solutions never had to be perfect, they just wanted to watch me problem-solve. And most of the companies I've worked for would be called "trendy"; we're not talking about stodgy enterprise-type orgs.

I don't get it. Is it because I'm not pursuing FAANG? Is it because I'm in Austin instead of SF?


I don’t understand this view. Learning to “game the exam” (by which you mean pass) is the gateway to all careers, whether you’re a truck driver or a brain surgeon. What’s special about software?


> brain surgeon. What’s special about software?

A brain surgeon doesn't take the USMLE at every job interview.


Brain surgeons get sued, and there's a public paper trail when they do a bad job. They also often have data on all their performed surgeries that they can refer to. Software engineers, however, fail silently and nobody will know. Lawyers, the other big profession that is paid a lot, also have a lot of data they can use to prove that they deliver value.

Can you give a single profession that pays $200k+, isn't a manager job, doesn't have easily tracked metrics like doctor/lawyer, and doesn't have tests like the algorithm interview?


Actuaries?


If the suggestion is that we change our field to be more like the medical profession, I'll just say I'm thankful you're not in charge.


What if the accreditation for this industry is simply do the existing LC exam, except once you pass it you're good for 5-10 years until you renew your license? Instead of the current system of having to retake it every single time you change jobs.


> I find it as a reminder of my fungibility as an engineer: I get paid absurd sums because of supply and demand, not because of my skills; anybody who can game the leetcode exam (and the architectural diagram quiz and the personality quiz) can walk in and take the same 300k income I have. If enough people do so, wages could be pushed down by employers.

Entry is one part of the puzzle. The actual skill bit hopefully comes into play after you get in, at which point your skill(s) improve your chances to survive / thrive at these companies.


> anybody who can game the leetcode exam

The best CS class I ever took made all previous exams available to us. They also assured us that our exams would only include questions that have appeared before.

The only way to game that was to learn everything. Leetcode exams, especially at places like Google, are effectively the same thing.

This is good because it levels the playing field. Anyone motivated enough to learn the material can break into the career.


Domestically, there is a limited supply of very smart technically minded people.

It's sometimes possible to game the system by doing hundreds of LC, but it's still hard to beat people for whom it just clicks.


I think that LC style interviews are an attempt to solve for the fact that programming is a field where a few things happen to be true and broadly true:

1. Significant amounts of people cheated during college, and if asked probably wouldn't even identify their behavior as cheating.

2. Significant amounts of competent people don't even have a degree.

3. Many people with degrees are incapable of actually programming even basic algorithms.

4. There is incredibly high demand for this job because of its pay, leading many people into the market who have no actual interest in or aptitude for the work.

5. Significant amounts of people lie on their resume, or work for body shops that lie on their behalf with or without their knowledge.

These truths make it so that something like a degree is not a useful filter, and create a situation where every job opening has thousands of applicants, the majority of whom are not even remotely qualified. How do you rapidly filter applicants down to a number your engineers can interview without taking too much time away from producing value for the company?

Personally, like every other engineer, I hate LC-style interviews, but at the same time, the issues they are intended to solve seem intractable any other way. Fundamentally the LC-interview is intended to be an on-the-spot test of your abilities because nothing on your resume or in your past credentials is trustworthy.


I disagree with a lot of what you have to say, even though you have valid concerns which are shared by a lot of hiring managers I talk to.

> 1. Significant amounts of people cheated during college, and if asked probably wouldn't even identify their behavior as cheating.

Unless you have some real data backing this up, you can't claim this is true.

> 2. Significant amounts of competent people don't even have a degree.

Don't disagree here, though whether leetcode proves competency for day-to-day software engineering (which changes given the type of job you are interviewing for) is still debatable.

> 3. Many people with degrees are incapable of actually programming even basic algorithms.

I have seen this phenomenon and it is a valid concern. However, you can still test for basic algorithms without resorting to leetcode hard questions on phone screens, which happens often (I am speaking from experience).

> 4. There is an incredibly high demand for this job because of its pay, leading many people to pursue this job in the market that have no actual interest or aptitude for the work.

Agreed, sort of. It's supply and demand, where the supply is artificially kept low at the interview stage by eliminating too many candidates who can't solve leetcode hards. Many of those candidates could be adequate, especially with minimal training.

A lot of your concerns are first-order, surface level, but the solutions to the problem of tech interviews are a couple of layers deeper in the stack. Leetcode is a monkey patch; an adequate one, but it has its own set of problems that get ignored. That's why we keep seeing discussions about leetcode on HN.


> > 1. Significant amounts of people cheated during college, and if asked probably wouldn't even identify their behavior as cheating.

> Unless you have some real data backing this up, you can't claim this is true.

Universities have no reason to crack down on cheating; students who cheat pay just as much money as those who don't. If anything, more cheating is better for them, since cheaters are less likely to drop out and stop paying. The only cost is reputation, but I have never seen a university get called out or cancelled over not catching enough cheaters, so even the reputation cost is tiny.

I have personally seen a lot of cheaters, I have heard people talk about cheaters, etc. Cheating might not be common at every single university, but it is definitely common across higher education as a whole. And of course none of them will ever try to track that data, since it doesn't benefit them anyway. As long as they don't track it they have plausible deniability.


On the cheating point, I think casual academic cheating is a subset of behaviors of "making the grade without actually learning", which ostensibly leads to credential inflation. Cramming material without retention is one example. One can cheat oneself out of learning without actually cheating.


>> 1. Significant amounts of people cheated during college, and if asked probably wouldn't even identify their behavior as cheating.

>Unless you have some real data backing this up, you can't claim this is true.

As a nitpick, you should be able to claim whatever you want. But besides that... from my experience in college (and as a TA for an intro class) I tend to agree with the sentiment that a significant amount of people cheat. It doesn't mean MOST people cheat, it's just "significant". I remember one week where 30 of our students (10-20% of the class) pasted an answer from Stack Overflow word-for-word. One student went so far as to put a sheet of looseleaf over their friend's paper, and trace every symbol by hand.


What company gave you a LC hard on a phone screen and what was the exact question? To take a quote from your book, “Unless you have some real data backing this up, you can't claim this is true.”

What many people call an "LC hard" is really a fairly trivial DFS or DP problem which, with proper preparation, shouldn't be much of a problem, and is more likely an easy or medium.


> Unless you have some real data backing this up, you can't claim this is true.

You'll never get real data on this because the people doing it and not getting caught won't admit to it. You just had to listen carefully to what your friends-of-friends were talking about at parties to know what they were up to.


> Unless you have some real data backing this up, you can't claim this is true.

* Pick any undergrad CS program

* Pick any course at that program

* Do a Google search for “<coursenumber> project site:GitHub.com”

I never cheated in school and what it got me was a middling GPA, less time to drink and network, and job prospects that pay roughly half of what I could be making.

If you’re in school, learn from my mistakes: you’re there to get a degree and a high gpa, not to learn. You should absolutely cheat as much as possible, just don’t get caught.


LC stands for LeetCode → https://leetcode.com/ in case anyone is wondering. I find it funny how the parent comment is so long but didn’t want to include the full name of the website haha


I just finished my interview process with a couple FAANGs and a few other companies, and was a little surprised by how "memorizable" it was, depending on the company. I'm not talking about the topics but rather the process -- ie you might follow a framework for coding interviews like 1) ask clarifying questions, 2) run thru an example, 3) state brute force, 4) solve it optimally (hopefully), etc.

It felt like some sort of bougie "fashion" (or hazing ritual like everyone likes to call it) that candidates needed to follow in order to land the job. Even for system design I felt the same trend was there.


Also recently went through a bunch of tech interviews, and had a similar experience. Felt the exact same as doing consulting case interviews in college. I have my script, I recite the relevant variation for the problem, and I pass the interview.

It’s great for me that I can almost always pass, but I do feel bad for people who never had a chance to just learn the magic words.


This looks great for leetcode, it’s a bit ruined by my considering that a terrible interview technique in the first place.

It can still be good for practice though, it does seem to cover all bases.


> terrible interview technique in the first place.

I'm really not a big fan of whiteboard tests, but I have yet to find an initial filtering method that is as quick and cheap as 30 minutes with coderpad or whatever.


Do you find that it’s a good filter as well as being quick and cheap? Couldn’t tell if this was meant to be ironic.


As a test of bare minimum coding skills, definitely. You wouldn't believe the number of people who apply for junior software engineering positions yet can't do an easy leetcode problem. That's like a medical intern not knowing how to install an IV.

I don't go further than that (no complexity questions, no silly google-style "let's pretend you haven't learnt this by heart" knapsack problems). And I don't care about syntax. It's just a way to know who is a developer and who is a stackoverflow cut-and-paster


I am surprised this needs to be said but the medical intern analogy is terrible and emblematic of why this style of interview is terrible. Medical interns will likely be doing things like installing an IV on a daily basis. It is a great work sample task. Your garden variety web developer will not be building algorithms from scratch virtually ever. This is not a valid work sample task.


>Your garden variety web developer will not be building algorithms from scratch virtually ever.

Web developers like this are those that introduce O(n^2) complexity in places where it could be done in O(n), which makes for poor application performance. Sure, they aren't writing algorithms per se, but their lack of knowledge of algorithms still tends to create problems not-that-rarely (assuming the product is above certain size and complexity).
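To illustrate the pattern (a hypothetical sketch, not anyone's actual code): the same de-duplication task written with a list membership check versus a set. Both return the same answer, but the first re-scans the list for every element.

```python
# Accidentally O(n^2): `x not in seen` scans the whole list for every item.
def dedup_quadratic(items):
    seen, out = [], []
    for x in items:
        if x not in seen:  # O(n) scan per item -> O(n^2) overall
            seen.append(x)
            out.append(x)
    return out

# O(n): a set gives average O(1) membership checks.
def dedup_linear(items):
    seen, out = set(), []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out
```

On small inputs the two are indistinguishable, which is exactly why the quadratic version survives code review until the data grows.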

Also, what constitutes a "basic algorithm" is pretty vague. I've seen quite a few interviewees who couldn't solve fizzbuzz. Until that moment, every time I heard about people not being able to solve fizzbuzz, I thought it was an exaggerated joke to describe someone incompetent, but nope, plenty of people legitimately can't solve it.

I don't think fizzbuzz could even be legitimately called an "algorithmic" problem, but no matter what, if you cannot just do a trivial for loop and a couple of if-elif-else checks, you are not qualified, algorithms or not. And there is no quicker way to filter those people out at the initial stage than a simple whiteboard problem.
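For reference, the entirety of fizzbuzz really is just a loop and an if/elif/else chain. One common Python version:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The only "trick" is checking the combined case first, which is the level of difficulty being discussed here.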


Remember the selection bias in observing who fails fizzbuzz. Those are the people who are repeatedly interviewing because they can't fizzbuzz. Those who can will get a job and remove themselves from the interview pool. Fizzbuzz-failers are a much greater share of interview candidates than of all programmers.


Agreed, this is an important bias to remember. I was just impressed that those people actually existed at all in the numbers beyond just a rounding error, because I used to think it was just an exaggeration, kind of like saying "he can barely figure out 2+2". Reality ended up being much more grim. I wouldn't be as baffled by tons of people failing fizzbuzz, if it wasn't for most of them having actually pretty legit-looking resumes though.

Like, you would have a great programmer who aces all the interviews but whose resume is just decent, while people failing to solve fizzbuzz might have pretty impressive-looking resumes at times. Which is why, unfortunately, leetcode feels like a necessary evil. We live in a world where even the most incompetent people might have enough years of experience skating by to craft a top-tier-looking resume. It's way too easy to BS your way in if you rely mainly on resumes, and firing people later (as opposed to not hiring them in the first place) is just too expensive in every single aspect (from money to opportunity cost to team morale).


The questions asked in junior interviews are truly trivial operations which any engineer should be able to understand. A common one might be to loop over an array and build a hash map counting object occurrences. Another one I see is iterating over a log file and printing only certain types of messages.

Working with simple data structures isn’t even writing algorithms, it’s quite literally the building blocks of programming.
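Both of the tasks mentioned above fit in a few lines each. Sketches (function and parameter names are mine, purely illustrative):

```python
def count_occurrences(items):
    """Loop over a sequence and build a map from item -> occurrence count."""
    counts = {}
    for item in items:
        counts[item] = counts.get(item, 0) + 1
    return counts

def filter_log_lines(lines, level="ERROR"):
    """Keep only log lines tagged with the given level at the start."""
    return [line for line in lines if line.startswith(level)]
```

(In real Python you would reach for `collections.Counter` for the first one, but in an interview the manual dict version shows the same understanding.)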


Depends on what you call an algorithm, of course. As GP mentioned bare minimum coding skills, I assume he uses really simple ones. This can, of course be relative, but maybe he was intentional with the analogy.

Because you can't really code without writing algorithms. Yes, you will never write a quicksort or implement a red-black tree, but you do have to understand problems, come up with some kind of solution, and then translate that into code. I.e. come up with an algorithm.

I used to use two simple problems, both involving math. One of them is a bit harder, and the ideal solution assumes you remember some high school math (not very complicated, but something you may have forgotten, or at least something you won't think should be used there). But it's nice, because you can come up with more and more optimal solutions, and the naive one (which is O(n^2), IIRC) really just shows that you managed to understand the problem. I don't use that one anymore though. I almost exclusively use the one that I thought was embarrassingly simple and which I used to use as a "warm up" exercise. It's not more complicated than fizzbuzz (which I hadn't heard of at the time I picked it).

I'm not telling you here what it is, but it indeed does weed out people who can't really think through a simple problem and then can't think of a solution that consists of several steps following each other. (I.e. an algorithm.)

Now I'm not interviewing 'top' developers, just your average web devs most of the time. Another thing I'm doing in the past few years is code review. I have 2-3 pieces of badly written code that I have reviewed and/or rewritten.


I think the problem with your thinking, and something that I've been trying to figure out as I sit in on/direct more interviews myself, is that this doesn't actually tell you that the candidate can't code, just that they couldn't code in that setting, with that problem, framed the way you chose for the interview.

Meaning, I often sit with candidates who come in with 3-5 years of industry experience who can't do a relatively simple programming task. That can either mean that we consistently draw applicants who have been pulling the wool over their past employers' eyes for half a decade but actually can't program. Or it could mean that they get incredibly nervous coding in front of strangers, aren't used to doing more abstract problems, and once they realize this, experience a "snowball effect" of nerves and train-wreck in the interview.


Sure, in theory it can mean that some of them are just super nervous. But, of course I do help them if I see that they don't get anywhere. I don't think that there are too many people who will just 100% shut down without showing any sign of actual nervousness.

I didn't mention how long these guys had been working at other places, but it's completely possible that they were doing a job where they could just get along somehow. E.g. the last guy who wasn't able to get anywhere with my puzzle was a candidate from an agency for one of my clients. I think he was some junior guy, but I don't know how long he had been earning money as a developer. Obviously, the agency wanted to sell him.

Besides not being able to solve the coding problem, which I told him was not a big issue (so as not to make him more nervous), he didn't find any issues in the sample code in the code review task. (And it has about 10 issues in 15 lines.) So yeah, some of these people may have just been copy&paste programming, they may have been tinkering with larger existing projects here and there (I don't know, maybe customizing CMSes), or they are complete beginners who haven't yet faced the fact that they can't write code.

After all, you have to prove somehow that you can write code if you want an employer to hire you and pay you for writing code. Sure, it could be a take-home challenge too; some companies do that.

Once, out of curiosity and because they were pushing it, I interviewed at a company I didn't want to work for, for a role I wasn't interested in (embedded C++ developer). They had a quiz about C++ (like what is a constructor, what does this piece of code print and why, etc.) which they said I aced. Then they gave me a laptop with Visual Studio and a simple task (something like writing a text-wrapping algorithm in an hour). Not having any recent C++ experience (other than on Symbian, which used a completely different std lib) and having never used Visual Studio, I wasn't able to finish. But they also had a more complex take-home task. That could work for nervous types.

It was an interesting experience, but I believe being able to communicate about the task is much more important than being able to type out the code on your own without any help. Well, at least for most organizations.


Should have been more precise. When I say “easy algorithm” I mean fizzbuzz or reverse the words on a page or something equally basic.


Yes it's definitely good for filtering out the people who are very bad.


For a certain definition of bad, for sure.


For any reasonable definition of bad. Unless you're happy unwittingly hiring people who can't even write a for loop.


Not the OP, but absolutely yes to all those things, except being ironic.


A pattern I've noticed in the comments is how leetcode "commoditizes" engineers. To which some counter with an argument against this idea. It's clear that companies seek engineers from all around the world to help them in their business. Because they're casting such a wide net, some kind of cheap, quick filter is required to sort out a large segment of the population.

What if the initial filter were physical proximity to the company headquarters? What if companies looked for engineers first in their surrounding areas, and provided opportunities even to those without the skills initially listed in the job posting? My hunch is that, if a person is dedicated, he/she will work hard and acquire the skills required for the job. The surrounding communities will benefit from the gains made by the business, and leetcode-like tests would be unnecessary for the majority of job postings.

Curious to hear what others think about this "local-first" approach.


I know of a F500 company that takes this approach in IT. They’re constantly understaffed, despite being a legit good place to work, with fair compensation.

People tend to leave once they’re trained - it’s pretty natural, given they’re smart, motivated engineers and the company just doesn’t need to solve very interesting tech problems.

The company does it for effectively altruistic reasons - they want to help bring tech talent to their declining city.


Software companies tend to be very scalable because they can get their customers from anywhere in the world. Unless the company was itself hyper local (which I’m sure exists but I’m struggling to think of an example) why would they handicap themselves for little benefit?


Why should people with geographic proximity be disproportionately granted opportunity?

Like I fail to see the motivation for doing this outside of base nativism/tribalism.


Why did you link the HTML view? Now we can't clone it.

https://docs.google.com/spreadsheets/d/1gy9cmPwNhZvola7kqnfY... for the lazy


i already went to college and worked hard to get my CS degree. i refuse to do these interviews, and everyone else should refuse too. it is the only way to stop this.


If you think holding a CS degree in any way correlates with being able to function as a professional developer... Boy have I got news for you. Companies conduct these interviews because holding a CS degree is almost worthless as an indicator of ability.

I suggest you sit in on some of these interviews if you get the chance, you will very quickly see what an absolute train wreck they are when you get someone with the right paper qualifications who just doesn't know how to code at all.


> ...what an absolute train wreck...

I am sure they can be, but there are many accounts of talented people who initially bombed these things in a big embarrassing way, NOT because they lack ability, but because they never practiced for the test and didn't know what to expect (yes it happens).

This can happen to folks who learned on the job until they outgrew their job and decided to leave. They suddenly find themselves doing an "obstacle course" technical interview gauntlet that focuses on things that they just haven't had to think about for a long time (or ever). I believe talented enough folks can get through it if they can focus and study for a while (and accept the humiliation of failing big at first). It's not an infinity of subject matter, nor is it rocket science. I think if someone loves the work enough they can get themselves through it.


exactly this. and practicing for a few days is not enough. if someone who is experienced and can write software needs to spend months training for this, the interviews are broken.

but, i disagree with the solution you suggest. the only solution is to refuse and instead of wasting 3 months training with no guarantees, build a product instead and sell it.


> but, i disagree with the solution you suggest. the only solution is to refuse and instead of wasting 3 months training with no guarantees, build a product instead and sell it.

Because being a software engineer means running your own business… right.

Just like surgeons should start their own hospital before getting hired. Makes total sense…


surgeons are not dealing with this problem. they get their degree, go through some mandatory on the job training, then get hired. nobody ever asks them later on to do some "surgeon leetcode" to prove they can do their job.


Hopefully:

- clearly incompetent surgeons never graduate

- any surgeon who can’t do the equivalent of basic “surgeon leetcode” would be painfully obvious to their colleagues on the job and would struggle to continue employment as a surgeon


Getting that degree and the mandatory training is FAR more work than a few months of leetcode grinding. The leetcode thing really, really sucks, but it is light years easier and less tedious than the hoops you need to jump through to become a physician, let alone a surgeon. A fairer comparison would be if literally every programming job required you to pass multiple algorithm tests, some of which took years to study for, before you could even apply.


> instead of wasting 3 months training with no guarantees, build a product instead and sell it.

I totally sympathize with your sentiments, but building a product and selling it ALSO has NO guarantees.

These folks that "learned on the job" often were hired into completely different roles from SWE but ended up doing it anyway and they're not making anything close to ~300K FAANG money. So, it's not THAT much of a stretch to blow a good number of months grinding leetcode if a job upgrade is a possibility.

Ultimately, it's a matter of finding the right place. Often that means using your professional network and getting referrals. If the gauntlet can be bypassed through connections, that's a good thing for everyone.


after building a product, you will have generated IP that you own. after doing leetcode for a few months, you have gained nothing.


maybe where you are from CS degrees mean nothing. in western europe you don’t get a CS degree unless you actually have both the academic chops and can build working software. i completed many projects while still at the uni.

also, no other white collar job has these sorts of ridiculous demands. and no, engineering does not pay disproportionately well.

finally, most of these interviews test if you solve a very specific programming problem in under a minute. the only way to succeed is by having already solved that particular problem before. so the interviews don’t test if you are a good computer scientist. they test how far you are willing to go to get the job.


Unfortunately for you I live, work, and do tech hiring in Western Europe, so I'm just going to go straight ahead and call you out on your weird geo-superiority bs.

Western European grads are no better or worse than elsewhere. Tech jobs do pay disproportionately well (not as much as the US but still well), and I assure you other white-collar fields such as law, finance and medicine have plenty of ridiculous demands and barriers to entry - often much worse than the ones we see in tech.

Honestly if you are as good as you say you are you should be happy to just walk into any tech company you like, spend a couple hours showing off, and be set for life.


if you think you can walk into any tech company, show off for a few hours, and be set for the rest of your life, you must be nuts.


What is your objection here?


> most of these interviews test if you solve a very specific programming problem in under a minute

You’re supposed to be solving it as you go, most of the time while interacting with your interviewer. Certainly not “in under a minute” unless your concept of “leetcode interview” is some kind of rapid-fire “I tell you the problem you describe a sketch of the solution as quickly as possible”.

I will also echo the other commenter’s opinion regarding “CS in Western Europe”. Not only do you have people graduating who can’t build working software, you also have a wide range of skill levels such that some companies want to hire from the top of the distribution and not just “any CS grad who can build software”.


What we should/shouldn't do doesn't matter. The fact is that because CS has exploded in the past decade, and because there are hundreds of applicants to most jobs, there needs to be some sort of programming litmus test to thin the herd. That's why I'm OK with FAANG companies having multiple rounds of interviews; they need to because so many people want to work there. What is the alternative?

The problem is when some dogshit startup that got funding 2 weeks ago copies Google's process. That I don't know how we can fix.


you can thin the herd by demanding prior experience and actual software built and ask to see some source code and discuss that. much better and more fair to those who are experienced.


Not knowing the program you graduated from, I have zero faith that you have any idea whatsoever how to actually be a productive member of a development team as a result of that degree.

If you refused to answer technical questions during an interview because you believed your degree should be enough, you would not get a job from me, period. I don't know if that's true universally, but I expect it to be nearly so.


good for you. your mindset is only working to the advantage of your employer (maybe i should say pimp). no one else in our industry.

in our program we had to work in team on projects besides doing our own. maybe you went to a shitty program.


I am the employer, and I didn’t go to any program.

But more importantly, if you think a group project is anything at all reflective of daily software development, you’re exactly the candidate I’m trying to put under a microscope.

Honestly, your downright terrible attitude would disqualify you well before any technical questions.


good, go to a program, then come back and we can have a discussion.


Do you tell the interviewer that when they ask you if you can work in a team? They can't know the quality of your education or the merits of your experience, and you can't demand that they go out and look at exactly how the courses you took are structured or how those groups are evaluated or run.

It doesn't matter if the program you personally took was good preparation; if most programs suck, then people will assume that your program also sucked, since you have no other way to separate yourself from those who went to the sucky programs.


but you do. if you went to a decent program you will have built software that you can show, and show the source code for and discuss it. no need for any leetcode.


I wouldn't waste my time.


You worked hard but the rich kid next to you in class paid someone to do their homework and steal test keys and maybe even bribed a professor or two. Companies need to filter them out.


you can just ask for projects they worked on and look at their code and discuss that.

also, you can’t bribe anyone at universities where i’m from.


Or you can ask them to code up some algorithms. The companies paying the best choose to do this, and they grew and became much larger than the companies that didn't do it this way, so now most companies do it this way.


leetcode BS interviews are a quite recent fad.


Yeah, if you’re old enough to consider Microsoft doing them in the 90s a recent fad, sure.


you'd be shocked how much bullshit goes on at universities across the world, even in the EU.


So do you think people who haven't done a CS degree shouldn't be allowed to code? Or should we make them do whiteboard questions but not anyone who did a CS degree? What about people that got a 3rd from a poor university?

And what if we need practical programming experience, not just algorithmic CS stuff? Can we ask you questions about that?


if you can graduate a CS program without hands on experience your program is broken


You can definitely graduate from CS and be a poor programmer; hell, you can even work a decade in the industry and be a poor programmer, also known as "10 * 1 year of experience".


Obligatory quote (attributed to Dijkstra, but see [1]): "Computer science is no more about computers than astronomy is about telescopes."

It's possible to be a good computer scientist without being a good programmer, and vice versa.

[1] https://quoteinvestigator.com/2021/04/02/computer-science/


I will happily do them so that I don't have to compete on wages with people who just got a degree in college and feel like the salary is now owed to them.

I enjoy LC, I'm good at it, and I find most of the people complaining about it to be the same sort of people as parents who would complain about how their kid "was very smart, but just not a good test-taker" if only those elite colleges would just take notice.


LC is too easily gamed and I’ve been burned by candidates that did great on those style questions.

Your comparison is actually perfect - elite colleges pass on bright students all the time and often just filter for those willing to play the game.


I really don't think you can neglect the large number of people who think they are very bright and effective and would be great for this role, but actually aren't.

Based on my experience, this is a solid proportion of nerdy men (I say as one of them). When I was studying physics, all sorts of people would want to talk about their interest in physics and quantum mechanics with me but they "were only good with the concepts, not the math." Any brief conversation would reveal they didn't have the damnedest idea what was going on.

My suspicion is that these are the people driving the anti-leetcode discourse.

Top unis absolutely pass on bright students all the time, you are right.


I like leetcode and I'm good at it, but after a certain point grinding leetcode doesn't make you a better software "engineer"; it just makes you better at taking tests. Now, I believe tests have a purpose in the interview process, and I'm not opposed to having some leetcode in interviews. But I've noticed the pattern of leetcode-type problems getting harder and harder because people keep grinding and grinding. There are some problems whose solutions are so absurdly clever that unless you are a genius, or you've studied the solution before, you would never get close to them in an interview. And I'm leaning towards most people not being geniuses.


In one of the first interviews I conducted (for my first ever startup), I interviewed a guy who had had enough of his corporate job. He was a Java architect, studying for some of those SC... whatever exams. He was also a graduate of my alma mater. Same university, same faculty. I thought he would be great.

I was new to this, but had looked up two coding quizzes. The first one was really simple; I found it a bit embarrassing during the first few interviews. That interview, with this guy, was when I decided I'd never again tell candidates that I'll give them two coding challenges and that the first one is really simple. Because the dude wasn't able to solve it at all. And when I say simple, I mean simple. My co-founder, who sat in on all those interviews and who really had nothing to do with programming, already understood the problem and solution after a few rounds of interviews.

So while I do think that getting a CS (or similar) degree does indeed give you a lot, it's by no means proof that someone can actually write code.


I don't care about your degree, I care if you can solve problems. If you can't solve a simple leetcode problem, why should I expect you to solve anything else?


This is a self-defeating attitude.

It sucks sometimes, but we have to deal with the world as it actually is, not how we wish it to be.

When it comes down to it, the universe doesn’t give a shit about our opinions :)


With your attitude everyone would still be shoveling shit somewhere in a coal mine.


I may not have communicated as clearly as I could have.

I’m not saying mindlessly accept a shitty deal. I’m saying that wishful thinking such as “it shouldn’t be this way and I’m going to pretend it isn’t” is not very helpful.

If you can turn such thinking into a useful action that leads toward the outcome you want or a general improvement, then that is great. But that isn’t always easy to do.

Sometimes you have to deal with bullshit because the result is worth it. Or choose another path entirely, realizing the trade off you are making.


The "talk about a project" is the coal mine though, what we have now is the evolution of that.


Performative programming comes full circle.


I really really liked Interviewing.io for practice.

You take a mock interview, whiteboarding and all. I'm actually very, very bad at these interviews. That said, I've also been on the other side of the table.

One candidate didn't actually know how to program. I had to argue with almost everyone else in the company to not hire this person, and on top of that I got reprimanded for being condescending.

Don't put a programming language on your resume if you've never actually used it.

I do wish American employment law were a bit more flexible here; I'd prefer to give someone a small paid assignment and hire them if they complete it well. Right now you need to decide whether you're going to hire someone within about 30 minutes of meeting them.


As someone who has written about interview prep (mlengineer.io), I encounter people who hate this LC nonsense quite often. A few things come to mind.

1. When you apply to a STEM graduate program, you also need to take the GRE. Clearly you don't need those absurd GRE words in your studies, but I do see some value in people who dedicate their time to learning the GRE material. Companies might see it the same way: they want to hire grinders.

2. Some companies like Intuit and Airbnb try different methods, for example giving candidates 90 minutes to solve one particular engineering problem related to the real world. After a while, people share the problem sets and everyone who has prepared can crush them easily.

3. Some of my ex-coworkers are competent and have good SWE skills but never bothered with LC. As a result they've stuck with underpaid jobs for years.

I strongly believe that in the near future more companies will adopt different interview structures to evaluate candidates. The current structure heavily rewards grinders.


I think it is focusing on the wrong parts.

My framework for the coding part of the interview is the following:

* Does the candidate tackle the problem analytically? Thinking analytically is critical to scaling from solving small problems to solving large ones.

* Can the candidate predict what their code is going to do before they execute it? Repeatedly executing code to see if it works may seem innocuous, but it is really a huge defect in a candidate's process, one that makes it very difficult or impossible for them to write anything non-trivial reliably.

* Can the candidate recognize and make use of advice? Surprisingly, a lot of people can't recognize when they are being given advice, even when I hand them the solution to their problem directly. It is really disheartening to have a candidate dismiss your advice even when it's repeated multiple times. I frequently give advice to unstick the candidate, because seeing them work on the problem is more important to me than seeing whether they can solve any particular part of it.

* Can the candidate recognize choices/tradeoffs that impact performance and/or simplicity of the solution?

* Can I have a productive discussion with the candidate about the design of their solution? Can the candidate converse in clear and precise language?

* Do they care that the resulting code is readable?

* Can they diagnose faults in their code effectively? Assuming they executed their solution and it is not doing what was expected, I like to see the candidate follow some kind of analytical process to find out what is causing the problem. I even design the task specifically so that they are very likely to make one particular mistake, just to give me that opportunity.

**

What I specifically ignore in the coding task:

* Whether the candidate knows the standard library. I don't care. During normal work they can look it up on Google. In the interview I tell them that if they can describe the function, I will tell them its name and how to use it.

* Whether the candidate can have flashes of insight. I don't get so many candidates that I can waste them on that kind of high-false-negative signal. I have identified the parts of the task that require a flash of insight; I give the candidate a minute or two when they reach one, and if I don't see progress I give them progressively less subtle nudges.

* Whether the candidate can do TDD/DDD/etc. It all depends on their experience. If they worked at a non-TDD or non-DDD shop for a long time, it is going to take some time to adjust.

* Whether the candidate can solve a very complex problem. The interview is already a stressful enough experience for the candidate. I can learn what I need from a simple problem, and I care much more about seeing the candidate's process than their results. The problem is just hard enough to be out of reach for most people at the very beginning and to require some actual thinking. Some people get it instantly, and then I am a little disappointed because I don't get to see them working on it.


Not to say that I disagree, but it's worth noting that your criteria push the onus of the interview away from purely objective metrics into territory that overlaps with communication and rapport.

Again, I don't necessarily disagree with that, but I think it's always useful to consider how much of my interview criteria can be reduced to "do I vibe with the candidate" and make sure there's more to my interview than just that. (There IS a place for it, though; the world isn't an objective metric.)


"Do we get along" is, in many cases, more important than hiring the best programmer possible. Once a team exists, the team dynamic is much more important than any one individual's abilities. (And, if there is a need to work with someone difficult, then a lot depends on the leader's ability to isolate and direct the difficult person.)


Relying too much on "do we vibe" is how you get an EEOC lawsuit, and rightfully so.


There is no objective way to assess a candidate in an interview, just as there is no objective way to specify and measure what a good employee is.

What I try to do is learn specific things about the candidate, not because they can be measured easily or objectively, but because I think they are important.

I am not "vibing" with the candidate (although that is definitely a plus!). I have a checklist of things, and by the end of the interview I try to make sure I can run through my checklist and have an answer for each point on it.


> Can I have a productive discussion with the candidate about the design of their solution? Can the candidate converse in clear and precise language?

> Can the candidate recognize and make use of advice?

I must admit that following those two criteria helped me make the most effective hires.

I used to ask a question that was mildly confusing, but only to the kind of people I was trying to screen out. The people who I was trying to hire would usually get the question right away.

Basically, in one of my questions I had developers use a lesser-known threading API in C#. The people I hired either got the answer right away, or could quickly adapt with my advice and answer the question.

Another interesting anecdote: I used to ask candidates to discuss the tradeoffs of the XML APIs built into .Net. The point was to discuss tradeoffs and to demonstrate more than superficial knowledge of serialization. (Most .Net developers have some exposure to its XML APIs at some point.) One candidate quickly launched into a "JSON is better than XML" tirade. I ended the interview quickly.


That candidate probably tells a version of this story where they dodged a bullet by avoiding a job where they would spend all their time serialising XML.


Please tell me there was more to that last story? Having someone come in and have a strong opinion on a topic, and have the gumption to state so in an interview, is exactly what I would look for.

Did you poke them for more? Did you ask how JSON compares to XML? If their only answer was "Well, I've never used XML, I was trained on JSON and won't ever use XML, and I don't know any of the downsides of XML," then you were correct to pass on them. But if they responded "Because of reasons X, Y, and Z, JSON is better than XML, and all XML interfaces are equivalent in this way, so I can't really answer your question: while I know enough about XML to know that it's bad, I can't tell you exactly the difference between the interfaces, since I've avoided them," that would be an answer I would accept.

> got the answer right away

This is the major critique of asking these interview questions. There's a small subset of them, so candidates train on exactly them, and only know how to do these, and when you hire them they won't be able to actually code anything. Using ML terms, they're 'overfit'. Getting it right away is actually a bad thing, but then you get candidates who act like they are going through the process just to look like they are learning on the spot... it's a tough problem.


Regarding JSON vs XML, I just pointed out that this was an interview question about discussing engineering trade-offs. The purpose of the question was to ensure that the candidate could understand and evaluate trade-offs without making snap judgements.

The candidate was incapable of doing so.

For the threading question, the reason why I chose an unusual API was for your exact reason: You'll never see it on a list of interview study questions.


>* Can the candidate predict what his code is going to do before he executes it? Repetitively executing code to see if it works may seem innocuous, but really is a huge defect in candidate's process which makes it really difficult for them to write anything non trivial to be reliable.

I'm going to disagree here. Testing your code in small chunks can not only catch small errors before they propagate into something larger, but also give you insight into whether you are going down the wrong path.


If you write a small piece of code and can't tell if it is going to work or not you don't really understand what it does.

We are not talking about external dependencies or frameworks. I am talking about simple code that you just wrote in its entirety, maybe a couple of loops and conditionals.

If you can't predict what that small piece of code does, exactly, you are thoroughly unable to write reliable code, by definition.

Reliable code == code that does exactly what you intend it to do, nothing less, nothing more.

If you can't predict what your code does you are writing buggy code. Experimentation can uncover some of the bugs, but if you start with buggy code the experimentation will never remove all bugs -- only the ones that are easy to detect.


I think different companies view this differently. I recently did an interview with a social media company (not Facebook, one of the other ones) where I wrote up a solution to the problem that "just worked" in one go, as did my extensions. After the interview, when I asked for feedback, the interviewer told me that I should have been running and testing my code as we went along. However, I've also heard in mentoring sessions from engineers at other companies that the goal in interviews is to write code that works the first time without having to test along the way.

In general, I prefer the approach of writing out the entire solution at one go, and trusting that bugs will be easy to fix at the end if the code is clean enough to understand what is happening at every step. This is in part because there usually isn't enough time in interviews for running tests at every crux moment. However, I think our current virtual environment may make it easier to test for (and to utilize) these types of programming best practices, since instead of writing code on the whiteboard, candidates and companies can use one of the interview interfaces that allows running code as you write it.


> After the interview, when I asked for feedback, the interviewer told me that I should have been running and testing my code as we went along.

Had they specified that you should be testing your code? If they hadn't, then it is wrong to expect some arbitrary outcome.

People write code in many different ways, some do it very chaotically, some have very organized process.

I try to figure out which of these behaviors are fundamental to writing good code and which are not.

For example, I believe the habit of writing readable code IS fundamental to writing good code. If you don't have a habit of writing readable code, on average your code is going to be worse than if you did.

On the other hand, I believe writing unit tests IS NOT fundamental. Unit testing is one possible way of working with code, but people were writing reliable code before unit testing became popular. Unit testing takes time and effort that can be allocated differently.

And so I try to look (I can't call it measuring because it is subjective) only at the behaviors that I believe are fundamental and dismiss others.

In the end, if you are a smart person who can program well, you will be able to learn those other behaviors. But if you are not smart, or you lack certain fundamental abilities, that is going to be much more difficult to fix.


I do an interview in which we do TDD on a problem together. So we are frequently running tests during the interview. However, one of the things that's expected from the candidate is that they will be able to predict what happens when we run tests. Will the tests pass? Will an exception be thrown? How will it fail?

Maybe, as so often, the difference in this discussion is a quantitative one, not one of principle.


Running tests frequently is not a problem. Not knowing what the code will do is.


Testing code as you go along is not experimentation. It saves time by allowing you to verify your assumptions are correct in a controlled way.

You can sit there and plan all you want until you think you understand it, but you never actually know until you run the code. Being a good programmer is all about simplifying complexity. Testing your code as you go along to verify your assumptions is one more way to do this.


It doesn't matter what you call it. If you know the code works, then there is no benefit to running it. If you need to execute it, it is only because you don't know whether it works correctly.

Sometimes this is fine. I do it when I call external services, because I may not be sure exactly how they work and I want to correct my mistakes before I do too much design based on that flawed knowledge.

But when it comes to your own code, you should be able to tell whether it works correctly. If you can't, there are a couple of ways to deal with the problem that I have adopted over the years:

-- techniques that make it easier to ensure code works correctly frequently also improve it in other ways. For example, making code simpler is important for more than just reliability.

-- specifying your internal module boundaries and APIs is critical for anything non-trivial. Abstracting parts of the problem enables you to solve each part separately. You can't keep 10k lines of code in your mind and tell whether it works correctly, but maybe you can split it into multiple modules and make sure each works correctly, separately.

-- when I have written a module that I am just not sure works correctly, I refactor it so that it is easier to tell whether it does what it should be doing.

-- make it a habit to write code that is what I call "obviously correct". One way to do that is to make sure it can never get into a wrong state.
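One common way to realize "can never get into a wrong state" is to make invalid states unrepresentable. The sketch below is hypothetical (the names are made up for the example): instead of tracking a connection with independent boolean flags, which allow nonsensical combinations like "connected and closed", the state is a single enum, so no sequence of assignments can produce an invalid combination.

```python
from enum import Enum


class ConnState(Enum):
    """The only three states a connection can be in."""
    DISCONNECTED = "disconnected"
    CONNECTED = "connected"
    CLOSED = "closed"


class Connection:
    """Tracks state with one enum, so "connected and closed" is unrepresentable."""

    def __init__(self):
        self.state = ConnState.DISCONNECTED

    def connect(self):
        # Only a fresh connection may connect; anything else is a caller bug.
        if self.state is not ConnState.DISCONNECTED:
            raise ValueError(f"cannot connect from {self.state}")
        self.state = ConnState.CONNECTED

    def close(self):
        # Only an open connection may be closed.
        if self.state is not ConnState.CONNECTED:
            raise ValueError(f"cannot close from {self.state}")
        self.state = ConnState.CLOSED
```

Compared with a pair of booleans like is_connected/is_closed, there is no way to express a nonsensical state at all, which is what makes the code "obviously correct" by inspection.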


> If you know the code works then there is no benefit of running it. If you need to execute it it is only because you don't know if it works correctly

People make mistakes. You have to run the code. Do you submit PRs without running the code? If not, then we're agreeing. Making code simple enough to reason about without executing it is a great goal, but you cannot replace running the code.

> You can't keep 10k lines of code in your mind and tell if it works correctly, but maybe you can split it into multiple modules and make sure each works correctly, separately.

But ultimately they need to work together, and that's the most important part. Again, you need to run the code here.

> when I have written a module that I am just not sure works correctly, I refactor it so that it is easier to tell if it does what it should be doing.

This is a good goal, though again, I hope you run the code. You don't know what you don't know.


You have a good point overall. However, can you truly reduce the definition of reliable code to something that simple? I think there are a lot more factors that determine if code is reliable or not.


I actually think that this is a good definition of reliable code -- the product of an individual developer's process.

It says nothing about whether the application was specified well, but an application can reliably execute the wrong process. There is no contradiction.

I don't want to say that making sure that your code does exactly what you intend it to do is at all simple.

As your application grows, as you start incorporating dependencies, external systems, code that came from others, things that you do not know very well -- writing code that does exactly what you intend it to becomes very difficult or even impossible.

But at the lowest level, when we are talking about a couple of variables, conditionals, loops, maybe a small data structure -- I would expect a developer to be able to write it to do exactly what is intended.

Because if you have that ability and use it for every piece of code you write, the overall result (even if it does not guarantee reliability) will be better than that of a person who cannot do it.

In my experience (and it depends a lot on our HR and the candidates being sent my way), only about one in ten candidates (for a senior Java dev position) can do that.


> If you write a small piece of code and can't tell if it is going to work or not you don't really understand what it does.

And if they don't understand it, they're going to B*&^% and Moan about having to fix the bugs once they come back.


The good news is their bitching and moaning won't last too terribly long. The bad news is that's because it will become clear they can't fix the bug, and someone else will end up doing it.


Btw I’ve seen the “advice” part faceplant terribly, where the interviewer is pompously trying to “unstick” the interviewee but it turns out the interviewee already understands the problem in a deeper sense than the interviewer and the interviewer is effectively ignorantly and annoyingly trying to derail the interviewee’s thought process without recognizing what they’re doing.

In general I’m deeply skeptical of interviewers coming to the table with their framework full of assumptions for how to evaluate the interviewee when in fact they have a solid chance of being the less smart person in that room and can’t get a readout on the smarter person.


Nice. The things you focus on seem like the skills needed for actual work, as opposed to interviewing.


I'm surprised that no one has suggested the obvious: Yes, tech interviews are flawed in the literal sense: they don't measure genuine or practical programming ability.

However, it's illegal for companies to require IQ tests when screening hires, and these sorts of interviews seem like a pretty clear stand-in for an IQ test. It doesn't really measure your ability to be a programmer, but it acts as a good-enough replacement for an IQ test: the problems are hard and complex, and require that someone actually spend the time to learn them.


Does it really measure IQ? Memorizing the most common algorithms and data structures will eventually land you a good job if you try several times.

For example: binary search, tree traversals, graph searches, sorting algorithms, interesting trees (trie, red-black, etc.), and many other common data structures and algorithms.

If you memorize the implementations of those, you only need to make sure you understand the problem and check for edge cases.

What I have seen is that a lot of people struggle in the interview because they lose a lot of time trying to implement the algorithms, even though they sometimes know how to get there.
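As a concrete example of the kind of template people memorize, here is the textbook iterative binary search; nothing here is specific to any one company's questions, it is just the standard pattern:

```python
def binary_search(items, target):
    """Return the index of target in a sorted list, or -1 if absent. O(log n)."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1      # target can only be in the right half
        else:
            hi = mid - 1      # target can only be in the left half
    return -1
```

The off-by-one details (lo <= hi, mid + 1, mid - 1) are exactly what people burn interview time rederiving, which is the argument for having the template memorized cold.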


It’s not a stand-in for an IQ test. If you study the problems, you get a better score. Therefore, not an IQ test.


You can study for IQ tests too.


Whenever a topic like this comes up, people inevitably talk about broken FAANG hiring and the stupidity of whiteboard programming (aka Leetcode).

I often worry that people who say this are somewhat missing the point of what interviewing is about, and thus lower both their own chances of getting more & better offers, and worse, if someone else is influenced by their writing, their odds go down too.

I say this as a hiring manager and someone who was responsible for company-wide engineering recruitment for a long time, not at a FAANG but at a company of that caliber that competed with FAANGs for talent. From that perspective, whiteboard programming provides a meaningful signal (though not the only signal) toward a hiring decision.

When I am hiring someone, I care about many things: how do they understand and solve problems, how do they prioritize and trade-off solutions, how do they collaborate and communicate, how they react when they are "stumped", how they react to coaching/hints/feedback/etc. And yes, whether they can code well, and how they deal with pressure.

So when I used to go through one of these problems with a candidate, I would form a picture of how they'd interact with their product manager (from the way they asked questions and clarified their understanding of the problem), how they'd interact with their technical team (from the way they explained their thinking and evolved their solution in response to feedback), how they'd be with me/their manager (from the way they took feedback and reacted to being coached/helped), how they'd be in an incident/outage (based on whether they were able to make progress despite the time pressure of the interview), and ultimately what their code would look like in my repo (from... the way their code looked).

Yes, obviously there are factors that make people perform differently in an interview than on the job, and as an experienced interviewer you try to discern that. But the bigger point is: if you are ranting against whiteboard coding interviews and you don't understand what I just wrote, then you are basically arguing from a position of ignorance. If you don't understand what the process is meant to do, then of course it seems stupid to you, but that's not a problem with the process...

I can also look at it from an ever more "real" perspective. Companies hire you to solve problems. To do that, you need the smarts to understand what it takes to solve the problem and the drive and persistence to actually solve it. So if the problem in front of you is "I want to be hired by Google", are you able to recognize that part of the solution is "I need to get good at whiteboard programming" and that a part of that is research and disciplined practice? If not, then you're sort of failing at the meta-interview, because your very approach to problem solving falls short of the level required to get these jobs.

By the way, there are plenty of great places to work that hire on other criteria, and what those places expect from programmers day to day differs to some extent from FAANG-caliber work. It's totally fine to work there; I just wish people had the self-awareness to know what FAANGs want and how they screen for it, rather than saying "oh, I am not good at that, but it's OK cuz it's stupid to begin with."


This does seem a little irrelevant unless you're just practicing analog whiteboarding. Almost every company I've interviewed for just buys some inane off-the-shelf Hacker Rank thing and sends it out to the whole world, followed by "I guess now we'll talk".


How does one prove runtime complexity? I would love to know how to do this, just out of interest.


Skiena's algorithm book goes into some detail on proving complexity and proving optimality.


Thank you. Are you by any chance aware of a more introductory text on the subject?


Believe it or not Skiena's book is an introduction to algorithms! His lectures for the book are available on YouTube.[0]

[0] https://youtube.com/playlist?list=PLOtl7M3yp-DX6ic0HGT0PUX_w...


Oh, perfect! Video lectures FTW! Thank you once again :)


I personally like to go line by line through my solution and state the runtime of each line of code. Then I add them together and write out the limit notation, showing how as N->inf I can cancel certain terms out. It's a little bit of overkill, but I've found it helps me stand out a bit during interviews.
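That line-by-line accounting might look like the following on a toy function (the function itself is hypothetical, chosen only to illustrate the bookkeeping):

```python
def pair_sums(nums):
    total = 0                  # O(1)
    for a in nums:             # outer loop: n iterations
        for b in nums:         # inner loop: n iterations each
            total += a + b     # O(1) body, executed n * n times
    return total               # O(1)
# Sum of the per-line costs: O(1) + O(n^2) + O(1). As n -> inf the
# n^2 term dominates and the constants cancel, so this is O(n^2).
```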


Usually in interviews it is not actual proof in any mathematical sense. It's more like: "I am calling sort() in a loop, so obviously this is O(n^2 * log n)", and then the interviewer nods in approval and you move on to the next question.
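The composition in that example can be sketched with a small (hypothetical) function: an O(n log n) sort executed inside a loop that runs n times multiplies out to O(n^2 * log n).

```python
def sorted_prefixes(nums):
    # The loop runs n times, and each iteration sorts up to n elements
    # at O(n log n), so the whole function is O(n^2 * log n).
    return [sorted(nums[:i + 1]) for i in range(len(nums))]
```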


Even if you perfect this grading rubric, it doesn't guarantee that you will get the job in an actual programming interview.


To all the naysayers, who proudly proclaim their professional pedigree and computer science degrees, but crash and burn through these interviews in vain: if you thought your Macbook Pro was a good investment, if you thought your 4 year degree was worth the time and effort, wait until you discover the returns from spending less than $1,000 USD on preparing for software engineering industry interviews. Swallow your pride and start getting paid!


I fail to understand why this content was published as a Google spreadsheet. Or as a spreadsheet in general.


Because it's a rubric? It fits a tabular format.


Sure, but plain html supports that. Heck, you can TSV a pastebin.



