Coding interviews are stupid (ish) (darrenkopp.com)
260 points by darrenkopp 7 months ago | 650 comments



Assuming that a company is not looking for candidates who are naturally good at ICPC-type questions, or geniuses who can come up with amazing algorithms in a matter of minutes, there is actually a different way to do a coding interview: just give a high-level description of a sufficiently sophisticated algorithm to a candidate and ask them to turn it into code. I think it strikes a good balance between CS depth and coding ability. This type of interview is similar to what engineers do at work, too. We rarely invent new algorithms, but we do read white papers or blog entries or books to pick an algorithm to implement.

There are many variations in questions too: search a tree or graph with some customized criteria, use a double buffer to implement a tree-style prefix scan, implement integer multiplication with unlimited digits, some streaming algorithm, tree-walking to transform one tree into another, a simplified skip list; the options are unlimited. A good candidate tends to grasp the high-level concepts quickly (and they can ask questions), and is quick to convert intuition into working code. I find there is a strong positive correlation between performance at work and performance in such coding interviews.
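
To make that concrete, here is a minimal sketch of the "integer multiplication with unlimited digits" variant (the digit-list representation, function name, and tests are just illustrative, not from any particular interview):

```python
# Hypothetical sketch of the "multiplication with unlimited digits" exercise:
# numbers are lists of decimal digits, most significant digit first.
def multiply_digits(a, b):
    # the product has at most len(a) + len(b) digits
    result = [0] * (len(a) + len(b))

    # grade-school multiplication, accumulating into result positions
    for i in range(len(a) - 1, -1, -1):
        carry = 0
        for j in range(len(b) - 1, -1, -1):
            total = result[i + j + 1] + a[i] * b[j] + carry
            result[i + j + 1] = total % 10
            carry = total // 10
        result[i] += carry

    # strip leading zeros, but keep a single zero for the number 0
    first = next((k for k, d in enumerate(result) if d != 0), len(result) - 1)
    return result[first:]

assert multiply_digits([1, 2, 3], [4, 5]) == [5, 5, 3, 5]   # 123 * 45 = 5535
assert multiply_digits([0], [9, 9]) == [0]
```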


I still don't get why such questions are even asked, as most jobs I've ever had didn't even remotely touch those, and I've touched quite a few industries, technologies and types of companies.

To me, the value of a software engineer is to ask questions, make hypotheses and be able to iterate quickly. Balancing trees, leetcode and other algorithmic stuff on the spot sounds like bringing the dreadful education system structure to the real world.

Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.


> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

This is why I've always been so confused. Why is the software engineering interview wildly different from the traditional engineering interview, where seniors sit down with candidates and discuss how to solve a relevant proxy for a problem the team is currently facing? That has the side benefit of making the interview process potentially fruitful even if you don't go with that candidate, though this can be (and sometimes is) abused. I mean... we all speak the same language, and that isn't standard English... right?


> Why is the software engineering interview wildly different from the traditional engineering interview

I have my personal theory.

1) Top companies receive way more applications than the positions they have open. Thus they standardised around very technical interviews as a way to eliminate false positives. I think these companies know this method produces several false negatives, but the ratio between the two (eliminating candidates who wouldn't make it vs. missing out on great candidates) is wide enough that it's fine. It does lead to some absurd results (such as a person interviewed to maintain a library being judged not qualified despite being its very author), though.

2) Most of these top companies grew at such rates, and hired so aggressively from top colleges, that eventually the interview was built by somewhat fresh grads for other fresh grads.

3) Many companies thought that replicating the brilliant results of these unicorns meant copying them. So you get OKR nonsense or interviews like these.


Yup. And 3) is particularly interesting. Lots of companies actually need to hire people who can get things done and who can build user-friendly software, yet they thought they needed to hire people who could turn any O(N^2) algorithm into O(N) or O(N log N).

And even for Google, leetcode has become noise because people simply cram the problems. When Microsoft started to use leetcode-style interviews, there were no interview-prep sites, and later there was at most Cracking the Coding Interview. So the people who aced the interview were either naturally talented or so geeky that they devoured math and puzzle books. Unfortunately, we have lost such signals nowadays.


> yet they thought they needed to hire people who could turn any O(N^2) algorithm into O(N) or O(N log N)

And the great irony is that most software is slow as shit and resource intensive. Because yeah, worst-case performance is good to know, but what about the mean? Or what you expect users to actually be doing? These can completely change the desired algorithm.

But there's the long-running joke that "10 years of hardware advancements have been completely undone by 10 years of advancements in software."

Because people now rely on the hardware for doing things rather than trying to make software more optimal. It amazes me that gaming companies do this! And the root of the issue is trying to push things out quickly and so a lot of software is really just a Lovecraftian monster made of spaghetti and duct tape. And for what? Like Apple released the M4 today and who's going to use that power? Why did it take years for Apple to develop a fucking PDF reader that I can edit documents in? Why is it still a pain to open a PDF on my macbook and edit it on my iPad? Constantly fails and is unreliable, disconnecting despite being <2ft from one another. Why can't I use an iPad Pro as my glorified SSH machine? fuck man, that's why I have a laptop, so I can login to another machine and code there. The other things I need are latex, word, and a browser. I know I'm ranting a bit but I just feel like we in computer science have really lost this hacker mentality that was what made the field so great in the first place (and what brought about so many innovations). It just feels like there's too much momentum now and no one is __allowed__ to innovate.

To bring it back to interviewing signals, I do think the rant kind of relates. This same degradation makes it harder to tell who's genuinely part of the in-group when there's so much pressure to perform like a textbook. But I guess this is why so many ML enthusiasts compare LLMs to humans: because we want humans to be machines.


Many software programs fail to achieve ultimate efficiency either because the software engineers are unable to do so, or because external factors prevent them from achieving it. I believe that in most cases, it is the latter.


I'd like to think it's the latter because it makes us look better, but I've seen how a lot of people code... I mean, GPT doesn't produce shit code just because it can't reason... it'll only ever be as good as the data it was trained on. I teach, and boy can I tell you that people do not sit down and take the time to learn. I guess this is inevitable when there's so much money. But this makes the interview easy, since passion is clear. I can take someone passionate and make them better than me, but I can't make someone in it for the money even okay. You're hiring someone long term, so I'd rather have someone who's always going to grow than someone who will stay static, even if the former is initially worse.

IME the most underrated optimization tool is the delete command. People don't realize that it's something you should frequently do. Delete a function, file, or even a code base. Some things just need to be rewritten. Hell, most things I write are written several times. You do it for an essay or any writing, why is code different?

Yeah, we have "move fast and break things" but we also have "clean up, everybody do their share." If your manager is pushing you around, ignore them. Manage your manager. You clean your room don't you? If most people's code was a house it'd be infested with termites and mold. It's not healthy. It wants to die. Stop trying to resuscitate it and let it die. Give birth to something new and more beautiful.

In part I think managers are to blame because they don't have a good understanding, but engineers are also to blame for enabling the behavior and not managing their managers (you need each other, but they need you more).

I'll even note that we jump into huge code bases all the time, especially when starting out. Rewriting is a great way to learn that code! (Be careful pushing upstream though, and make sure you communicate!) Even if you never push, it's often faster in the long run. Sure, you can duct tape shit together, but patchwork is patchwork, not a long-term solution (or even a medium-term one).

And dear God, open source developers, take your issues seriously. I know there are a lot of dumb ones, but a lot of people are trying to help and want to contribute. Every issue isn't a mark of failure; it's a mark of success, because people are using your work. If they're having a hard time understanding the documentation, that's okay, your docs can be improved. If they want to do something your program can't, that's okay, and you can admit that and even ask for help (don't fucking tell them it does and move on; no one's code is perfect, and your ego is getting in the way of your ego: you think you're so smart that you're preventing yourself from proving how smart you are or getting smarter!). Close stale, likely-resolved issues (with a message like "reopen if you still have issues"), but dear god, don't just respond and close an issue right away. Your users aren't door-to-door salesmen or Jehovah's Witnesses. A little kindness goes a long way.


> And the great irony is that most software is slow as shit and resource intensive

You really need those 100x faster algorithms when everything is a web or Electron app.


I’d add another factor to #1: this feels objective and unbiased. That’s at least partially true compared with other approaches like the nebulous “culture fit”, but that impression is also partly a blind spot, because the people working there are almost certainly the type of people who do well with that style, and it can be hard to recognize that other people are uncomfortable with something you find natural.


I would say that it makes the interview process more consistent and documented, and less subject to individual bias. However there's definitely going to be some bias at the institutional level considering that some people are just not good at certain types of interview questions. Algorithm and data structures questions favor people who recently graduated or are good at studying. Behavioral interviews favor people who are good at telling stories. Etc.


Yes, to be clear I’m not saying it’s terrible - only that it’s not as objective as people who like it tend to think. In addition to the bias you mentioned, the big skew is that it selects for people who do well on that kind of question in an unusual environment under stress, which is rarely representative of the actual job. That’s survivable for Google – although their lost decade suggests they shouldn’t be happy with it – but it can be really bad for smaller companies without their inertia.


Yeah I buy this theory.

The problem I have with it is that, for this to be a reasonably effective strategy, you should change the arbitrary metric every few years, because otherwise it is likely to be hacked and has the potential to turn into a negative signal rather than a positive one. Essentially, your false positives can come to dominate by "studying to the test" rather than "studying".

I'd say the same is true for college admissions too... because let's be honest, I highly doubt a randomly selected high school student is going to be significantly more or less successful than the current process. I'd imagine the simple act of applying is a strong enough natural filter to make this hypothesis much stronger (in practice, but see my prior argument)

People (and machines) are just fucking good at metric hacking. We're all familiar with Goodhart's Law, right?


I think (but cannot prove) that along the way, it was decided to explicitly measure ability to 'study to the test'. My theory goes that certain trendsetting companies decided that ability to 'grind at arbitrary technical thing' measures on-job adaptability. And then many other companies followed suit as a cargo cult thing.

If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation? Surely the skill of programming ability a) varies over an employee's tenure at a firm and b) is a strong predictor of employee impact over the near term. So I surmise that such companies don't believe this, and that therefore LeetCode serves some other purpose, in some semi-deliberate way.


I do code interviews because most candidates cannot declare a class or variable in a programming language of their choice.

I give a very basic business problem with no connection to any useful algorithm, and explicitly state that there are no gotchas: we know all inputs and here’s what they are.

Almost everyone fails this interview, because somehow there are a lot of smooth tech talkers who couldn’t program to save their lives.


I think I have a much lazier explanation. Leetcode-style questions were a good way to test expertise in the past. But by the time everyone starts to follow suit, the test becomes ineffective. What's the saying? When everyone is talking about a stock, it's time to sell. Same thing.


> If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation?

Probably recent job performance is a stronger predictor of near future job performance.


so having done interviews: just because the latter may be more common does not mean the hordes of people throwing a spaghetti made-up resume at the wall have gone away. our industry has a great strength in that you don't need official credentialing to show that you can do something. at the same time, it is hard to verify what people are saying in their resumes; they might be lying in the worst case, but sometimes they legitimately think they are at the level they are interviewing for. it was bad before the interest rate hikes, and i cannot imagine what the situation is like now that hiring has significantly slowed and a lot more people are fighting for fewer jobs.

i did interviews for senior engineer positions and had people fail to find the second biggest number in a list, in a programming language of their own choosing. it had a depressingly high failure rate.
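
For reference, a minimal single-pass sketch of that question (the names are mine, not from any interview; the lazy `sorted(nums)[-2]` mentioned downthread also works, at O(n log n) instead of O(n)):

```python
# Hypothetical single-pass answer to "find the second biggest number in a list".
# Duplicates of the maximum count once, i.e. this returns the second largest
# distinct value; adjust the comparison if ties should count separately.
def second_largest(nums):
    largest = second = float("-inf")
    for x in nums:
        if x > largest:
            largest, second = x, largest
        elif largest > x > second:
            second = x
    if second == float("-inf"):
        raise ValueError("need at least two distinct values")
    return second

assert second_largest([3, 7, 7, 2, 5]) == 5
```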


I had a candidate claiming over ten years of experience who couldn’t sum an array of ints in any language of his choosing.

This wasn’t an off-by-one error or a failure to handle overflow; he couldn’t get started at all.


Ten years of experience at one of those places where every keystroke outside powerpoint is offshored. Why would they know how to sum ints? Some people do start their careers as what could best be described as software architecture assistants. They never touched a brick in their lives, to go with the architecture image.


I have junior and senior students who struggle with fizzbuzz... But damn, are they not allowed to even do a lazy, inefficient `sorted(mylist)[-2]` if they forgot about for loops? That's the most efficient in terms of number of characters, right? haha

But I still think you can reasonably weed these people out without these whiteboard problems. For exactly the same reasons engineers and scientists can. And let's be honest, for the most part, your resume should really be GitHub. I know so much more about a person by walking through their GitHub than by their resume.


Using GitHub is discriminatory against people who don’t code on the weekends outside of their jobs, and most people’s job related code would be under NDA and not postable on Github.

To be a capital E Engineer you have to pass a licensing exam. This filter obviously is not going to catch everything but it does raise the bar a little bit.

---

As far as the root question goes, they are allowed to propose that, and then i can try to tease out of them why they think that is the best and whether something is better. But you would be surprised at the creative ways people manage to not iterate through a full loop once.


You're right. But a lot of people that are good coders code for fun. But you're also right that not all those people push their code into public repositories. The same is true for mechanical engineers. They're almost always makers. Fixing stuff at home or doing projects for fun. Not always, but there's a strong correlation.

But getting people to explain projects they did and challenges they faced can still be done. We do it with people who have worked on classified stuff all the time. If you're an expert it's hard for people to bullshit you about your domain expertise. Leet code is no different. It doesn't test if you really know the stuff, it tests how well you can memorize and do work that is marginally beneficial in order to make your boss happy. Maybe that's what you want. But it won't get you the best engineers.


Leet code, in the interviews that I do, is not the only thing I do.

But when I am asked to do a one hour part of an interview for four interview loops a week, all the preps and debriefings, and also do all my normal day-to-day deliverables, we need some fast filters for the obvious bullshitters. The interviewing volume is very high and there is a lot of noise.


Hiring people who code at all is discrimination against people who played video games instead of learning to code.


“Codes on their spare time” is not part of the job description, but “codes at all” is.

There are plenty of reasons not to code on spare time. If anything the people who are most likely to do that are often also the people who coding interviews are supposed to be privileging, fresh single college grads.

I don’t know how people would square the statements “take-home assignments are unpaid labor and unfair to people with time commitments” and then do a 180 and say “people should have an up-to-date fresh github that they work on in their spare time.”


If it would take the candidate "spending every waking moment of their lives coding" to have one or two small coding projects after a half decade plus in the field, that's a signal.

If you went to college but never made anything, that's a signal.

If you didn't go to college, and never made anything, just shut up and make something.


In a half decade plus some people pick up other commitments that are not side projects, like a pet, a child, sports, hiking, etc.

At the end of the day, it isn’t really relevant to the employer what is done in spare time off the job when they get hired, so it’s not like I should privilege their spare time projects over people who don’t do that, particularly if people don’t want to code off the clock. There are plenty of good engineers who do not code off the clock, and there are plenty of bad engineers who do.

Also, more often than not, coding off the clock for the sake of having a portfolio is not really indicative of anything. There aren’t, for example, review processes or anything like that in a one person side project, and unless I spend significantly more hours background checking who’s to say the side project is not plagiarized? People already lie on their resumes today.


In the time you took writing this comment you could've gotten the repo created and the title and description filled out. Writing text in a public readme.md would serve you better than sending it to me.


I have side projects, but I don't expect every candidate to, nor do I expect every candidate to be a religious reader of every HN comment thread.


I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

I think a side project opens up the opportunity to skip that for a project presentation. This is a lot more in line with real life as you would typically code and later present that work to others. You would defend it to some degree, why you made choice A vs choice B. If you created it, you'll be able to do that.

Doesn't need to be a huge thing. Just show you can do anything at all really at the junior level. Intermediates can show mastery in a framework with something with slightly more complexity than a "hello world".


> I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

godelski said your resume should really be GitHub. You could have said this instead of sarcasm.


Typically the people without side projects also make excuses to not do those either.

If I had a company I'd offer looking over an existing project or a project where you create a side project of your choice without any further direction.

So not mandatory but the easiest way to go probably. Once you apply to my company you'll have one to show for next time at least.

(If you want to write the project out on the whiteboard instead I guess feel free, that seems hard though.)


Many people do not have side projects. Few people working as software engineers were not tested in some way.

I think it's more useful and more fair to give candidates some direction when I request something. What scope is enough? What tests are enough? We define side project differently or you would expect much more time from candidates than I do.


> What scope is enough? What tests are enough?

How each candidate answers each question themselves tells you about them.


I used to think so. But real tasks have acceptance criteria. Seeing how candidates work with loose criteria has told me more than telling them in effect to read my mind.


It's so clear that people on this site have no friends, family, or hobbies. I also don't think you realize how ableist your comment is.

Some people have more trouble completing tasks for reasons that have nothing to do with talent or intelligence.


meaning that no. 1) is the right answer


> Why is the software engineering interview wildly different from the traditional engineering interview

One angle is that SWE is one of the very few professions where you don't need a formal degree to have a career. It's also a common hobby among a sizable population.

I think this is truly great. A holdout breathing hole where people can have lucrative careers without convincing and paying off a ton of gatekeepers!

But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

In our industry, you kinda have to start from scratch with every person.


> But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

> In our industry, you kinda have to start from scratch with every person.

Not really - in software people leave a bigger and more easily traceable track record than in any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to. A lot of stuff is directly visible on the Internet. In other engineering fields, you have to trust what the applicant says in his or her resume, and maybe at most you can call the previous companies he worked at for a reference. In software, a lot of the trail is online or easy to check, and you can still call.

Even for totally new graduates, it is better in software: it's much easier for a software undergrad to work part-time, or on a hobby project, or contribute to open source and produce something before he or she graduates, so that you can assess his or her skills. It's much harder for a mechanical or civil engineer to do that, so for them you have to rely solely on the relevant university/college and the grades of the candidate.


> Not really - in software people leave a bigger and more easily traceable track record than in any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to.

That only applies to software people who either (a) are getting paid to work on open source or (b) have enough spare time to work on open source as a hobby after hours. Option (b), in particular, usually implies having no children or other familial responsibilities.


And (b) implies programming is also a hobby. For a lot of people it is just their work. We cannot filter on that, as not many would be left to hire.


Nnno. You start from their experience and give the benefit of the doubt. As someone who’s been in software for 12 years, I don’t want to talk about writing algorithms. I want to talk about how to motivate 150 engineers to actually write code correctly or inspire a technical initiative.


To have good comparison/calibration between candidates, you should be asking the same question each time, so it can't be about the "problem the team is currently undergoing", because that's going to be something different every week/month.

In general however, of course, there is/should be a round of interview that covers architecture/system design. It's just that the coding interview is a different interview type, which gives a different kind of signal, which is still important. It doesn't replace architecture interview, it complements it.


> because that's going to be something different every week/month.

Why's that a problem? What you're going to be doing on the job is going to change at the exact same rate. But people also tend to talk about recent problems and those may be even a month old. Honestly, the questions are about seeing how the person would approach it. It is not about solving them, because you're going to be doing things you don't know the answers to beforehand anyways.

> It's just that the coding interview is a different interview type

For what reason? "Because"?


> Why's that a problem?

The first half of the sentence you're responding to answers this question already. Because you can't compare candidates fairly if you ask everyone a different question. Is a candidate who aced an easy question better or worse than a candidate who struggled with a difficult question?

> For what reason? "Because"?

What are you asking? Why is an interview where you ask about high level design different from an interview where you ask to write code? Isn't that like asking why an apple is different from an orange? They just are, by definition.


Mechanical engineering interviews seem to do the same as software: "Engineers always ask about beam bending, stress strain curves, and conservation of work. Know the theory and any technical questions are easy."

Basically an equivalent of simple algorithmic questions. Not "real" because it's impossible to share enough context of a real problem in an interview to make it practical. Short, testing principles, but most importantly basic thinking and problem solving facilities.


> Mechanical engineering interviews seem to do the same as software:

I've been an engineer in the past (physics undergrad -> aerospace job -> grad school/ml). I have never seen or heard of an engineer being expected to solve math equations on a whiteboard during an interview. It is expected that you already know these things. Honestly, it is expected that you have a reference to these equations and you'll have memorized what you do most.

As an example, I got a call from Raytheon about a job when I was finishing my undergrad. I was supposedly the only undergrad being interviewed, and the first interview was a phone screen. I got asked an optics question and said to the interviewer, "you mind if I grab my book? I have it right next to me and I bookmarked that equation thinking you might ask, and I'm blanking on the coefficients" (explaining the form of the equation while opening the book). He was super cool with that, and at the end of the interview said I was on his short list.

I see no problem with this method. We live in the age of the internet. You shouldn't be memorizing a bunch of stuff purposefully, you should be memorizing by accident (aka through routine usage). You should know the abstractions and core concepts but the details are not worth knowing off the top of your head (obviously you should have known at some point) unless you are actively using them.


I've had a coding interview (screen, not whiteboard) fail where the main criticism was that one routine detail I took a while to get right could have been googled faster. In hindsight I still doubt that, given all the semi-related tangents you end up following from Google, but that was their expectation: look up the right piece of example code and recognize the missing bit (or get it right immediately).

For a proper engineering question (as in not software), I'd expect the expected answer to be naming the reference book where you'd look up the formula. Last thing you want is someone overconfident in their from memory version of physics.


> Last thing you want is someone overconfident in their from memory version of physics.

Honestly, having been in both worlds, there's not too much of a difference. Physics is harder but coding you got more things to juggle in your brain. So I really do not think it is an issue to offload infrequent "equations"[0] to a book/google/whatever.

[0] And equations could be taken out of quotes considering that math and code are the same thing.


I had a senior engineer chastise me once for NOT using the lookup tables.

"How do you know your memory was infallible at that moment? Would you stake other people's lives on that memory?"

So what you did on that phone interview was probably the biggest green-flag they'd seen all day.


We live in the age of ChatGPT. It might actually be time to assess how candidates use it during interviews. What prompts they write, how they refine their prompts, how they use the answers, whether they take them at face value, etc.


Sure, and we live in the age of calculators. Just because we have calculators doesn't mean we should ban them on math tests. It means you adapt and test for the more important stuff. You remove the rote mundane aspect and focus on the abstract and nuance.

You still can't get GPT to understand and give nuanced responses without significant prompt engineering (usually requiring someone who understands said nuance of the specific problem). So... I'm not concerned. If you're getting GPT to pass your interviews, then you should change your interviews. LLMs are useful tools, but compression machines aren't nuanced thinking machines, even if they can masquerade as such in fun examples.

Essentially ask yourself this: why in my example was the engineer not only okay with me grabbing my book but happy? Understand that and you'll understand my point.

Edit: I see you're the founder of Archipelago AI. I happen to be an ML researcher. We both know that there's lots of snakeoil in this field. Are you telling me you can't frequently sniff that out? Rabbit? Devon? Humane Pin? I have receipts for calling several of these out at launch. (I haven't looked more than your profile, should I look at your company?)


I'm actually not talking about interviewees (ab)using ChatGPT to pass interviews and interviewers trying to catch that or work around that. I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

> I see you're the founder of Archipelago AI.

I don't know where you got that from, but I'm not.


> I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

The same way? I guess I'm confused why this is any different. You ask them? Assuming you have expertise in this, then you do that. If you don't, you ask them to maybe demonstrate it and try to listen while they explain. I'll give you a strong hint here: people that know their shit talk about nuance. They might be shy and not give it to you right away or might think they're "showing off" or something else, but it is not too hard to get experts to excitedly talk about things they're experts in. Look for that.

> I don't know where you got that from, but I'm not.

Oops, somehow I clicked esafak's profile instead. My bad.


You might as well ask how they use book libraries and web search.


I'm a chemist by education, so all my college friends are chemists.

Being asked a theoretical chemistry question at a job interview would be...odd.

You can be asked about your proficiency with some lab equipment, your experience with various procedures and what not.

But the very thought of being asked theoretical questions is beyond ridiculous.


Why, don't they get imposters? You sure run into people who can't code in coding interviews.


Because to be a chemist you need to graduate in chemistry.

What would be the point of asking theoretical questions?

There's just no way in hell people can remember even 10% of what they studied in college; book knowledge isn't really the goal, rather the goal is to teach you how to learn and master the topics.


Because to actually have those types of conversations you have to have legitimate experience. To be a bit flippant, here's a relevant xkcd[0]. To be less so, "in-groups" are pretty good at detecting others in their group. I mean, can you not talk to another <insert anything where you have domain expertise, including hobbies> and figure out who's also a domain expert? It's because people "in-group" understand the nuance of the subject matter.

[0] https://xkcd.com/451/


Doesn’t that comic more closely hew to the idea that some fields are complete bullshit?


That's one interpretation. But that interpretation is still dependent upon intra-group recognition. The joke relies on the intra-group recognition __being__ the act of bullshitting.


Hmm… I have a twist on this. Chemistry is a really big field.

My degree is in computational/theoretical chemistry. Even before I went into software engineering, it would have been really odd for me to be asked questions about wet chemistry.

Admittedly it would have been odd to be quizzed on theory out of the blue as well.

What would not have been odd was to give a job talk and be asked questions based on that talk; in my case this would have included aspects of theory relevant to the simulation work and analysis I presented.


And software and computing isn’t a big field? Ever heard of EE?


Half a dozen years ago in a conference talk, Joel Spolsky claimed credit for inventing these sorts of whiteboard interviews (with his Guerilla Guide to Interviewing), and that it had broken software engineering hiring.

https://thenewstack.io/joel-spolsky-on-stack-overflow-inclus...


FTA:

“I think you need a better system, and I think it’s probably going to be more like an apprenticeship or an internship, where you bring people on with a much easier filter at the beginning. And you hire them kind of on an experimental basis or on a training basis, and then you have to sort of see what they can do in the first month or two.”

Well, if he fucked it up, I don’t see any reason why his ideas can’t also fix it.


Unfortunately this only works for interns and new grads. Nobody experienced wants to take a job on an experimental basis.


Fortunately, people with experience have resumes, and it's easier to tell if they're BSing their resume.

Fuck man, people do this with engineers who work on classified projects. You all are overthinking it. You're trying to hyper-optimize for a function that is incredibly noisy and does not have optimal solutions.


And how would it scale to a number of candidates greater than one? A classroom full of competing peers? That's what talent shows are for.


Aren't probationary periods pretty standard, in many/all industries and countries too not just software?


Yes and such a system makes hiring so much easier because mistakes cost much less. But the US ties things like healthcare to employment so a company that has a reputation for firing people after hiring them (however legitimate) would probably be one people would avoid. In Sweden, for example, I’ve found interviews so much more reasonable. Then again, I had healthcare there regardless of employment.


Oh, I see. I'm in the UK, so more like Sweden; no experience of having healthcare tied to employment (other than optional private healthcare as a perk).


> Why is the software engineering interview wildly different from the traditional engineering interview

I am guessing here, but wouldn't a candidate for a traditional engineering role normally hold a college degree in a relevant field, so that part of quality assurance is expected to have been done by the college?


Candidates for software engineering roles normally hold a degree in a relevant field.


> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Being able to evaluate a person is a difficult soft skill to learn. An interviewer cannot learn or improve it overnight, nor in months, nor even years. This is basically being good at reading a person. Not to mention the issue of bias, which is highly subjective.

If an interviewer isn't good at this, the solution would still be to supplement your evaluation with a coding interview.


I've only ever had a single whiteboard interview in my career, and it was a single interviewer who preferred them (I accepted the job), but I have also walked through the backdoor via recommendations for all but 1 of my employers in ~20 years in the industry. From embedded in radio and television broadcasting, to medical robotics, to AAA games, with some excursions into web development. Every other interview at a company I accepted an offer for was a conversation with engineers about my experience and some hypotheticals.


If you talk shop with a mechanic, they're going to know pretty quickly if you actually know what you're talking about. In my experience, the same applies in our field.


> I still don't get why such questions are even asked

The thesis is not that these exercises are representative of work but rather predictive of performance.

Sales has a similar phenomenon with sports. While there is no athleticism involved in selling, many believe history in competitive sports to be a positive predictor of sales success.

---

You can reasonably argue whether leetcode accomplishes this well or poorly, but...

Always remember that the purpose of an interview (for an employer) is to predict performance. So you are looking for the resume screening + interview process that most accurately assesses that.


> Sales has a similar phenomenon with sports. While there is no athleticism involved in selling, many believe history in competitive sports to be a positive predictor of sales success.

This is interesting. I had never heard this before. Is there any research on this? For that matter, is there any actual research showing that leetcode-style interviews actually do predict performance? If so, do they do so any better or differently than an IQ test?


I recall reading that leetcode style questions are basically just a standin for an IQ test, except with plausible deniability for being directly applicable to the job.


I saw this claim. I never saw data.


So you're saying that someone wasting their time studying leetcode to pass a stupid game is a good indicator?

I would almost believe the opposite: if you actually pass those tests with flying colours, it shows me that you believe you needed to do that to be hired, while someone who's actually experienced would never in a million years step down so low.


I believe skill at “leetcode problems” is predictive of general programming skill. Someone who can solve leetcode problems can almost certainly learn css. But, clearly from reading comments here, not the other way around.

Personally I love leetcode style problems. They’re fun. And useful - I use this stuff in my job constantly.


I would be scared you are over-engineering and over-optimising things, though. I have seen people implement complex paradigms and weird optimisations instead of writing simple code, just to make sure everything is perfectly optimised.

E.g. optimising client-side code where N is likely never to be above 300, but instead of a few simple lines, writing a complicated solution to make sure it can scale to billions and beyond.

I would take any problem-solving energy and spend it on side projects instead of doing leetcode. I do like those exercises, but I enjoy building new things more, and that gives me practical experience, which I think is more important.


Skill gives you the gift of choice. You know how to write it either way around and it’s up to you to decide. Being able to correctly decide when to hack something inefficient together and when to optimise is another skill issue. It’d make a good interview question, that.


Yeah, but leetcode does not necessarily give you that skill or even prove it. And a talented problem solver would be able to find optimal and practical solutions when they are required and are not premature even without doing leetcode.

You might get false positives as well. E.g. you get people who are tunnel-visioned on leetcode, Cracking the Coding Interview and other common system design books; they know all the answers, but then they completely lack common sense day to day, and it can be hard to test for that if you are solely focusing on leetcode.


Of course. I’ve interviewed over 400 people in my career and I’ve never directly asked someone if they’ve done leetcode problems. I don’t care about actual leetcode. Looking at someone’s progress on leetcode as a replacement for an interview would be a terrible idea.

As an interviewer, I care about their skills - technical skills (like debugging ability and CS knowledge), social skills (can we talk about our work together?) and judgement. Your ability to understand data structures and algorithms is signal, but there is a lot more to a good candidate than knowing how to make a bit of code run fast. Knowing when to make code fast is, as you say, just as important.


It's a good indicator for someone willing to jump through arbitrary hoops to get promoted at a big corp: write all the paperwork as design docs, meet all the promo requirements, and make sure all the weird business requirements of corpo customers are met. If you are not willing to "step down so low", you are a perfect example of someone they want to filter out.


I didn't state my personal opinion above, but yes I have seen leetcode aptitude to positively correlate with day-to-day problem solving.


If your company uses leetcode to filter out employees then you are a leetcode team with your internal levels and ranks, not at all representative of the whole population of skilled IT people.


How would you, or anyone, know if your career is representative of the field?

I’m sure plenty of people spend their career never learning or using data structures and algorithms knowledge. But I suspect plenty of people spend their career using this stuff all the time. Eg people who work in databases, compilers, operating systems, video games, ai, crypto currencies, and so on.


I have seen Goodhart's law.


> The thesis is not that these exercises are representative of work but rather predictive of performance.

Well said. Just like in college, calculus (not analysis) and organic chemistry are used as filter courses. Of course, why the two courses, especially organic chemistry, are so hard for American students is another topic. I personally think that it shows the failure of the US education system.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

OK take it as read that your posturing has succeeded and we all agree that you're a brilliant interpersonal genius and the rest of us are all useless chumps. What then? The rest of us still need to interview and make hiring recommendations. Or are you suggesting that employers should fire anyone who lacks your magical talent?


Why would you read that?

I just mean that if you put 2 experienced people talking about a topic they both know, it should be pretty easy for both (or at least the interviewer) to get a rough understanding of the level of the interviewee.


> I just mean that if you put 2 experienced people talking about a topic they both know, it should be pretty easy for both (or at least the interviewer) to get a rough understanding of the level of the interviewee.

Well, in my experience it's not, at least not in a "hostile" context. Most technical people are used to assuming good faith in technical conversations, and there are some very smooth bullshitters around; it's easy to construct a verbal facade that only falls apart once you ask someone to actually code.


Anyone who can't see through the 'verbal facade' is not that good. They may be very smart, but not at a bona fide genius level.


It's not. You will find plenty of examples in this very thread.


Personally I think such questions have three values:

- Future-proofing. Unless I work for an outsourcing company, sooner or later I will want to push the envelope, or so I hope. And to push the envelope, one needs good CS fundamentals (maybe there are some exceptions in specialized fields). Think about React. It's a JS framework, yet to invent it one needs to understand at least compilers and graphs.

- Geekiness/talent filter. The same reason that the nascent Google and Microsoft, and elite companies like Jane Street, asked Putnam questions, Martin Gardner questions, ICPC questions, and clever probability puzzles. Whether it's a good idea is debatable, but at least those companies wanted that type of people, and they were hugely successful. Note the word filter: people who pass the interview may not be good, but failing the interview means the candidate may not be smart or geeky enough. Again, I'm not endorsing the idea, just exploring the motivation behind these interview policies.

- Information density. Assuming a company does want to time-box a coding interview to an hour, it will be hard to come up with a realistic question without spending too much time on the context. On the other hand, it's relatively easy to come up with a data structure/algorithm question that packs in enough abstractions and inspection points to examine one's programming skill.


> Think about React. It's a JS framework, yet to invent it one needs to understand at least compilers and graphs.

Are you hiring people to create new JS frameworks, or to use an existing one?


Well, I guess the example is more confusing than clarifying. I used it as a case of pushing the envelope. When Facebook needed a better solution for their feeds, they invented React, and I was saying that one would need to know compilers to build JSX and graphs to build the optimized virtual DOM. Yes, nowadays we just hire React users, but my point is that in the future we may have another moment where we need to invent something, and as a company founder I sure would like to hire the author of React or the like.


To use an existing one.

And tools are best used when they are understood.


Hiring, like many things - including engineering - is about tradeoffs.

Is someone more knowledgeable "better"? Sure. But sometimes it's about getting a specific thing done, and if you're hiring someone to improve a web thing that uses React, maybe you don't need someone who understands compilers and the kernel and how the underlying hardware works and all that. Maybe you can spend a bit less and get someone who will do a perfectly adequate job.


I don't know anything about quantum mechanics. Does that mean I don't understand how computers work?


The common advice is to understand one level above and one level below where you operate.

For example:

If you build computers, you should understand how the individual components are implemented.

If you build computer components, you should understand materials science and electromagnetism.

If you study electromagnetism, you should understand quantum mechanics and relativity.


How many React developers understand compilers & graphs? I'd wager that it's way below 10%.


Also, why do you need knowledge of compilers and graphs to create React?

Firstly, React is not compiled. Secondly, the graph or tree or whichever aspect comes naturally when you come up with the idea of how it would be best to maintain JS apps.

In fact, the most important experience should have been grinding out tons of unmaintainable JS apps to come up with something like that.


I wouldn't argue with that.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Actually most of them, including the really inexperienced juniors, have 'figured' you out in less than 15 minutes, or at least they have decided whether to hire you or not in 15 minutes. But they have to put on a charade of being fair.

Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.


Got any evidence for the following claim:

“ Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.”


Isn't that the definition of affirmative action? Which most companies claim to do (e.g Google [1]).

Admittedly Google also claims not to discriminate on the basis of protected characteristics, which is somewhat contradictory to the definition of affirmative action as positive discrimination [2].

[1] https://www.google.com/about/careers/applications/eeo/ [2] https://en.wikipedia.org/wiki/Affirmative_action


The Wikipedia article did not define affirmative action as positive discrimination.


“Affirmative action (also sometimes called reservations, alternative access, positive discrimination or positive action in various countries' laws and policies)…”

It calls them the same thing


No. Positive discrimination is a form of affirmative action in various countries' laws and policies. Positive action is another. Positive action is not positive discrimination. Paragraph 2 mentioned merely targeting encouragement for increased participation even if you did not know what positive action meant.


The text attests to both our interpretations.

1) By saying x is sometimes called y under some circumstances, the text implies x (Affirmative action) is equal to or a subset of y (Positive discrimination)

2) The second paragraph also suggests examples of affirmative action that wouldn’t constitute positive discrimination


The text would support your interpretation if the article ended after the 1st sentence. And the 1st sentence did not mention positive action.


He's close. The real bias is against extremely good looking guys named Rene who are also way smarter and charismatic than everyone else. Terrible bias.


I dunno. "Rene" sounds a lot like "renege." Which probably means you'll sign on to a project and abandon it halfway through or something. Very sus.

(I wish I could say my early interviewing rubrics were any better than this. We have come a long way as a people, we Silly Valley programmers.)


That kinda makes sense, unless you want Rene to be your replacement.


> Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.

I will agree that ageism is a thing, but 90% of all of my coworkers (who were software engineers) have been white males, so I cannot at all agree with this take otherwise.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Or it's not their call, theirs is just to work with them after.


>I still don't get why such questions are even asked, as most jobs I've ever had didn't even remotely touch those, and I've touched quite a few industries, technologies and types of companies.

I've had to work on tree traversal stuff multiple times in my life, anything low level GUI related will work with trees a ton.
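
For flavor, here is a minimal sketch of the kind of tree-walking that GUI work tends to involve (the Node shape, field names, and predicate are made up for illustration):

```python
# Hypothetical sketch: walk a widget tree and collect every node
# matching a custom predicate, iteratively so deep trees don't blow the stack.
from dataclasses import dataclass, field

@dataclass
class Node:
    kind: str
    visible: bool = True
    children: list = field(default_factory=list)

def find_nodes(root, predicate):
    matches = []
    stack = [root]
    while stack:
        node = stack.pop()
        if predicate(node):
            matches.append(node)
        stack.extend(reversed(node.children))  # preserve left-to-right order
    return matches

tree = Node("window", children=[
    Node("toolbar", children=[Node("button"), Node("button", visible=False)]),
    Node("panel", children=[Node("button")]),
])
visible_buttons = find_nodes(tree, lambda n: n.kind == "button" and n.visible)
assert len(visible_buttons) == 2
```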

I've also had to work with hash tables directly, and with memory caching layers.

I really should learn to write a proper parser, as I've had to write parsers multiple times now and they are always an ugly hack job.


Yep. In a project I’m working on at the moment (collab text editing), I’ve implemented 2 different b-trees, a skip list, 2 custom file formats and a dozen or so algorithms which do various graph traversals.

I get that this is uncommon, but if you scratch beneath the surface, most software (browsers, databases, compilers, OSes) is full of this stuff.

Even while I was consulting stuff like this would come up. At one company we were using a custom graphql wrapper around a CMS, and it was missing some functions we needed. The wrapper was implemented like a compiler from the cms’s data format to a set of query functions. Fixing it to do what we needed it to do was really hard and broke my brain a bit. But I did it. And I wouldn’t have been able to without understanding compilers and algorithms.

You can spend your whole career walking the beaten path adding features to apps and websites, and never traversing a tree at all. There’s lots of work like that out there. But if you ever want to go deeper, you’ve gotta understand data structures and algorithms. I know not everyone is suited to it, and that’s fine. But there’s definitely a reason big tech asks about this stuff.


> But if you ever want to go deeper, you’ve gotta understand data structures and algorithms.

I don't think this is quite right. I think it's more like:

If you ever want to go deeper, you've gotta be able to recognize when the problem you're solving fits a pattern for which good data structures and/or algorithms exist, and you've gotta be able to find, understand, and apply good reference material.

Solving this "knowing what you don't know" problem is the best and most important role of formal education, in my opinion. It's not as important to know a topic as it is to know that it exists, and some of the basic terminology necessary to get started researching it further.


Yeah I think that’s what I mean by “understand data structures and algorithms”. Or, I think your description is exactly what a useful working understanding looks like. You should know broadly what’s out there so if a problem comes up, you know where to look. (Would a hash table help? A priority queue? etc). And you should be skilled enough such that if you decide to use a red-black tree, you can find a good library or implement it yourself - with access to the whole internet as reference material. (And test it).

Nobody expects you to memorise a text book. But if an API gives you a list of items and you want to count the occurrences of each item, you should be able to figure out how to do that. And ideally in less than O(n^2) time if necessary. It’s surprising how many otherwise productive coworkers I’ve had who struggle with stuff like that.
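
Concretely, a small sketch of that counting example (the data and names are illustrative):

```python
from collections import Counter

items = ["a", "b", "a", "c", "b", "a"]

# O(n^2): scans the whole list once per item; fine for 10 items, painful for 10^6
slow_counts = {x: items.count(x) for x in items}

# O(n): one pass with a hash table, which is what this kind of question is really checking for
fast_counts = Counter(items)

assert slow_counts == dict(fast_counts) == {"a": 3, "b": 2, "c": 1}
```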


But this isn't what the leetcode interview tests for. Reversing a binary tree, or figuring out how to arrange the parking lot to fit the most cars or whatever isn't a working understanding, it's essentially memorization. Being able to memorize something like that takes intelligence and dedication, so it does a pretty good job selecting for that, but it also filters out a lot of people who for good and valid reasons don't want to spend hours and hours studying. Not even doctors/lawyers do this: they do it exactly once and never again.


> isn't a working understanding, it's essentially memorization

Boring questions that you've seen before are memorization. But there's thousands of interesting questions out there that will never show up on leetcode.

Two examples from times I've been interviewed:

- One time my interviewer gave me about 15 lines of C code that used multiple threads and asked me if it was threadsafe (it wasn't; think of something like the sketch after this list). Then he gave me some threading primitive I hadn't seen before and asked me to fix it. Well, I had no idea what the threading primitive was, so I was a bit stuffed. I asked him to explain it and he did, and then I (successfully) used it to solve the problem. He wanted to hire me, saying the fact that I hadn't seen that primitive before and still managed to figure out the answer within the interview impressed him more than anything else.

- Another time I was asked some more classic algorithm problem, where (in hindsight) the answer was clearly to use a priority queue. I didn't think of that, and under pressure I came up with some weird alternative in the interview. The interviewer messaged me privately after the interview - he'd gone back to his desk and spent the next hour trying to figure out if my harebrained idea would work, and he was as surprised as I was to realise it would. I told him I'd realised a priority queue was a good approach as soon as I walked out the door of the interview. I was offered the job.
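
For readers who haven't seen that style of question, here's a hypothetical Python analogue of the kind of race in the first example (the original was C; the shared counter and lock here are invented for illustration):

```python
import threading

counter = 0
lock = threading.Lock()

def unsafe_increment(n):
    global counter
    for _ in range(n):
        counter += 1       # read-modify-write with no lock: two threads can interleave here

def safe_increment(n):
    global counter
    for _ in range(n):
        with lock:         # the fix: serialize the read-modify-write
            counter += 1

# Only the safe version is run, so the assert is deterministic; unsafe_increment is
# shown for contrast (it's a data race whether or not it visibly loses updates).
threads = [threading.Thread(target=safe_increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
assert counter == 400_000
```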

I've never "crammed leetcode problems" in my life. I don't think that's what any interviewers are looking for. They're looking for people who can think on their feet and use DSA to solve real problems. AFAIK algorithm puzzle interviews predate leetcode. Algorithm questions have been used since the very early days of Google and (I think) Microsoft.


There's not a lot of difference between the algorithm interview and the leetcode interview--leetcode is just a story problem around some kind of basic algorithmic problem.

I've done multiple interviews where they use some site and you're expected to plow through 10 questions or whatever in 60 minutes. You can subscribe to the site to practice. Gazillions of employers use this site or one like it.


I agree that I think this is what most experienced people mean when they think of understanding data structures and algorithms.

The problem is that this kind of understanding is very rarely what coding interviews check for. They either don't check for this at all - instead just making sure people can write simple code while reasoning through a simple problem under time pressure - or they check for whether people memorized a textbook, looking for a specific non-obvious data structure and algorithm.

What I try to do, because I think it almost ticks all the boxes without asking for memorization, is ask questions that start with a simple "can you program at all" part, with follow-up parts that end up in a place where we can have a conversation (without implementation) about how different tradeoffs could be improved, which often leads to discussing what useful prior art might exist.

Unfortunately I think this still has very high false negative issues. I've worked with people who prove to be perfectly capable of noticing when an appropriate data structure will make a big difference in their actual work, without that coming out in their interview.


My recommendation is to have a lot of different stuff in an interview, so you aren't making a final judgement on someone over any individual part of the interview. That means the candidate can stuff up one or more parts of the interview, and you can still get good signal.

For example, do all the things you suggest. Get them to write some simple code. Talk to them about an algorithm problem. Also, give them some simpleish code with some failing unit tests and ask them to debug the code. (This requires some prep, but it's a fabulous assessment to do.) Ask them about their prior work. Just do everything you can think of, and don't give any specific part of the interview too much time or attention.

In my experience, this gives candidates a lot more opportunities to impress me. I don't really care if one person on our team is particularly weak on data structures. It's kinda better if we have someone who's an absolute gun at debugging, and someone else who's amazing at data structure work. That creates a better team than if everyone is impressive in the same way.


I mean, I agree, but this sounds like it could easily be a 3 hour interview :)


I think about time complexity and DSA all the time when programming. My personal view is that the people who claim it is unnecessary don't understand it and probably would be better off if they did.

I've seen lots of code that would be better if the author knew some basics. For example a report that took over half an hour to generate, I made a one-line change and cut the time to a few minutes - pretty sure I could have made it practically instant if I had taken the time to go through all of it.

And it's not like I'm some genius, I just understand the stuff I've been taught. Pretty sure most of my peers are supposed to have learned the same stuff, I think they just didn't really understand it.


In my experience, whether this is top of mind has a lot more to do with what people work on and with what tools than with level of understanding. For instance, in your example:

> For example a report that took over half an hour to generate, I made a one-line change and cut the time to a few minutes

In essentially all the work I've done in my career, this would be the result of expertise in SQL and the relational model, not in data structures and algorithms. I don't recall ever working on reporting code that isn't a dumb pipe between a SQL query and a mature library for writing CSV (or parquet or whatever). Sure, there are tons of data structures and algorithms on both the database server and client side, but that's not what I'm working on.

And I think this is pretty typical for people who mostly build "applications", that expertise in tools is more of a value-add than expertise in data structures and algorithms.

But having said that, I do agree with you that everyone benefits from these kinds of "fundamentals". Not just this, but also other fundamentals like computer hardware and systems, networking, etc. I think fundamentals are very useful, while also thinking that many people are good at their jobs without them.


In my case the processing was happening in our backend. I can't remember exactly why it couldn't be SQL; actually, it's possible it could have been. But changing it to SQL would have been a bigger change, and this wasn't really the task I was working on - I just happened across it while doing something else.

I have also seen and fixed similar travesties where someone iterates through a huge list making one query per element, when it was fairly trivial to rewrite it as a SQL join.
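
For illustration, a sketch of that kind of fix in TypeScript - `query` here is a hypothetical stand-in for whatever database client is in use, and the table/column names are made up:

    type OrderRow = { userId: number; orderTotal: number };
    type QueryFn = (sql: string, params?: unknown[]) => Promise<OrderRow[]>;

    // Before: N+1 queries - one database round trip per id in the list.
    async function orderTotalsSlow(query: QueryFn, userIds: number[]): Promise<OrderRow[]> {
      const rows: OrderRow[] = [];
      for (const id of userIds) {
        rows.push(...await query(
          "SELECT user_id AS userId, total AS orderTotal FROM orders WHERE user_id = ?",
          [id],
        ));
      }
      return rows;
    }

    // After: a single joined query - one round trip, and the database does the matching.
    async function orderTotalsFast(query: QueryFn): Promise<OrderRow[]> {
      return query(
        "SELECT u.id AS userId, o.total AS orderTotal FROM users u JOIN orders o ON o.user_id = u.id"
      );
    }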

Point is just that understanding what you're doing is valuable, and in my mind DSA is a fundamental part of understanding what you're doing. Anyway I think we agree :)


Unsurprisingly we've now reached the perennial "is premature optimization actually premature" of it all :)

Would it have been better for the person who originally wrote that just-iterate-the-list implementation to have been thinking about data structures and algorithms that would perform better? Opinions on this vary, but I tend to come down on the side of: Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.

My assumption when I come across something that turns out to be a performance bottleneck that is easy to fix with a better data structure or algorithm, is that the person who wrote that was consciously doing a simple implementation to start, in lieu of profiling to see where the actual bottlenecks are.

But I also understand the perspective of "just do simple performance enhancements up front and you won't have to spend so much time profiling to find bottlenecks down the line". I think both philosophies are valid. (But from time to time I do come across unnecessarily complicated implementations of things in code paths that have absolutely no performance implications, and wish people wouldn't have done that.)


> Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.

I don't agree. The problem with this approach is that there are some optimisations which require changes to how data flows through your system. These sort of refactorings are much more difficult to do after the fact, because they change what is happening at the abstraction / system boundaries.

Personally, my approach is something like this: Optimise first for velocity, usually writing as little code as possible to get something usable on the screen. Let the code be ugly. Then show people and iterate, as you feel out what a better version of the thing you made might look like - both internally (in code) and externally (what you show to humans or to other software systems). Then rewrite it piece by piece in a way that's actually maintainable and fast (based on your requirements).


I did mention that people disagree on this :)

I've moved both ways along the continuum between these perspectives at different times. I don't think there is a single correct answer. I'm at a different place than you on it currently, but who knows where I'll be in a year.


Totally fair :) I have the same relationship with static typing. Right now I couldn't imagine doing serious work in a dynamically typed language, but who knows what I'll think in a year too. From where I'm standing now, it could be ghastly.


Actually I think you're misunderstanding me. I'm not saying you should profile and optimize all the code you write, I'm saying that a basic understanding of algorithms, data structures and complexity analysis allows you to write better code without any extra tools. I didn't profile this report to find out why it took 30+ minutes to run. I just happened across some code, read it, saw that it was essentially two nested loops iterating through two huge (50-80k elements each) lists matching items by name, changed it to use a dictionary instead of the inner loop and that was that.
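
Roughly the shape of that change, sketched in TypeScript (the record shape and names are made up for illustration):

    interface Item { name: string; value: number; } // made-up record shape

    // Before: for each element of listA, scan all of listB by name - O(n * m).
    // After: index listB by name once, then do O(1) lookups - O(n + m).
    function matchByName(listA: Item[], listB: Item[]): Array<[Item, Item]> {
      const byName = new Map<string, Item>();
      for (const b of listB) byName.set(b.name, b);

      const pairs: Array<[Item, Item]> = [];
      for (const a of listA) {
        const match = byName.get(a.name);
        if (match) pairs.push([a, match]);
      }
      return pairs;
    }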

It's a trivial change, it wouldn't have taken any longer to write this the first time around. There is no excuse for doing this, it's just a dev who doesn't understand what they're doing.

That's my point. Understanding these fundamentals allows you to avoid these types of pitfalls and understand when it's okay to write something inefficient and when it isn't.


If I want to go deeper, I'll have to understand the trees _at that time_.

Not now, when I'm just re-exporting ESM node packages as CJS so our legacy system can work with them


Traversing trees recursively is so trivial. I have to do this kind of stuff all the time. Just last week actually (in some frontend code no less).

Graph search and B-trees I haven't done professionally since I left college though. But it is still good to know the theory when dealing with databases.

With a lot of these algorithms, it's more about knowing their characteristics than knowing how to implement them. For example, cryptographic algorithms can be complex, but having a good lib and knowing each crypto algorithm's characteristics is usually good enough for almost everyone.


> I've had to work on tree traversal stuff multiple times in my life, anything low level GUI related will work with trees a ton.

How many times did you have to write tree balancing code with no reference materials?


Bingo. You forgot to add "with someone literally looking over your shoulder," though.

I've written AVL trees, B-trees, red black trees, and a bunch of other things people have named here. But, right now, without looking at any references, I couldn't even tell you how to balance an AVL tree, much less sit down and write out code for it.


That's why these interviews select for recent grads. Or leet code studiers.

Yes we've all done this in university. We've learned the theory. We had to write an implementation of this or that algorithm in whatever language the university made us use.

And we also know that great minds took a long time to come up with these in the first place. These "basic algorithms" are not something you think up in 5 minutes after first learning that computers exist or that some problem exists.

Bin packing algorithms are another such thing. Sure, ask me interview "questions" like "please prove whether P=NP".

Eff off Mr. or Mrs. interviewer!


The exact same number of times I've been asked that during an interview: 0!

I do ask tree traversal questions when interviewing because I've had to traverse a lot of trees, so I think being able to do an in-order traversal of an already-sorted binary tree (which is only a handful of lines of code) is fair game.
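
For reference, this is all I mean by that - a minimal sketch in TypeScript (the node shape is just illustrative):

    interface TreeNode<T> {
      value: T;
      left?: TreeNode<T>;
      right?: TreeNode<T>;
    }

    // In-order traversal: left subtree, then the node, then the right subtree.
    // On a binary search tree this visits the values in sorted order.
    function inOrder<T>(node: TreeNode<T> | undefined, visit: (value: T) => void): void {
      if (!node) return;
      inOrder(node.left, visit);
      visit(node.value);
      inOrder(node.right, visit);
    }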


Just the once? [maths joke]


In-order traversal is simpler and more practical than the questions xandrius asked about.


I disagree with some of this.

At some of my past jobs and the current one, this kind of algorithmic knowledge was important to build features that were differentiators in the market. As much as people love to pretend, not every single possible solution is in a library. Sometimes you're the one building the library.

It doesn't have to be leetcode, but candidates should at least be able to produce some code that doesn't come from the README of their favourite framework.

Also, talking for 30/45 mins can be enough, but it produces false positives when you have people coaching candidates. I've had people completely ace interviews, to the point that it felt like the perfect candidate. Well, it was rehearsed. When I asked a fizz-buzz-type question, they completely messed it up.


> this kind of algorithmic knowledge was important to build features that were differentiators in the market

I agree with this, but with the caveat that it's extremely rare to come up with a truly novel algorithm in a production environment. Those almost always come out of academia or R&D departments.

So is it important that people remember how to implement algorithms from scratch? Or is it important that they know when to identify when an existing algorithm would be useful?

For instance, if I see that something is suspiciously like a Stable Marriage Problem, do I need to remember how to implement the Gale–Shapley algorithm? Or is just the ability to recognize the problem as something that has a particular solution 90% of the way there? I would argue yes.

That said, I'm not sure how to test this in an interview setting.


You don't need something "truly novel" to have a market advantage. Far from it.

You just need something that is not ready made.

Remember that not every algorithm is neatly package-able, or has a cool name like "Stable Marriage Problem" or "Levenshtein Distance".

Also: maybe your advantage is implementing some of those in a new language. Maybe your advantage is implementing one of those in SQL! Maybe you have a tree-like or graph-like structure, and you need to do something with it without having to export into some format expected by certain packages. Knowing what to implement is important too.

Also, those interviews are often testing for something even simpler than those fancy algorithms.

Nobody uses Fizz Buzz or Recursive Fibonacci daily, but you might need to implement something that is beyond creating views/models/controllers in your favourite MVC framework.

This is what the test is checking for: coding beyond what's on the README of your framework.


Unless you have data to show that your "different way" produces better results, your idea sounds exactly the same as every other idea, which is basically garbage.

Lots of people have lots of ideas on what makes a great interview question, but none of them are backed by data. LeetCode-style algo questions USED to be an indicator of intelligence as per Google's investigation, but now it's so heavily gamed that I doubt there's any signal behind it anymore.

Someone needs to spend time and effort and money and actually do random controlled tests to see if people who pass a particular type of test actually make good employees. But so far no one has done this, except Google as far as I've seen. But even now, I think there's no evidence that even algorithm questions are any indicator, given what I've seen.


Don't discard good anecdotal evidence because there's no proper randomly controlled trial. If you're experienced/senior in the industry, you've probably worked with dozens of developers closely and with hundreds less closely, you've seen many projects and designs, and you've interviewed and been involved in the hiring of dozens of people.

This random controlled trial you're talking about is so hard to do because there are infinite confounders. How do you even measure success? After how long? How do you separate politics from technical ability?

In reality people above a certain bar are able to make some contribution to most projects. You can't and don't need to staff every project with superstars. The superstars are rare but are not hard to identify. They'll have the track record, reputation and the skills.

Another thing to consider is that the best outcome of interviews depends on the quality of the pipeline into those interviews. If you can't get good employees to apply you won't get good employees hired. Pay well and build a company that people want to work for, and then your hiring is going to be easier.


I guess we could gather some reasonable evidence about hiring approaches from HN hirers: what percentage of hires using your favourite technique/interview question do you think worked out as a success?

My rate is around 50% good, 50% I wish I hadn't hired.

Anyone else want to fess up?


I've had a much better rate than that. What does "wish you hadn't hired" mean, exactly? Did you completely misjudge them, or did you have an idea of what you were getting and still hired them? Being way off on 50% of interviews feels like way too high a failure rate.


> Unless you have data to show that your "different way" produces better results, your idea sounds exactly the same as every other idea, which is basically garbage.

Do you have evidence that the standard coding interview works? (There's evidence that it doesn't)

I'm with you that the claim might be too strong to say "this is the way" but that's because I'm of the (very strong) opinion that interviewing is an extremely fuzzy process and there are no clear cut metrics to measure one's abilities in a significantly meaningful way. a̶l̶l̶ ̶m̶o̶d̶e̶l̶s̶ ̶a̶r̶e̶ ̶w̶r̶o̶n̶g̶ ̶b̶u̶t̶ ̶s̶o̶m̶e̶ ̶a̶r̶e̶ ̶u̶s̶e̶f̶u̶l̶ There are useful interviewing methods but certainly not "the best" method. Trying to just mark checkboxes only leads to mediocre results. The reason we're generally okay with this is because we more often than not don't need rockstars and it doesn't make sense to put a lot of time and energy into this process when we get sufficient results through lazy methods.

FWIW, a typical (non-software) engineering interview really just involves high-level discussions like the OP suggests, but even without the implementation. It is generally about seeing how the person thinks/problem-solves and looking at problems they solved in the past. It isn't resource intensive and is good enough. Because the truth is, you don't know what someone is like as an employee until they are an employee (and even then this is fuzzy).


> Do you have evidence that the standard coding interview works?

No, I think that they are all garbage.

The only way to really hire is by having a vibe check to see if they are someone the team wants to work with, making sure that the person seems competent and has a reasonable chance of being very productive, and then hiring them quick. Give them a month and if they don't seem like a good fit, then fire them, with 2 months severance.

This is the only way I've seen that will produce a great team quickly, by hiring quickly and firing quickly. This is similar to what Netflix does but they also pay top of market which not too many companies can afford, but it produces the best results.


Seems we're on the same page: https://news.ycombinator.com/item?id=40291828

> Give them a month and if they don't seem like a good fit, then fire them, with 2 months severance.

And this! I think this is quite effective and efficient. You aren't wasting anyone's time and making sure the person has adequate time to find their next income source. It immediately makes me respect you and I think would build high employee loyalty.


As long as you want to only hire people who don't have an existing job or competing offers.


But we're talking about "seems competent and has a reasonable chance of being very productive". That's why you're interviewing in the first place. If you don't ask them to demonstrate their competence in any way how do you know they're competent? If they can't "produce" anything during an interview (even something trivial) how do you know they'll be productive? Assuming a complete stranger with no references you can trust.

A month might be too short a time for certain roles. I think people would be hesitant to take a risk with you if they know this is your policy. I would say that at any point where someone is clearly not a fit, they should be let go. You might know after a month (if they're terrible), you might know after 6 months, or they might progress initially but stall. I think e.g. with new grads it's going to take a little longer in general, since they have a pretty long growth trajectory.


> LeetCode-style algo questions USED to be an indicator of intelligence as per Google's investigation, but now it's so heavily gamed that I doubt there's any signal behind it anymore.

Where can I find the data on this?


All fair points, and I agree with you on rigorous experiments. For now we are talking about only intuitions.


I have a different take on this. I ask a fairly simple coding question and have them write code. No fancy algorithms to memorize or competition style coding.

I want to see if they can think in code. If they can translate something simple to written code. There's always a chance to have a conversation as well to probe different aspects of their understanding.

A few of those by a few people and I think you get a pretty good read.

Mix in some questions about their work and/or some general knowledge. I've also given people code to read (real world code) and explain to me.

For Algorithms and Data Structures I can look at their grades if they're fresh out of school. But we all know most of our work is not that (someone's gonna show up on the thread and say differently - I'm sure ;) ). If you can think in code and have the right mental models implementing an algorithm from a book isn't hard if you really have to.


I used that approach too for a beginning frontend dev: see if they can get a solution to straight-forward problems, and ask them to explain it afterwards. They got all the time they needed, and a clean laptop with wifi. Two candidates came out well, two others didn't. That was a bit shocking, given the low bar.

But it won't cut it for a senior. Such people should be able to answer a wide range of questions, including some algorithm and data structure stuff. Not that they have to be able to code a min-max-heap from scratch, but they have to know what's out there, how to use it, and how to keep the overview of an entire project. That's not going to be evident from a coding interview.

So, horses for courses.


I agree. For more senior people I'd mix in systems, architecture, design trade-offs and such. I'd want to see they have actually delivered something in their career and dig into the details.


And even for Google, leetcode has become noise because people simply cram the questions. When Microsoft started to use leetcode-style interviews, there were no interview-prep sites. So, people who aced the interviews were either naturally talented or were so geeky that they devoured math and puzzle books. Unfortunately, we have lost such signals nowadays.


Your "solution" is just another coding interview.

A typical engineer doesn't begin their day thinking "I should implement a streaming algorithm" out of nowhere (and if they do, they can always seek reference). They analyze a (typically underspecified) problem, figure out the constraints, and then start thinking about a solution. Note that both processes, analysis and solution, can take hours, days, weeks or months!

Coding interviews have everything backwards by providing synthetic solutions to mostly out-of-context problems.

For example, interviewers could:

- Share some portion of a codebase with the candidate prior to the interview

- Go over the code with the candidate

That's it. You will know really fast who's senior and who has never written software. It can't be faked. And it can be adapted to almost any position (backend, frontend, devops, architecture, whatever).


My approach is to ask the candidate to show me some code they've written (it could be for a class project, but one requirement for me to interview them is that they have a GitHub) and explain the choices they made or other anecdotes that surround the project.


Good idea. But the focus on algorithms is maybe overkill for at least 80% to 90% of companies. Perhaps laying out some subset of the business requirements or problems the company is working on and asking them to turn that into working code would be more suitable.


What is ICPC?


International Collegiate Programming Contest


Thanks


My attitude towards code interviews is to politely decline them and wish people the best of luck hiring a junior developer, because that's obviously what they are looking for. I'm closing in on 50, so not that junior anymore. If anyone has any doubts about my coding abilities after reading my CV, browsing my Github repos, and talking to me, then it's not going to work and we can both save ourselves some time. I've stopped responding to recruiters as I rarely see any evidence of them even being capable of doing these simple things.

I've hired and interviewed a lot of people over the years. Mostly without the help of recruiters. I like to think I'm pretty good at that, actually. I always look for people that are eager to learn. I care more about what they don't know than what they can regurgitate because they did lots of silly online prep. If people are young and fresh out of school, my assumption is they are going to have to learn a lot in a hurry. So I look for people that are curious, eager, and have an open mind. The best way to do that is to get them slightly out of their comfort zone and talking about how they would tackle things. The signs I look for here are enthusiasm, ability to surprise with good answers, etc.

This generally does not involve code interviews; though I might ask some targeted questions about how they would deal with certain things in languages, frameworks, etc. they list as being proficient in. I've of course worked for some people that insisted on me organizing code interviews and it's a chore and IMHO you learn absolutely nothing that isn't obvious if you just casually talk to people for 30 minutes or so. I usually argue against such things and prefer just talking to people directly.


I totally understand your point of view, but looking at it from the other side it’s not as simple. We had candidates with 10+ years of experience on their resume; talking to them, it seemed they knew what they were doing, and they showed some of the code they’d supposedly written. Then they got hired and it turned out they couldn’t code - their PRs were below junior level, constant bugs, abysmal communication, estimates overshot by 3-4x, etc.

This happened to us twice, and after we introduced a simple live coding session to our interviews, where you have to implement a simple real-world web component (for a frontend position), the problem of bad hires suddenly disappeared almost entirely (you can’t judge someone’s character during a short interview, but that’s another issue).


We recently interviewed a candidate with 7 years of experience at a FAANG company who failed a really basic coding interview - and we don't even make you write code.

I have no idea what these people expect. We ask coding questions, we have basic programming skills as a requirement, what do they think will happen if they get the job? Do they expect to just magically learn the required skills or hope that nobody will notice that they're not able to close their tasks?


Do you think working at a FANG would make you a better developer? I’m sorry, but I think that’s a bad expectation; hire from a startup if you want great developers - their span of control and influence is much wider. Most of us at FANG are pigeonholed into a very narrow topic, have to work with really annoying, slow and complex internal build systems, and write very little actual code. Also, unless you have global-scale problems, a lot of the skills developed working at that level aren’t useful to smaller orgs and will just drive up costs. Don’t look to FANGs for guidance is my advice.


> Do you think working at a FANG would make you a better developer?

No, but I do expect any reasonably competently run company to figure out that the person they hired isn't actually able to do their job, especially after 7 years. I was also under the impression that this company actually did coding challenges as part of their interview process, yet this person managed to slip through. On the other hand, our approach of just talking about programming in general terms seems to spot this type of candidate pretty quickly.

One of our most used questions is to ask people to tell us about something they don't like in a language or tool (one the candidate is very familiar with or enjoys using). It's pretty hard to imagine even a senior Java developer who doesn't have some pet peeve about the language.


> No, but I do expect any reasonably competently run company to figure out that the person they hired isn't actually able to do their job, especially after 7 years.

Oh, they did. This person probably did not get paid as much as their peers. A great deal of compensation is discretionary.

Whether any decision-maker actually stood to benefit from removing this person from their position is another matter, however. Firings seem very uncommon, and layoffs depend on business conditions.


It's less that working at a FANG produces better devs and more that FANG can afford to be picky and so having worked there is a sign you passed a high hurdle once.

Kinda the same idea as having Harvard or Stanford as your alma mater. Most schools will teach you most of the same stuff, but those universities only take the "best". If your idea of "best" is similar, you'd take the fact that Harvard also liked the person as a good signal.


Isn't Harvard 10% of the best on scholarships and 90% children of rich and connected parents? Not total idiots, but still.


> Do you think working at a FANG would make you a better developer?

Yes...? I mean look at the complexity of the products these companies build?

I would expect if you've contributed on say Chromium or some EC2 networking layer that you're quite competent...is that somehow unreasonable?

These companies tend to pay multiples of typical "enterprise" CRUD developers because the work is that much more complex.


15 years ago fang devs were some of the best. Now it feels like all of fang is crumbling under tech debt and enshittifying executives, like AOL did.

You love to see it!


> Do they expect to just magically learn the required skills or hope that nobody will notice that they're not able to close their tasks?

It obviously worked for the candidate you mention, didn't it? Spent 7 years at FAANG, didn't know how to code.


That is true, but how?


There are lots of roles at big companies that don't involve actually writing code. The bigger the company, the more communication and organizational overhead there is, and the more not-actually-coding work there is. Even within roles that have a "software engineering" title.

Hell, I barely touch any code in my current company. The only reason I can still code is because I love it, started programming at about 11 years old, and still do it in my spare time. If it was "just a job" for me, then no doubt the skill would atrophy.


Some people genuinely freeze up or overthink to their detriment in an interview setting, despite being competent otherwise.

Happened to me a couple of times back in university when I had to get an internship for one of the semesters.


This. I freeze up when put on the spot; otherwise I can actually code, but it doesn't seem like it when doing coding interviews.


I interview best when I'm relaxed and it doesn't feel like there's a lot riding on the interview. What has worked for me is to interview early (when I feel "maybe I should leave my current job", rather than "I have to leave this awful job ASAP or I'll lose my mind"). Even then, I've had bad interviews - where it's like forgetting your own phone number or PIN; It's hard to recover from your brain short-circuiting early on, I suspect my interviewers may have thought I'm a fraud too, but such is life.

Interviewing has plenty of randomness - on a few occasions, the stars were aligned and I solved sequences of very challenging technical questions much quicker than the interviewers had planned, which gave the impression that I'm some sort of genius. That performance was not representative of my usual capabilities, but I didn't tell them that (:


Some people freeze up when they have to do their jobs...


Out of curiosity, is it possible he lied on his resume about being a FAANG SWE?


The IT job market is a two-sided lemon market: people may lie on their CVs, but companies can be very abusive as well. If you don't do what OP suggests, people will not respect you. I've had five developers join my interview call to see if I can implement a palindrome checker. This is proof that none of them bothered to read my CV and check out my GitHub projects. This is not respectful, and you have to stand your ground against such nonsense. Let them know this is not what you expect from people working for you, and that you expect better from them in the future. If they don't like that, you won't have lost anything.


Was it the task itself you feel was disrespectful, or the way they ran it? Anyone can put up a fake/inflated CV, so it's not really proof they didn't bother to read it; GitHub can be easier or harder to fake/inflate depending on the projects. Doing something like a quick palindrome checker to show "Hey, yep - I'm real, not some random inflated applicant you probably got 10 applications of alongside mine" seems more than reasonable/respectful (by both sides). I can also easily see how 5 people can hop on and completely run that task in the wrong way and come off as know-it-all douchebags or the like, but that'd be a different, independent issue really.


Why even show up with 5 people?


Same experience, same conclusion. I used to hire without coding interviews; I don't anymore.

Although I see it as a way for the candidate and me to talk about code in general, and to assess their level. Not as a simple barrier.

I hired a candidate who completely failed a code interview because he was super nervous, but just talking about the problem made me quite sure he was actually good.


Just by introducing a coding interview, how would you know they will not overshoot estimates by 3-4x? It's not one person's job; normally estimates are done by the entire team.


So have the candidate do a small project and submit a PR.


The problem there is that the amount of time being wasted is asymmetrical since the employer isn't present, so the typical experience from the candidate's perspective is spending an evening working on a project and then getting no feedback and a canned rejection letter.


It doesn't need to take an evening; just something simple can be a great indicator.


Do that constantly and you get free employees.

Just kidding, but you should value your applicants' time. You could just as well show them bad code that has a problem and ask them what they notice when they look at that code. If they are good, they'll point out the problem(s); if they are bad, you will probably be able to figure that out just from asking them.


FWIW I always ask some really super simple coding questions in interviews, even for really senior people with apparently stellar CVs. Let them pick their language or use pseudocode or whatever.

It's surprising how many 'senior' engineers are actually BSers who will slow you down or derail you completely rather than speed you up and you need to spot them because they will excel at getting through the non-technical interview filtering!

Also, I'm interested in how you explore things and explain things. I'm not actually interested in acquiring an implementation of FizzBuzz or whatever. I just want you to show me that you 'get' it and then we can get on to the interesting stuff like 'tell me about your last project' etc.

So don't be too hasty to think the people doing technical interviews are idiots thinking devs are interchangeable cogs etc.


> Also, I'm interested in how you explore things and explain things. I'm not actually interested in acquiring an implementation of FizzBuzz or whatever. I just want you to show me that you 'get' it and then we can get on to the interesting stuff like 'tell me about your last project' etc.

Sounds like you'd more valuably/realistically get that from the discussion of a previous project though? How it works, or something interesting they had to figure out, etc.?

> don't be too hasty to think the people doing technical interviews are idiots

I don't think anyone has a problem with technical interviews? At least I agree that's not reasonable. That doesn't have to mean 'coding' though, you can ask about how they'd approach a particular problem, quiz on some fundamental knowledge in a fizzbuzz sort of way, etc.

You can more easily tune it to the kind of candidate you're looking for then too, for example I've asked how they'd tackle improving the performance of a particular SQL query that's been identified as too slow. There's a tonne of possible answers to that ranging from naïve/they don't really know, through pragmatic good fit responses, to way overkill I hope they understand we don't need them doing that here/not operating at that scale etc. - and it's fairly open-ended in what you can discuss driven by what they volunteer and know about. (Which is another good thing IMO, I don't like being on either side of interviewer quizzing and candidate umming and ahhing not really knowing! Both more comfortable and more beneficial to have a discussion about whatever is known IMO; to quickly move along the perimeter if you hit the edge of that.)


> Sounds like you'd more valuably/realistically get that from the discussion of a previous project though? How it works, or something interesting they had to figure out, etc.?

You may be underestimating people's ability to bullshit their way through this sort of discussion.

It's harder to bullshit your way past a blank file in a code editor.

I'd wager that something like FizzBuzz will eliminate 90% of the chaff. Yes, it's laughably simple. No, it's not so simple that it won't stump a significant number of folks who've coasted for years.


FizzBuzz is an incredible filter for devs. Any kind of async JavaScript that needs to do something with the data after it fetches it has also been a winner. Lately the biggest weeding tool has been asking candidates to fetch some JSON from a bucket, then visualize that data. I'd say that maybe half the people manage to fetch the data in 30 minutes, and of that, maybe 10% get the data fetched within 10. The other 20 minutes is trying to help them recognize that their data is initially being rendered with "undefined" and they have to update the viz.
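
The rough shape of the answer I'm hoping to see, in TypeScript - renderChart and the URL are placeholders, not any particular library:

    // Fetch the JSON, then render. The "undefined" render people run into comes
    // from drawing the chart before the data has actually arrived.
    async function loadAndVisualize(url: string): Promise<void> {
      const response = await fetch(url);
      if (!response.ok) throw new Error(`Fetch failed: ${response.status}`);
      const data: unknown = await response.json();
      renderChart(data); // only render once the data is in hand
    }

    // Placeholder stand-in for whatever visualization library is in use.
    function renderChart(data: unknown): void {
      console.log("rendering", data);
    }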


But are you filtering for realistic performance, or for performance under time & observer pressure?

I recently did a similar toy problem, a bit more complex than FizzBuzz (on HackerRank, if you're familiar - I hadn't used it before, but I assume it's generally similar). It's nothing I can't do, and it passed the visible test cases, but it was clumsy and far from representative of what I'd actually write on the job, I think - even if I'd still only spent an hour on it, just from not having that pressure. Plus I suppose I would likely have seen the problem before in planning, and realistically it would be something within a context I was familiar with from working on it every day, not a toy problem. (This wasn't it, but imagine something like calculating the possible states of a chess board after each player makes another move. A bit simpler though.)


I think you're completely ignoring the reality of frauds. Jeff Atwood was writing about this over a decade ago with "FizzBuzz".

There are many people who spend more effort creating the illusion of competence on paper and on the job, getting harder to detect the higher they go.

We as a profession (software engineers) have continually resisted broad unified certification like other engineers which could be a replacement for code interviews to assess competence, but would have other drawbacks.

So we are stuck with code interviews to ferret out BSers. And even then it sometimes fails. But it appears to be the best tool we have, because there is nowhere to hide. Don't take it so personally.


> There are many people who spend more effort creating the illusion of competence on paper and on the job, getting harder to detect the higher they go.

If you can navigate a software engineering position, purely undetected, by bullshitting it, I would say you can be a very good manager. You can probably handle high level concepts without knowing the implementation details.


This is basically what happened, the industry turned manager-heavy and expelled a lot of talent, replacing lifelong developers with bootcamp devs and other non-tech background people. It's kinda messed up because the lifelong developer types got called nerds growing up, had to learn what they know in the face of bullying and computers being very uncool, only to be basically replaced by those who made fun of them when it finally became "cool" and lucrative to be a developer.

Because it's all managers and no talent now, there's like an Interview Industrial Complex that emerged, where most teams spend the majority of time and energy just interviewing thousands of people and hiring/firing (via manufactured drama) while they never really build anything - it's all these managers know how to do because there are so few real developers left.

Some of the best developers I know of (of libs I use, etc.) outright refuse to work in the infantile conditions of the modern corporate setting anyway. The lucky ones have found other revenue streams and spend their coding energy on open source or personal ventures.

I talked to a young founder the other day - maybe 10 years younger than me, in his 20s - who said multiple times he was "retired", he kept waiting for some kind of validation on my face I guess but I just don't find it impressive. I lost respect actually, having heard that. In his mind he thinks he's a baller, in my mind he's a lazy egomaniac who knows 4 total things - I wouldn't even let this kid mow my lawn.

Smart, talented people just aren't valued anymore - it's more about prestige and authority now. But maybe not forever, they're certainly leaving themselves wide open at the advent of this LLM thing. Would love nothing more than the big tech ship to sink and get displaced by smaller, smarter companies.


> Smart, talented people just aren't valued anymore - it's more about prestige and authority now.

At the core of any corporation that isn't in the process of rapidly dying, between all the middle management and socialising and meetings with pretty graphs and interoffice politics, there needs to be someone that does some actual work.

This is where the nerd fits in a large corporation. That person is irreplaceable, and the layers around them recognise this (or else the company implodes). They may posture, but if you push, they will jump through hoops for you. Flex your muscles. You have more power than you think.


...which is completely unhelpful if you're not looking to fill a manager position.


> I think you're completely ignoring the reality of frauds.

Or maybe their strategy still catches all of the frauds and it has therefore never been a problem to them?

I have to agree with their take, and just asking a bunch of technical questions -even without any code- is good enough to filter out the obvious incompetents.


>even then it sometimes fails

How can a code interview fail? Hidden earpiece?


People spending inordinate amounts of time memorizing solutions to common problems. This is admittedly partly the fault of HR not ensuring that interviewers have a good pool of problems to choose from and twists to put on things, but it's a constant cat and mouse game with various websites aggregating interview questions from companies.


It's pretty easy to spot when the candidate goes from canned answer to actually having to think. "Thank you, that looks good; now I need it to also do this extra thing" - keep tweaking the question until they have to think.


I’ve had reports of Zoom interviews where an earpiece fell out and my interviewers could audibly hear advice on how to answer the question. Kind of dumb - how hard is mixing the audio?


Had a candidate copy and paste the answer wholesale.


Hard disagree. On my teams, we're going to give you a problem with some slight ambiguities to see how you handle that. We're going to see what kinds of questions you ask and how you respond to feedback. We want to hear you walk through your thought process. The more senior the position, the more important all of this becomes. Getting the "correct" solution is, at best, 50% of the goal with the interview.


This is not the approach that the majority of coding interviews take. In my experience, it has been disinterested interviewers who are blatantly pretending to understand what they are asking. Any attempt to engage in the type of discussion you aim for is met with dead ends, because that's not what the Googled answer contains.

Your approach is a major outlier. You probably have many good candidates turning away, and for good reason, they have no idea that you are different to everyone else. Find a different way to do this, there are several other approaches.


There are very few options outside of coding interviews, and getting fewer by the day.

It's almost like everyone big and small is standardizing on this model. Feels like one of those mandatory courses you took in college: the teacher and all the students knew it was bullshit yet you needed to perform the parroting at some adequate level to pass.

I have 21 years of varied experience in software engineering, yet a recent "technical" phone interview was a kid asking me to "balance a B-tree"; I could tell his expectation was for me to start reciting some CS algorithm BS - probably that's what everyone else is doing. I politely declined, and that was the end of the interviewing process with that company.


Classic mistake of overthinking it and failing to realize what the interviewer really wants - which is to make sure the candidate can actually write code, like at all. The question itself doesn't really matter that much; it's just a pretext. I actually asked a variation of this question for many years at Google, and it was clear within the first 5 mins who had been writing code day-to-day and who had mostly been "bringing key stakeholders into conversations at the appropriate time".


Exactly this. In the interviews I give I care about whether the candidate can write code, yes, but also talk and think about code.

The conversation is the most important part of the interview, and the thinking (and communication) is the most important thing I'm trying to judge after basic skills.

Like you said, you can get a good sense within the first few lines of pseudocode if someone's at least competent at writing code. But that's just one motivation behind coding questions.

It's also very difficult to talk about code, algorithms, and solving problems without a concrete problem and concrete code in front of the candidate and interviewer. So both the question and the code the candidate writes are mainly context for the conversation where I try to see how the candidate thinks.

These kinds of articles make me sad because I (and many other interviewers I've worked with) try to make it clear that this isn't a test - we don't care so much about being "right" or "wrong", and there shouldn't be any tricks or "a ha" moments.

We explain the goals and what we're looking for right up front. And I would hope most interviewers do the same, but I guess not. So there's this persistent myth among both interviewers and candidates that coding questions are about getting a right answer.

That's a shame because coding questions get such a bad rap, but I'm not really aware of better options. Take-home problems and looking at GitHub are unfair to many people. A well-run technical interview should give lots of people a chance to succeed.


You are the exception I think. Most interviewers care about the correct answer. Get it and maybe get the job. Fail and definitely don't get the job.

If the interviewer said at the beginning, "I don't expect you to solve this problem in the 40 minute nor to have an optimal solution. I just want to watch you write some code and hear the problems you foresee and how you'd solve them" then maybe I could relax and do that. But, generally the pressure is on "get this right in 40 minutes or you're rejected"


This is actually why I dislike these "coding interviews are useless" type articles. The issue has as much or more to do with bad interviewers than it does with the fact that it's a coding interview.

When I'm tasked with interviewing candidates and evaluating these basic algorithmic and coding skills, I have a 5-part problem (each part building on the previous, and only revealed when the previous one is complete) that is basically impossible to finish in the time allotted. I tell the candidate ahead of time that it's an ongoing problem that's designed not to be completable in the time: we're going to work through this problem and see how far we get. I've passed candidates who "failed" the actual problem, when the conversation and coding that were shown still gave me a good understanding of their capabilities.


Coding tests are an awful place to test someone’s conversational skills. I don’t talk while I code. You don’t either. Honestly I can’t even remember the last time I talked to anyone about the code itself outside of a PR. People talk about architecture and database migrations and why their containers aren’t behaving locally. Nobody ever tests for that stuff.


> I don’t talk while I code. You don’t either.

That's quite an assumption. You've never heard of pair programming? You've never asked for help on a bug in your code? You've never talked through alternate approaches to a piece of code with a coworker? You've never hashed out an interface or a method signature or some pseudocode while talking through the problem? You've never walked through a calculation with an SME? All of these are "code and talk at the same time" exercises.

If I'm being brutally honest, I have a deep-seated suspicion that everyone who says they can't talk and code at the same time also just cannot code at all. I don't know you, of course, and I'd love to be proven wrong. My sample size is small, but the few people I've met who cannot talk-and-code also simply could not code.


Here's my brutally honest take on pair programming: Usually 1 person wants to do it more than the other, and usually that person is being unnecessarily assertive.

The only scenario I think pair programming is socially acceptable to force on developers is a senior type onboarding a new developer out of necessity - might screen share and direct them around some places to show the ropes.

Of course if you love to hang out with someone else while you write code for some reason - more power to you, have fun. For me it's a private thing, even after 20+ years. If anything the LLM is a much more useful sidekick to figure things out.

> Can't talk and code means can't code at all

I disagree with that, I can't even have lyrics in my music really if I'm working on something super hard especially outside my normal wheelhouse. It would at least be disruptive.

The last time "hanging out and coding" was a thing was learning it for the first time - I used to hang out with friends as a kid and we would all try to figure out what Visual Basic was lol and I remember hanging with a friend learning JavaScript during the early web days, drinking coffee through the night, good times.

These days it would feel forced and can't imagine why anyone would regularly pair program, especially now with LLMs.


Lots of people in tech have never heard of pair programming, because it's an absurd idea. This site isn't just silicon valley. This is a tiny fraction of the tech universe.


Personally, I only ask super easy questions because you should at least be able to talk about something trivial. Yet unfortunately the question “find the second largest number in an array of numbers” has a high failure rate as the first question, because there are a lot of people lying on their resume, just throwing spaghetti at the wall.


That's too easy, man: arr.sort((a, b) => b - a)[1]

One for you: Write a recursive function that finds the second largest number in an array of numbers.
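
For completeness, one possible recursive sketch (assuming the array has at least two numbers; duplicates count, so secondLargest([7, 7, 3]) is 7):

    // Recurse through the array, carrying the largest and second-largest seen so far.
    function secondLargest(nums: number[]): number {
      if (nums.length < 2) throw new Error("need at least two numbers");

      function go(i: number, first: number, second: number): number {
        if (i === nums.length) return second;
        const n = nums[i];
        if (n > first) return go(i + 1, n, first);
        if (n > second) return go(i + 1, first, n);
        return go(i + 1, first, second);
      }

      return go(2, Math.max(nums[0], nums[1]), Math.min(nums[0], nums[1]));
    }

    // secondLargest([3, 7, 7, 1]) === 7; secondLargest([3, 7, 1]) === 3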



If you're anything like me then you think a lot while you code, like inner monologue thinking, and that's what the interviewer is testing for.


This provided me with a fascinating, albeit somewhat familiar, piece of insight: which is that I don't really ever hear my inner monologue. I'm not sure I have one! I'm either typing out my thoughts as I have them or speaking them as I have them.

I struggle in coding interviews precisely because of this: either I end up vocalizing my emotions and insecurities instead of coding, or I end up coding instead of talking about what I'm trying to accomplish. Often I will see many alternate pathways branching out before me, but if I try to start talking about them, I am no longer coding, and so my brain context-switches to "social and emotional."

Probably something I could get better at with practice, but I honestly end up commenting on places like HN simply because it "allows" me to think. If I could have a coding interview in the form of realtime text chat + code, that would be ideal for me.

I guess I have seen companies do things like "contribute to this open source project and work through an MR." I do find that quite appealing as an interview process.


Interestingly, it turns out a large number of people have no inner monologue. [1]

Some studies indicate that it's as low as 30% of people who do (so 70% don't have an inner monologue), while others show the opposite, implying around 75% of people have some amount of inner monologue while 25% do not. It's a difficult subject to test and study since we don't have direct access to people's minds and asking someone what they're thinking about literally forces their thoughts through the filter of language.

[1] https://science.howstuffworks.com/life/inside-the-mind/human...


This is a fairly common condition, along the same lines as aphantasia (lack of inner-picture, rather than inner-voice). I do not believe there is any “cure” for it.


I don’t have a sense of a persistent inner voice either, but I can verbalise my thoughts just fine. It feels to me like the part of my brain that turns thoughts into English sentences automatically goes to sleep when it’s not being used. And that brain power can be used for something else.

When I’m doing particularly hard programming work, I can’t have words said near me. Even music with lyrics messes me up. I think because it wakes up my “thoughts to English” pathway and that gets in the way of my “thoughts to code” pathway.

Anyway, I don’t want to be cured. There’s nothing wrong with my mind. If anything I feel sorry for people who can’t turn off their inner dialogue, because it means they can never use those neurons for other tasks - like maths or programming.

Personally I can talk while programming if an interviewer wants that, but I’ll be a bit dumber at the keyboard than if I sat in silence.


I am the same, except I use music (lyrics and all) to drown everything else out. I don’t consciously notice the lyrics at all, but my brain is processing them, because every now and then I suddenly break out of focus and am like “wait, what the hell did that song just say?” - usually on comedy songs.


And vocalizing this inner dialogue comes naturally to you I suppose.


If you can't explain your thoughts out loud then you're going to have a difficult time working anywhere.


For me, it's not at all "I'm incapable of explaining this to you given a few minutes to collect my thoughts." and very much "You either get Coding Mode, or you get Conversation Mode, but not only do you never get both at the same time, I need time to context switch from one to the other.".

The second paragraph in this comment is very, very, very close to how my brain operates: <https://news.ycombinator.com/item?id=40290732>.

And I've been programming professionally for quite a long while now, so this quirk of mine doesn't seem to have made it difficult for me to work at programming shops.


As I've gotten older I've discovered that I really can't judge normalcy based on what I find natural to me. Every brain is different and abilities range. Some people visualize things in their head; some people have to talk out their thoughts; some people are more optimistic. If we are only grading on one very particular brain type then we are missing out on the opportunities for diverse thought patterns producing something even better than expected.


It's not a test of someone's conversational skills, it's a test of their technical communication. Conversational skills get tested in the small talk and introduction phases at the start and end of the interview.


I consider talking while I code to be a somewhat useful skill. I’m totally capable of verbalizing my thought process as I work through a problem.


What's the quality of output like? Have any links?


> I don’t talk while I code. You don’t either.

Speak for yourself


I, personally, cannot _think_ and _talk_ at the same time. It's just a stream of half-sentences, many of which my brain has already moved on from because what I originally thought won't work.

After writing this article it became very apparent to me that I'm complete garbage at interviews, but I'll outperform and exceed at the actual job function.


In my work, if you literally cannot write any code while also discussing the code, and if you literally cannot express thoughts while also thinking them, then you actually won't exceed at the actual job function, at all. You're not the only programmer on the team. I don't know why people think communication skills are not required for programmers. You won't be coding the correct thing unless you can talk about what you're doing.

And that's all I ever ask in an interview. Ask questions and talk about what you're doing. The worst hires I've ever seen were all the ones who never asked questions and never talked about what they were working on. Half sentences are fine; moving away from the keyboard while we talk is fine; being unable to talk and think at the same time probably is not.


> In my work, if you literally cannot write any code while also discussing the code, and if you literally cannot express thoughts while also thinking them, then you actually won't exceed at the actual job function, at all.

Followed by

> You're not the only programmer on the team.

It sounds like you're implying some connection between the two, whereas most successful teams don't require the behavior your team is demanding. Including the ones with good communication skills.

I can write code well. I can discuss it well. I simply don't need to do both at the same time. Unless people are in a pair programming session, they don't need to openly discuss the code while they're thinking about it and writing it. They can discuss the problem before and after. Why do they need to discuss it while coding?

It's like telling journalists or authors "Hey, if you can't discuss your story with the editor while you are authoring it then you can't succeed here."


>I don't know why people think communication skills are not required for programmers

That so significantly fails to resemble the claims being made that it strains credulity that it could be a good faith interpretation of the conversation.

>You won't be coding the correct thing unless you can talk about what you're doing.

Maybe, but that has no bearing on whether they need to be done at the same time, which they do not in just about any work environment. I guess there's probably somewhere that does mandatory pair programming for everything, but I've certainly never seen it.


The vast majority of engineering design happens async - typically by a single engineer, puzzling/experimenting over possible solutions and then creating a design doc. Discussion then happens synchronously. Solving a complex design problem on the spot is not the norm.

I personally find system design interviews pretty tough - it's not a mode of operation I ever experience on the job. To solve them at Big Tech, you pretty much have to memorize many design patterns and be able to regurgitate them on the spot. Like algo questions, it's testing your ability to work hard to prepare more than anything else.

Not to say this doesn't have value as a filter, it's just not testing the thing you think it is.


If I'm not cut out to work in your environment, that's fine. I do disagree with your other conclusions, however. I'm not bad at communication, I'm bad at verbal communication while simultaneously trying to solve a problem. I'm excellent at problem solving and simultaneously chatting in something like slack, however.


Shit, I can’t take notes on a meeting and also participate—like, at all. Decent odds I’ll reach the end and struggle to give you even the gist of what happened, without reading my own notes. If I’m trying to take notes and someone addresses me I’ll be all kinds of confused about what the context is.

And that’s English and mostly just writing what people are talking about, not thinking up novel things to write.


Adding another comment here, because this is part of the reason why I wrote this article.

> These kind of articles make me sad because I (and many other interviewers I've worked with) try to make it clear that this isn't a test - we don't care so much about being "right" or "wrong", and there shouldn't be any tricks or "a ha" moments.

> We explain the goals and what we're looking for right up front. And I would hope most interviewers do the same, but I guess not. So there's this persistent myth among both interviewers and candidates that coding questions are about getting a right answer.

I understand all of those things. I've written the same before[1]. However, as clear as your instructions are and as well-meaning as you may be, it may not help. I can logically understand every word you say, but as soon as that question rolls out, I will now be dealing with stress hormones and 30 years of learned behaviors from thousands of experiences, whether I choose to or not.

So while I applaud your methodology and wholeheartedly agree, just telling people that doesn't guarantee that it's not still an issue because humans are complex organisms.

[1]: https://darrenkopp.com/posts/2016/02/25/always-learn-somethi...


This is pretty much how it worked at the robotics company I worked at.

We would give them a whiteboard problem, but:

a) it was a simple, stupid problem in C (C++ was our implementation language, so thinking at byte level was an important skill)

b) we were very generous about minor mistakes like missing semicolons, etc.

c) we were very generous about "points for effort"; if they didn't make it through the problem but we saw that they were on the right track we might pass them. Total frauds outed themselves very early; they would produce between jack and squat in terms of actual code (a lot of bloviation though).

But again, most companies aren't that company, or your company. For most screening coding exercises, a correct answer (and even something like optimal algorithm complexity) is a must to pass the candidate.


I would like you to be the interviewer of all my future jobs, please.


The normal way to phrase that is "are you hiring?" :)


Meh. I’ve met dozens like you. People who swear they just want to see “how you think.”

Then, in the post-interview roundup we talk about the candidates and you’re a bit disheartened that they didn’t complete the exercise, so you pass on them even when every other person in the room gives a thumbs up.

Nah. Re-evaluate yourself and your biases.


My favorite question to ask these "see how you think" types when it's my turn is:

What is the most impressive thing you've ever done?

Just watch how they struggle with this one. I've never met anyone slinging code tests, so curious about "how we think", who has ever made anything interesting themselves - whether it's code, design, music, a company, anything. It's just a pretentious statement made by gatekeeping noobs who love to interview, nothing more.

I tried to save your comment lol


You are not alone! This is also how I run coding interviews - I have absolutely passed candidates who did not actually complete the described problem. I always inform my candidates that we want to have a conversation about the problem and its solution. I also deliberately pick questions that are going to need thought and some design, specifically to spark that conversation - talking about reversing a list gets boring pretty fast.

The issues people have with coding interviews are more about the interviewers than the questions, honestly.


I love the typo at the end. I have definitely worked with some stakeholders that I would have loved to stuff in a jar of pickling spices and leave on a shelf for several years to ferment.


Im dislexik


Ha! I thought it was Shakespearean!

When you feed people confabulated information, it acts as an encompassing medium, a reality buffer, effectively making them more inert, less engaged, in actual reality. Brined.


>Classic mistake of overthinking it and failing to realize what interviewer really wants - which is to make sure the candidate can actually write code, like at all.

What about the lad who develops homebrew who got rejected from Google because he wasn't able to invert a binary tree on the spot? Many Googlers use his software internally and externally. If the purpose of the interview is to make sure he can code why did he fail?

https://www.quora.com/Whats-the-logic-behind-Google-rejectin...

He seems to have a great attitude and would fit right in but it's clear Google is optimising to keep people out rather than find great software Devs.
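
For reference, "inverting a binary tree" usually just means mirroring it: swap the left and right children at every node. A minimal Python sketch (mine, not the actual Google prompt):

class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def invert(node):
    # Mirror the tree in place by swapping children at every node.
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node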


You should really read your own link - the part where he admits he made that whole story up. I’m gonna guess failing some basic coding is not what tanked him. He still thinks they owe him the job though because he’s kinda famous and did a popular project in a language that Google doesn’t use, so that’s cute.

I also don’t see how this one anecdote (even if it was true) invalidates anything I said above. You’re gonna have false negatives in any system you choose unless you just wave everyone in.


>You should really read your own link - the part where he admits he made that whole story up.

Where does it say that?


Why does "Many Googlers use his software internally and externally" mean he would be a good hire for Google?

From all his public complaining about failing an interview, it seems Google did the right thing not hiring him. He has a massive ego, and it's very possible that "writing homebrew" is less useful to Google than "inverting a binary tree".


It means that he created a tool, with his skills and capabilities, that is a force multiplier for other Google engineers. This is a straight up undeniable example that his capabilities _already_ brought value to Google and their stacked deck of genius non-egotistical binary tree inverters.

There's not a more pragmatic measure of whether somebody can code than a track record of a successful code project used by other coders.


Here's another way of phrasing it -- if Linus Torvalds went for an interview with Google would he have to invert a binary tree and if he failed to do so (maybe he misconstrues the question and messes up or him being Linus and just refusing) would that be a good reason to reject him? Linus also can be equally or far more egotistical than Max Howell.


I find the idea that just because someone is an excellent software engineer they are therefore guaranteed to be a good fit for a particular role at Google a bit weird

I'd say that if Linus applied to be a software engineer at Google they should be prepared to invert binary trees or do $generic_leetcode type things because that's the expectation for that role

If they applied to be Google Fellow or some other lofty position then I wouldn't expect them to need to do any coding at all in the interview


>If they applied to be Google Fellow or some other lofty position then I wouldn't expect them to need to do any coding at all in the interview

So the higher the role in Google the less the requirements?


Not even Max Howell thinks Max Howell has a great attitude. He's often a dick in his own words. Maybe Google would have found a different job for him if he wasn't.

He said 90% of Google engineers used Homebrew. Google engineers said it wasn't true.[1]

He said Homebrew using OS libraries saved a lot of pain. He presented it as an example of why Google should have hired him. Actually it caused enough pain that Homebrew stopped doing it.

[1] https://news.ycombinator.com/item?id=23844936


My go-to first screening question, regardless of where I work, is to present some JSON from a public API relevant to the domain (nothing too crazy, maybe 2 or 3 levels of nesting max), then ask the candidate to do a filter plus a sum or max - then for bonus, analysis or refactoring of their choice.
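
Concretely, the exercise looks something like this sketch (the payload shape and field names here are made up, since the real ones depend on the company's domain):

import json

# Hypothetical API response: a couple of levels of nesting, as described above.
payload = json.loads("""
{
  "orders": [
    {"id": 1, "status": "shipped",  "items": [{"sku": "a", "price": 10},
                                              {"sku": "b", "price": 3}]},
    {"id": 2, "status": "canceled", "items": [{"sku": "c", "price": 4}]},
    {"id": 3, "status": "shipped",  "items": [{"sku": "d", "price": 8}]}
  ]
}
""")

# Filter (shipped orders only), then aggregate (sum of item prices).
shipped = [o for o in payload["orders"] if o["status"] == "shipped"]
total = sum(item["price"] for o in shipped for item in o["items"])
print(total)  # 21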

Way too often, this results in a painful slog, lots of hints from me, and, nowadays, snippets of AI generated code copied in that actually solve part of the problem but that they don't read and therefore end up breaking.


If the interviewer wants to know if someone can write code at all, then why does he/she expect much more than that in an interview?

I believe there is no single answer to what 'the interviewer' wants.

Especially since, many times, they do not know themselves; they just try to mimic what is expected from them or what they have seen in some random situation they came across. Sometimes, of course.

---

"brining key stakeholders into conversations at appropriate time"

This is some very good quality euphemism here, two thumbs up! : ))


The problem is that the candidate has been assured that they will be asked 'leet' code questions where solving the problem isn't enough, they will also be asked about O notation and how the code can be optimized and whether to use memoization or recursion. This is what the books will tell you, this is what YouTube will tell you, this is what 'helpful' recruiters will tell you.

And IME this is what most interviewers have been taught. They've got a list of sample questions, they've been told that if they give the knapsack problem and the interviewee doesn't immediately call out 'dynamic programming' then the interviewee is a pass.
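
For reference, the canned answer that script expects looks roughly like the memoized 0/1 knapsack below - a sketch of the textbook pattern, not anyone's actual interview solution:

from functools import lru_cache

def knapsack(weights, values, capacity):
    # Classic 0/1 knapsack: top-down recursion with memoization ("dynamic programming").
    @lru_cache(maxsize=None)
    def best(i, remaining):
        if i == len(weights) or remaining == 0:
            return 0
        skip = best(i + 1, remaining)
        if weights[i] > remaining:
            return skip
        take = values[i] + best(i + 1, remaining - weights[i])
        return max(skip, take)
    return best(0, capacity)

print(knapsack((2, 3, 4), (3, 4, 5), 5))  # 7: take the first two items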

If you only want to see working code then you are the exception rather than the rule.


I will ask all of those questions. But I don't expect perfect answers. You should at least know what big O is. I would really like it if you can tell an O(n^2) algorithm from a linear one. (That is often really important in real-world code). I would like you to consider different ways you can optimize the code.
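
For example, the quadratic-vs-linear distinction I mean is roughly the one below (a toy "does this list contain a duplicate?" example of my own, not a question I claim to ask verbatim):

def has_duplicate_quadratic(items):
    # O(n^2): compare every pair.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

def has_duplicate_linear(items):
    # O(n) expected: remember what we've already seen in a set.
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False

On a list with a few million items the first version is the kind of thing that quietly takes down a service; the second is fine.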

I don't expect you to quickly crank out a novel optimal algorithm. But I like to see that you can think about how to solve problems programmatically. I would like to see that you can identify differences in different algorithms and what tradeoffs there are. Considering different approaches that may be faster also shows that you didn't just memorize an algorithm from somewhere but that you took time to actually understand the problem and build a mental model in your head.

I have given people great reviews when they couldn't come up with a working algorithm because they clearly knew how to think like a programmer and considered all of the important things.


As an electrical engineer who has basically always worked as an embedded programmer, I don't know what big O is. But, I'm probably not applying for your open positions.


It definitely depends on the domain. In embedded programming you can often know that your n is small for your algorithms. If you are making a desktop app or web service it is very important to prepare for people using your service in unexpected ways and a super linear algorithm can result in terrible performance or downtime if not managed correctly.


That's a pretty far cry from

> which is to make sure the candidate can actually write code, like at all

It's also still a terrible approach and only gets leet code crammers.


are you suggesting that only leet code crammers can tell the difference between a quadratic and a linear algorithm?


You're all not getting it.

They only want to see somebody who can get working code and a glimpse of their thought process. But from 100s of mediocre examples, the better coder will have a "better thought process."

Same goes for dating. Of course people will swear up and down they "only consider personality." Turns out, they've met 10 other people with a better personality than you.

Just because they're "only looking for x" doesn't mean they'll accept anybody that clears the bar.

The ultimate read-between-the-lines, though, is that "oh I'm only looking for xyz, nothing superhuman", in a process where you have 10,000 competing applicants, will still require high performance on your part. It's just a nicety, a meaningless phrase.


Yes. I've filtered out countless candidates who can't code like this. It's amazing how people will fire off applications to jobs they can't do. You might doubt this because you wouldn't do it, but just wait until you get to sift through a fresh wave of applicants.

Has everyone I hired got "perfect marks" on the test? Of course not! It's not about that. It's about seeing how they react to a problem, watching them break it down, ask questions, and, ultimately, get on with it. If the job is to sweep floors you need to be able to hold a broom. It's as simple as that.


Yes, unless you’re in a niche area or only hiring through a network, you’re going to sift through mountains of unqualified candidates because those are mostly the ones on the market (see Market for Lemons[0])

[0] - https://en.m.wikipedia.org/wiki/The_Market_for_Lemons


I disagree, I think it's a failure of recruiting if you got someone into an interview who can't code "at all". That must mean nobody at the company is technical "at all" either then, if you are getting people that far along without having any clue if they even write code at all.

I don't have this problem - I can easily tell who is good and who is not good by looking at their stuff online, which repos they are contributing to and what their contribution is. I look at personal projects - I can easily tell what parts they wrote vs didn't because it's usually specific to the project.

I can tell from their blog posts and comments, especially GitHub comments - I can even see if they're pushing features at 11PM on a Friday if the obsession piece is crucial to the hire.

People who say all their work is hidden under NDA or they're new grads and haven't done anything yet - sorry, if there's nothing to view online I just wouldn't qualify you to interview.

Though I have given out 50+ code tests in my career because I had to, I would never choose to do this; given the chance when hiring someone, I would never give them a code test. I think it's an amateur move that wastes everyone's time. At its best (as in the case of CTCI interviews) it's an exclusivity filter for academics who memorized the optimal data structures for algorithms as taught in school, but the candidate might not have any of the skills needed to build app features, perform DevOps, etc. or even operate a terminal - CTCI doesn't cover anything async, nothing about UIs, APIs, databases, services, git, design, file formats, etc.; it's purely academic sport. And like I said, a good developer's work should be highly visible anyway - skip the random code test.

I would spend the recruiting effort finding specific developers using specific technologies that align with the role and making them excited about the opportunity rather than canvassing 1000 code tests out to anyone who applies.


> That must mean nobody at the company is technical "at all" either then, if you are getting people that far along without having any clue if they even write code at all.

Lots of people skate by in technical roles by barely doing anything technical. Lots of people also overinflate their achievements in resumes and conversations, aka lying. How is a non-technical recruiter supposed to evaluate their coding chops?

> People who say all their work is hidden under NDA or they're new grads and haven't done anything yet - sorry, if there's nothing to view online I just wouldn't qualify you to interview.

Lol, great method! I have a better one - just hire ACM winners. No need to test them.


> People lie

Obviously taken into account - I still have no problem whatsoever identifying great developers on GitHub and I'm sure many other developers who actually code often could too. You have bigger problems if you can't tell if someone is lying or not about their abilities when all their work is visible to you. You should be able to easily tell what is theirs vs not.

> Lol, great method!

Yes people inflate their own egos and abilities - especially those who spend all their time interviewing others instead of building.

I prefer demonstrable experience for an engineer over standardized tests which tell you nothing about any real world experience: App architecture, async programming, APIs, UI, DOM, git, unit/e2e tests, any known framework or library, etc. A person who knows all those but hasn't memorized CTCI is a lot more useful than the CTCI memorizer with no evidence of work ever performed.


> when all their work is visible to you

Doesn't really happen reliably in the real world.

> You have bigger problems if you can't tell if someone is lying or not ...

Whether or not I have bigger problems is independent from whether I need to recruit more developers.


It's like a designer with no portfolio. A lot of them do exist actually, not saying they don't, but I would never hire one.


> While you're spending all your time and the company's money conducting maniacal quizzes and tests your competition is building features and teams and making progress

If your research on your candidates' profiles is of the same quality as you demonstrated nosing around my profile, I don't think any of your competition should be worried at all ;-)


No way to know for sure, but I'd guess the number of employed developers writing blogs and having personal projects is in the single digits. For developers with more than 20 years experience, I'd be surprised if it's even 1%. I've got more experience than that, which means I have an extensive network of former colleagues at a similar experience level and I don't know one single person who blogs about tech or works on personal projects. We all have busy family lives and plenty of money so absolutely no need to spend any personal time doing work related stuff.


Not to mention github profile eval would be much easier to game in the age of LLMs if it were to become mainstream. You can already kinda see the effects of these with people sending you mindless PRs fixing grammar and whatnot if you maintain any popular projects.


I find a simple realignment would solve a lot of this interview angst: Interviewers seem to interview on the premise that they need to find out what the candidate can't do. But this is not useful. I can tell you what they can't do. To a first approximation, the answer is "everything". I am an experienced and skillful developer, yadda yadda yadda, and I've never touched React, have no game development experience, have never written kernel code, haven't touched matrices since school, have only dabbled in embedded development as a hobbyist, etc. etc. etc. Make a list of all the things I can't do and I look like an idiot. The list of possible things is too long.

The goal of an interview is to find out what the candidate can do.

If you interview someone, and they "fail all the questions we asked", it is not the candidate who has failed... the interviewer has failed. You ran an entire interview and all you know is what the candidate can't do. You have learned virtually nothing. The questions must be adjusted to what the candidate can do. Only for the absolute worst candidates should you ever exit an interview saying they failed at everything. (Sadly, such candidates do exist. For instance, consider the recent stories about AI fraud. If your level of programming skill is "I have ChatGPT", I'm going to have a hard time scoping a question down to you.)

If I had asked this question in an interview, or a related one, I would have stepped the problem down until the candidate could pick it up. (Odds are I'd have started lower and worked my way up anyhow.) If the candidate then sort of finds their footing and once going can start folding in the other requirements as we go, great. Who sits down and designs a system for all twelve adjectives ("reliable", "concurrent", "redundant", "runs on a Z80") we want anyhow? One can not help but design a system one adjective at a time, with the others only kept in mind to make sure we don't back into a corner. There's no reason not to interview that way too. (And I tell the candidate this is what we're going to do, so hopefully they don't feel like I'm out to get them or experience requirements whiplash without knowing why I'm doing this.)


Yes this, and to further this, the "can't do" thing that you measured is in fact extremely partial and incomplete: you've measured that the developer can't write an algorithm on a whiteboard with a felt tip marker in front of a stranger they just met. You don't even know if they can't write the algorithm. Just that they can't do it there.

Is that important to you? Maybe it is for some people. As a person who has been in team lead and hiring capacity before, it is not for me.

I trained for interviewing at Google twice, but never really chose to give interviews despite that being a highly pressured part of the job, because I could not philosophically vibe with that process. But what some people who have adopted this process are missing is that Google etc. does this only because they are swamped with resumes and have their pick of bazillions of quality engineers. They don't care about false negatives, only about false positives.

Startups and smaller companies should care about false negatives. It's hard to find and retain good people. Smaller companies need to aggressively find and cultivate good people to make good teams in order to be/stay competitive -- and that means accepting a diversity of ways of working.


I'm glad whiteboard interviews are dead now that everyone does "virtual" on-site interviews. When I interviewed at Google, I requested a laptop because writing on a whiteboard all day hurts my hand. Every interviewer complained about me using a laptop.


What, exactly, was the substance of their complaint(s), if any? Was it just that you made them do their job in a way different from how they are used to?


This was years ago (pre-covid) but what I remember was a mix of:

  - "Oh, you're using a laptop"
  - "Do you really need to use that?"
  - "Just write it on the whiteboard"
  - "It would be easier to see on the whiteboard"
They also didn't give me a mouse, so using the trackpad was slow.

What I hate most about whiteboards is how you can't easily insert lines so you have to leave lots of space. I tend to write code like an onion rather than top down.


Here's my take on it: the interview is testing your ability to "show" that you absorbed the content and process of your 4th year data-structures and algorithms classes. The whiteboard process is all about showing them that you can regurgitate what those courses taught you, likely in the manner that the prof taught you. You're essentially playing "professor" for them in front of a "class", for a short period of time. Not showing them that you can write code. The code you write on the whiteboard will look nothing like anything you'll ever write at Google.

I think one way to do well in that interview, is to pretend you're the professor and they're a student, and you're working through course material.

In the end, it's just showing them that you passed the socio-cultural hurdles they think are necessary, even if they no longer explicitly check GPAs and which school you came from at interview time.

Why?

Google is more often than not the first job these employees have after school, and many stay there almost forever after. A large percentage are masters or PhDs.

Google is founded by two Stanford grads, both children of academics, who never worked in the industry outside of Google.

Academia is hence the reference point against which Google measures things.

Google is structured in many ways just like a university. Publish or perish (Design docs, PRDs). Thesis committees (perf "calibration" committee, interview committees, etc) and review (intense code review). Even down to the physical "campus" structure. On site cafeterias, even housing/dorms (GSuites, etc)

It's something that was very foreign to me having worked in the industry for a decade before, without a degree.


> I think one way to do well in that interview, is to pretend you're the professor and they're a student, and you're working through course material.

If that's what they want, they should give out the questions in advance, and have the candidate prepare a slide deck. No professor conducts class by having students show up and fire random questions at them all semester.

More explicitly, if that's what they want, they should say that.


> no professor ...

That's one of the higher leverage activities for an instructor. Anyone can go read a book, watch a video, or do some practice problems, and lecturing at a bunch of students isn't a great way to convey information, either in absolute terms or compared to those alternatives. The instructor's job is to tell you which books to read, why the material matters, and contextualize the course against other things you care about. The particular road they tell you to travel depends on where you currently are, and a 2-way conversation is critical to getting good results.


I know all that. But, have you perhaps been in an undergrad classroom recently? It's still predominantly lecture-based.


In addition to what others have said, that's an accessibility concern. There are many people who have a physical inability to handwrite.

https://en.wikipedia.org/wiki/Dysgraphia

In some ways I'm grateful for Covid's impact on the job market because I now have many alternatives to physically drawing on a whiteboard. I can use TikZ as fast as my professors can draw, and I can put that on a monitor in a conference room. I've even used it on exams before.

I would probably struggle for reasons unrelated to my actual competence as a programmer if I was forced to use a physical whiteboard during an interview. Your comment made me appreciate that I'm graduating into a different world than 4 years ago.


When I interviewed at Google I was making iOS apps using Objective-C at my current job, so I decided to stick with that for the whiteboard so I wouldn't blank out on some syntax at an inopportune time.

Big mistake. The method names get so long that I kept running out of space on the whiteboard and it took forever to write it.

I don't remember exactly what I wrote anymore (it was over a decade ago), but this site[1] has some examples of really long property and method names in Cocoa (its framework), like:

splitViewControllerPreferredInterfaceOrientationForPresentation

or

initWithBitmapDataPlanes:pixelsWide:pixelsHigh:bitsPerSample:samplesPerPixel:hasAlpha:isPlanar:colorSpaceName:bitmapFormat:bytesPerRow:bitsPerPixel:

Anyway, for that and other reasons I didn't get the job (I apparently didn't prepare quite the right things, despite preparing for a few hours every night for the two weeks prior, which is how long the recruiter gave me before flying me to Mountain View, so my performance in at least two of the six interviews I had that day was a bit weak).

[1]: https://github.com/Quotation/LongestCocoa


The "point" of the whiteboard interview is to see how you think and converse and interact around your coding process, not (necessarily) how accurate/good you are at writing the code. You could write pseudocode that can't compile, and as long as you can explain the algorithm and talk about its complexity you could "pass" (in theory)

Screening interviews do take place in a shared doc or other editor, I believe.


I've never had any company let me write pseudocode. I once tried to write "vec" instead of "std::vector" and the interviewer told me to write valid code on the whiteboard. (And I did explain that it was to avoid writing "std::vector" repeatedly.)


I’ve managed to mostly avoid these kinds of interviews—is this for real?

Is this what lots of the folks mean when they claim tons of candidates “can’t even code?” Syntax or some names aren’t right for some particular language?

Shit, I forget the syntax for “for” loops or what the right way to get an array’s length is or what this language’s way of declaring a constructor looks like in languages I’ve written several hundred lines of in the last few days, routinely, while under no real pressure at all. I crib off my own surrounding code for syntax hints constantly.

I’ve been doing this north of 20 years, and tend to end up as the guy to go to for tricky problems wherever I work, but I would fail 100% of whiteboard code-writing tests that gave any fucks about it being remotely correct in any particular language. I think the last time I could maybe have done it was when I knew only one language, and it was Perl, and I mostly wrote it in notepad—so, like, the first year or so after I started writing code.


template<typename T> using vec = std::vector<T>;

or even, for a concrete type T:

using V = std::vector<T>;

Solves the problem. If someone denies you this, you don’t want to work there :)


I'm sorry, I don't see how that relates to what I was asking. To be clear, I agree, I just don't understand why you chose to write this in reply to what I wrote.


The complaint of the interviewers would (likely) be that they want to see the candidate go through the whiteboard process, because of the reasons I gave. It's just part of the process. Writing in an editor takes away their ability to probe that process.

Don't get me wrong, I think it's silly, but...


Oh, I see. There's no literal reason why that discussion can't take place over a laptop screen, either. So, I suppose it more or less does boil down to "you are making us do our job differently from how we are used to, and we don't like it." That is supremely silly.


I just (barely) failed an automated interview by some company who posted on the last Hiring thread. Their approach was interesting but they definitely measure the wrong thing IMO. They were looking for senior Python devs yet assumed that every senior Python dev knew X, Y or Z Python API/feature by heart. I failed a single question, precisely one where I wasn't familiar with the particulars and couldn't "figure it out" live due to the nature of the platform.

I've worked on a Python interpreter (not CPython) for many years now, I just never used that particular module (because we do it differently in our implementation). I suppose businesses try their best, but aren't always measuring what they think they are...


Related to not knowing everything, there's also this mal-assumption that if the candidate doesn't know something simple (and obvious to the interviewer) that they will never be able to figure it out on their own.


> Who sits down and designs a system for all twelve adjectives ("reliable", "concurrent", "redundant", "runs on a Z80") we want anyhow?

Wait, is Z80 seriously something that you want/design for?


Sounds more like a case of "arson, murder, and jaywalking," applied for its rhetorical effect, to me. :-)

https://tvtropes.org/pmwiki/pmwiki.php/Main/ArsonMurderAndJa...


My approach to interviews over the past decade has been as follows:

1. My preparation for an interview involves researching the company, not technical matters. I don't brush up on coding interview questions. I've never done leetcode.

2. If I find the interview questions to be ridiculously off-topic (such as silly algorithm questions), I end the interview. You're not the kind of company I want to work with.

3. If I find the questions to be valid, but I can't answer them, then I'm not the right candidate for the job (hopefully I'd already have found this out during the research phase, but we all make mistakes).

4. If we can get past all this gatekeeping to the actually important topic of what BUSINESS issues they're trying to solve, and how I can fit into this process, then we've got a real interview and I'm interested.

So far I haven't been out of work more than a few months.


I have done a few of these types of interviews. It's incredibly awkward. At one place, I was even issued a laptop, a badge and added to the Github organization. I mean, I guess? But it felt like I was starting work, not doing an interview. People there also didn't know if I was a new employee or who I was when I went to lunch or got snacks. I spent more time getting my local dev up and running than actually doing the task I was assigned, and they could have gathered the same information about me in a 30 minute interview.

If you can't determine from a few interviews whether you want to hire me or not, it's a no from me. It's a HUGE red flag if an employer can't seem to make up their mind, so they have to resort to awkward "test drives" like this. If you have a job at the time already, it also makes it almost impossible to pull off without using PTO.

Add all these things together and you really don't want to work for a company who is very inexperienced in hiring such that they resort to tests like these. It shows they don't know how to interview, are not efficient with your time (and probably their time also), and a high chance they are non-technical.


The past decade was the golden age for software developers, so we had lots of options. I wonder whether we will still have so much freedom as the market increasingly becomes a buyer's market.


I haven't been interviewing full-time, but these past couple of years I have had barely any success. Currently getting insta-refused even for jobs with lower skill requirements. I've heard similar from friends and colleagues.

There's a bootstrap issue as well. A developer with a brand name on their resume is much more desirable than one without, even for equal skill and experience.


Please don't post such statements without location. This is completely irrelevant without knowing the country at least.


It’s the right and honest way to go about it: good fit from perspectives of both sides … that’s the original purpose of interviews.


How do you end the interview? To me it seems like that might be awkward.


I'm not op, but I have had to do this a few times because of the same reasons. You pause, take a breath and kindly say "thank you for the opportunity, but at this time I don't think this is the right fit" and leave it at that. No need to embellish, or add extraneous detail or think you're being awkward because they will do the same thing if they don't want to go further in the process with you. It's just business, treat it as such.


If I’m considering ending the interview, I’ll instead critique it on the spot. It’s more interesting for everyone. If they pass me over, that’s fine - I’m already considering walking out anyway.

“Hey, can I stop you there for a minute? This interview style isn’t really working for me to the point that I’m considering cutting the interview short and heading out. Here’s why …” - and then have that conversation. Some people will take that badly - and that’s fine. But I have no idea what will happen next after saying something like that. And that makes it an interesting direction to take it.


Not OP, but I do this too.

I generally ask the purpose of the question.

Sometimes there are valid reasons for why they look at very strong algorithm skills and then I simply admit it's not really my cup of tea nor my passion.

Sometimes they answer with variations of "it's standard/it's how we do it", so then I propose that we code something more similar to what the daily job would be, which they generally play along with, even happily.

But if some don't I then say that I don't see a fit.

Might be awkward but... who cares? I don't, and they won't either, five seconds after the interview's over.


We should be a union of people who won't take this kind of bs. I have the same exact procedure as I think interviews like that show a complete lack of understanding of what makes a developer great in a company, and let me tell you, solving CodeSignal/Leetcode unpaid for hours isn't that.


The coding interview looks different when you view it for what it would be called in other industries: a licensure examination. It looks particularly insane to relicense for every single job you apply to. It also looks supremely unfair to have proctors for this exam with varying expectations and training to actually correctly administer it.


- licensing ensures only a minimum level of quality

- people with licenses still do interviews, often just as grueling

- licensed careers with high performers (lawyers, doctors, ib, etc) have other forms of filtering which are much more painful, like years of low pay internships

Studying for a few weeks to solve fun puzzles to make 400k sounds like a deal to me.


What kind of job pays 400k? Even when I worked as a C++ guru at BMW and was hired as very senior, my before-tax compensation was 40k (a year).


Jobs at software companies in the US. Other industries don't pay very much for technical talent.

$40k is extremely low for a senior role. Even in places like India or China.


> Jobs at software companies in the US.

A few very rare jobs at software companies in the US.

The BLS median for 2024 is way lower, at $110k. (Yet at the same time, still much higher than parent-poster's $40k figure.)


I assume the BLS median accounts for all software jobs. That would include software jobs at GM (the US equivalent of BMW), banks, insurance companies, and the rest of the economy. Does it control for industry?


Going back to the phrase "software companies", the closest category-breakdown looks to be "Software Publishers", where the mean average [0] rises a bit to $149k. (I'm going to assume that the relative difference in means is similar to the difference in not-shown medians.)

You can see higher values, but I think they look suspiciously like generic labels slapped over large something-opolies such as Google ("Web Search Portals, Libraries, Archives, and Other Information Services", mean $215k) or Uber and Lyft ("Taxi and Limousine Service", mean $187k.)

[0] https://www.bls.gov/oes/current/oes151252.htm#ind


You can make double or triple that in total compensation at the staff or principal level in the Bay Area.

I'm not saying these jobs are easy, or easy to get, but yes, they exist, even in this market.

The reality is the biggest drop-off in recruiting is at the very top of the funnel: your basic phone screen and first technical screen, before you get to a full panel. Once there, you have way more qualified applicants than roles. I'm not sure it really matters which person gets the job, and this is one way you can make a fairly arbitrary decision. It's nice because it's within your control to grind some leetcode lol. From the company's perspective it's fair and has limited negative selection risk, and doesn't lead to as many regrettable hires.

If you've got 100 great candidates and 1 role, does it really matter which great candidate gets the job, so long as the selection process is relatively fair and uniform? From the company's perspective, probably not.

Since there's no feasible way to select the local maximum, the next-best sorting criterion is consistency.


> You can make double or triple that in total compensation at the staff or principal level in the Bay Area.

Sure, it is possible, but that is a small number of roles at a small number of companies, and the vetting process will be more intense. I think the coding interviews we're talking about in this thread are going to be at the junior to mid-level for the most part.


You can make millions as a famous actor in Hollywood. So what? That doesn’t describe the average actor’s situation.


I think it's a lot easier to get a staff role than being a Hollywood actor. Staff+ comprises around 7-10% of a company's engineering team.

A quick Google leads me to believe there's usually about 15-20 A-list actors at any given time and 130,000 members of SAG-AFTRA.

I think it's certainly achievable if you put in the time and effort (and of course if you're good at your job) -- the question is whether you want to or not. It can be incredibly demanding and may not be intrinsically rewarding to you. Folks generally believe "senior" is an achievable level -- and they should; it's usually the first "terminal" level at which it's ok to remain for the rest of your career -- so they shouldn't view staff as intrinsically out of reach either.


Perhaps they meant to say "studying for a few weeks to solve fun puzzles, after getting a CS degree from Stanford, moving to a city where a basic apartment costs $1 million, and getting 8-10 years FAANG experience, also like half that $400k is stock not cash"

But no licensing!


None of these things are true of me actually. A skill-based interview is an equalizer that overcomes educational pedigree. And fine, divide the salary by 2 and it’s still better than most “licensed” careers.


Not to mention those of us who are self-taught and don't do as well in formal learning environments. One can be higher functioning, but not learn the same way as others. It's harder to work around at times, and I do have to use various coping mechanisms in practice (like setting lots of alarms for meetings through the day). I don't always do as well on literal leetcode pre-screenings online, though. I'd rather do a take-home type assignment (actually make a solution for X) or a person-to-person screening, where I can ask followup questions to better understand what I'm solving for in context.


Stock in public companies is effectively equivalent to cash, so what’s the difference? Sell on vest and buy index funds or anything you want.


That's assuming the stock price doesn't go down in the time it takes to hit your first (or any meaningful) vesting cliff.


To be fair I think most top tier companies vest quarterly so...


They do, but they don't do refreshers quarterly.


You definitely don't need 8-10 years or a degree from a particularly prestigious university to achieve this.

Also you can get this pay outside the bay area, still in high cost of living areas, but for that pay it's really not an issue.


Low level (L4) at big tech can clear $400k if the stock appreciates, otherwise L5 (“senior”, but really just mid level) will easily be there. Higher level ICs can make 7 figures.


Jobs in the Bay Area at most tech companies, with 5-10 years of experience at mid level, give you this. With ZIRP gone this may no longer be true, but it was the case 2 years ago.


How are people still so misinformed about Tech salaries?

Are you in the EU? Salaries are lower across the board there for engineers.

In the US for Silicon Valley Type large Tech Companies the starting salary for New Graduate SWE's is 150k USD on the low end.

For a senior engineer at FAANG it can easily get above 500k USD.


Keep in mind that 150k USD can also be a terminal salary at 30 years of experience in most other places in the US.

Most jobs are not FAANG or FAANG adjacent.


Yes of course this is true but I was responding to this common idea that high salaries, say over 300k, don't actually exist in the US when they do and are very common for Engineers at large SV tech companies.


I am in Portugal yes.


[flagged]


Americans emigrate to Portugal, you know; don't feel so sorry.


Retirees usually


> C++ guru at BMW

This sounds interesting--what did you work on?


I think the author is talking about comp data from levels.fyi https://www.levels.fyi/t/software-engineer?countryId=254&cou...


Maybe for the devs, but "solves fun puzzles" is not a measure of quality either.

The number of people who "aced the interview" but are borderline worthless for real product development is not trivial in my experience.


I still haven’t met the guy who is an algorithms genius who can’t program, but I can guarantee a license isn’t going to solve that problem.


What you want is someone who can break down a problem to the point where they can use the scientific method to come up with an algorithm on their own.

What you get is someone with canned responses in their head. And resultant mass layoffs. We're trying to automate picking just the right puppy instead of going out and playing with a few of them.


"Programming" is only half the battle. E.g. doesn't overcomplicate the codebase with weird one-off solutions, can actually work on features, some of which (God forbid) involve a UI or existing code.

In 20 years, most leetcode style genius types I've worked with really aren't great at this type of SaaS work.


Programming is a very small part of the battle of being an effective software engineer.

It leaves out:

- communicating

- teaching

- dealing with ambiguity

- navigating politics

- working cross-functionally

Most high level individual contributors at large tech companies don't even code.


i have experienced a lot of people who have convinced themselves they are adding value despite not coding, yes


System design isn't coding and reviewing all designs across 40+ people and leading cross team tech initiatives is a full time job.


What makes you say they don't even code? In my experience coding is indeed drastically reduced, but high level ICs still regularly code.


Looking at their commit history.


An addendum to "communicating" is documentation. That said, plenty of those who do document/communicate well can still suck at teaching/training at a higher level.


But being able to program is table stakes to be an effective developer. There better be a lot more to the interview if you are hoping to find rounded developers with proven experience at the level you are hiring for.


For engineers at the mid to low level, where leetcode dominates, the traits to select for are the ability to program and being reasonable to work with.


At entry level for sure, but not everyone is going to be equal after 5-10 years of experience. They'll all be able to code of course, but some will have gained valuable experience and moved on in capability and some won't, which of course is why you interview to be selective - to find those who can do more than just regurgitate LeetCode solutions.


> They'll all be able to code of course...

I promise you that's not the case, lol.


It's not about being able to program. It's much more about product sense and system design.


It's just that in countries other than the US there are also these interviews, and you make far from 400k :)


Even in the US, "$400k" is almost certainly an outlier and not the typical case. Ever notice how people making these claims never provide data about actual comp distributions? I've been an engineer for almost 10 years and never had an offer come close to $400k per year.


Everyone on HN knows someone whose brother's uncle's girlfriend's nephew's former roommate works at Facebook and made $400K as a developer. And they'll point to that one person and say "See, it is possible to make $400K in tech." Yes, it is technically possible, just like it's possible to do plenty of difficult things. That doesn't make these salaries common. For every 1 person making $400K-800K at FAANG in the Bay Area, how many dozens are working outside the Bay Area and/or for some no-name company making $120K?


I dunno man. A good third of my class seems to be working at FAANG and we're all from an unremarkable university in Eastern Europe. I'd imagine the prospects are much better if you start in the States already.


The official estimate of software developer positions in the United States was 1,656,880 in 2023.[1] 10% of that would be a high estimate of FAANG software developers in all countries. And most in those companies make less.

[1] https://www.bls.gov/oes/current/oes151252.htm


A high estimate? Google alone stands at 180k employees. I'd imagine at least a half of those are technical positions.


"Technical positions" covers more than software engineers. 60,000 software engineers was the most I saw. And most other technical positions pay less.


It probably depends. I know at a very senior level in a lot of companies, you can hit a base salary around half that for Staff/Principal positions and some Architect roles, especially in higher paying areas. The value of stock offerings and other compensation, bonuses etc will vary a lot though. You may grind out 5-10 years to reach that level, then 5 years to fully vest at half that pay to see a $1M stock payout at 5 years bringing the average to $400k. Who knows. Different arrangements work differently.

As an aside, when the market is relatively good, don't be afraid to straight up ask where compensation is on given roles when recruiters reach out, or to ask for more than you think you might get. You're pretty unlikely to get $400k or anywhere near it as a base... but you'd be surprised how reachable say $150k+ is as a base salary for a remote position when you aren't in SF or another major/expensive city.


It's the typical case for Bay Area-HQ tech companies, at essentially all levels for Tier-1 companies, higher levels at Tier-2/3 companies, and specialist roles beyond that.

"software engineering" doesn't print money any more than being "in finance" does, you'll make more or less depending on what company you do it for.


> It's the typical case for Bay Area-HQ tech companies, at essentially all levels for Tier-1 companies, higher levels at Tier-2/3 companies

What are the tiers? It's close to typical for a Bay Area Google L5 according to levels.fyi.


tiers are about the quality of the company, not the internal leveling.

Google would be a tier1-2, Uber/Lyft/Netflix would be tier 1, Adobe would be tier 2-3, etc.


The point was companies usually considered tier 1 like Google did not have such compensation.

What makes Lyft higher quality than Google?


Do you work in California for a company in this list? Meta Amazon Google Netflix Airbnb Stripe Square Microsoft

If not it's unsurprising. But these actually are typical offers at those companies including Base Salary + Stock + Bonus.


in the sfba, levels.fyi shows it roughly at 85%-ile, not everyone - but hardly extreme outlier


I wonder what it is after accounting for self reporting bias.


The numbers are real. I was in a similar incredulous position 4-5 years ago. Hiring is still relatively weak, but you can always try to get an offer yourself. Or even just talk to and ask recruiters; they have no reason to lie. You don't have to take your cousin's girlfriend's uncle's word for it.


probably significantly lower, but still not all that rare. i make roughly that much as do most people i know in sf


A giant proportion of developers have no idea wtf “levels.fyi” is and are quietly writing Java or some shit in some suburban office park in a city you wouldn’t ever bother to visit, for $80k-160k/yr.


Hearing from people who did internships in other fields for almost nothing, while CS students make more than most workers during theirs, really drove that point home for me.


> - licensing ensures only a minimum level of quality… people with licenses still do interviews, often just as grueling

Agreed, I never asserted against these two points. The point of licensure is to make the first round, which is fairly routine at this point, more equitable and less susceptible to probabilistic effects. It also frees labor from administering this exam round to every candidate.


How often do other industries relicense? How often would you propose relicensing for programming jobs?

My concern would be skill atrophy between relicensing and then you'd still want to do the same sorts of interviews.


Other industries do not require repeating the license exam. Many have continuing education or other professional development requirements. These can be expensive and time consuming but not difficult generally.


> people with licenses still do interviews, often just as grueling

The licensed engineers I know think software interviews are insane.


And most mech/ee make <150k. A closer comparison is IB.


How is income relevant to the claim interviews are often just as grueling in licensed careers?

Most software developers make <$150,000.[1] The median aerospace engineer makes 99% of the median software developer wage.[2] Probably the median licensed aerospace engineer makes more. The aerospace engineers I know think software interviews are insane. Lawyers and doctors I know think software interviews and their professions' filters are insane. I know nothing about investment banking interviews. But investment banking is known for hazing.

[1] https://www.bls.gov/oes/current/oes151252.htm

[2] https://www.bls.gov/oes/current/oes172011.htm


Can I join your bubble?


You could view the trial-by-Leetcode that people undergo when they switch jobs every 4 years or so as a form of relicensing. One advantage that the current setup has over officially proctored examinations is that you get to try again repeatedly until you are successful.


> Other industries do not require repeating the license exam. Many have continuing education or other professional development requirements. These can be expensive and time consuming but not difficult generally.

Real licenses do not expire without notice. Real license exams are more consistent, have clear pass criteria, and have higher pass rates.


The roughest outline for hiring, as far as I can follow it, is: start with 100 resumes, filter down to 10 by picking the ones that are nicely formatted, then filter down to 1 by interviewing. Each level of filtering should be structured so that it biases towards technical competence.

It isn't really a licensure examination because, as you point out, the industry doesn't bother to put the resources in to make sure anything is systematically analysed. It is a process to balance supply and demand of jobs.


90% of the filter is resume formatting!!!? I'm open to hearing your experience, but that sounds like an arbitrary filter which limits your pool more than it selects for talent.


It can depend, but yeah... when you're filling 1-2 jobs and have a stack of 500 resumes, you're going to do quick filtering based on some pretty arbitrary decisions. Worse is that the recruiting companies will often do weird things to any formatting you have done. If even 50% of the applicants are technically qualified, dumping 90% at random is still likely to get you where you need to be faster.

It's worth taking the time to ensure your formatting is consistent, with no/few spelling and grammatical issues.

It will also vary by market/location. If you're looking at jobs local to you, it could be very different, especially for those jobs wanting domain experience. Different business sectors also congregate around adjacent businesses... so the choices and technology in one city will vary greatly from those in another. As will the development process itself.


Missed the step where they filter out anyone w/o a degree in CS from MIT/Stanford/Berkeley/UIUC/CMU.


As someone with no formal degree, but approaching 3 decades of experience, this last (recent) job search was rather brutal. Not enough feedback to know what, if anything, was going on. Could be lack of formal education, could be my age (legal or not).


Got any tips/suggestions on how not to be filtered out because of "formatting"? Got any great examples to share?


In my experience, pure black-and-white text does extremely badly.

If someone is in a position where they have to read through 100 resumes, there is a limit to how often they can look at a paragraph-formatted WYSIWYG text edited black and white document that says "I did well at school, I worked some jobs afterwards, now I want to work here". They blend together something shocking.

It is worth having a tasteful dash of colour or maybe a formatted header/footer where possible. Low bar, but there is a big pool of serially unable-to-get-past-screening candidates who can't clear it and you want to stand out from them if at all possible.


I'm allergic to resumes that recite every single technology the candidate has ever used, leaving little said about what they accomplished using said technologies.


That's true, and that's how I tried to write my resume, but I've had quite a few recruiters say something like "I cannot see a list of technologies you know". Which is the opposite of what I've been told in the past, that impact and goals matter more.

So one can never win it seems :D


> It also looks supremely unfair to have proctors for this exam with varying expectations and training to actually correctly administer it.

Those are your top two problems? When was the last time you saw an exam where the proctor was expected to hold a conversation with the examinees?


Yeah, but the variation can be good. If I had to imagine what the hypothetical licensing process would be, I imagine it would be something easy that admits tons of mediocre devs. Plus, would it come in different language/subfield flavors to account for different roles? One job would ask me to implement a custom allocator, and I'd pass; another would ask me how to add Frondle to Artifactory, and I'd fail miserably.


> I imagine it would be something easy that admits tons of mediocre devs

Given that Medical boards and the Bar strike fear into the hearts of students and demand immense preparation, I think we could do pretty well. The test prep mentality isn’t altogether different from the “grinding leetcode” that happens today anyway. The difference is that it would at least be fair.


Actual engineering licenses in the US have kind of solved this.

There’s the easy exam that pretty much everyone passes when they get their degree (the FE), and then there’s the hard one that not everyone even attempts after a couple years of experience (the PE).

And within each level, you specify your discipline (civil, mechanical, etc) and then are required to have deeper knowledge of several subfields within that discipline.


Difficult to apply a lot of that, when in reality there are nearly infinite combinations of domain knowledge, software knowledge, architecture knowledge with languages and platforms. Some requiring more or less depth than others.

Software is a craft discipline... it would be better organized as a guild with reputation at stake in concert with endorsements. But then you risk what is effectively nepotism and politics.


you don't think other engineering disciplines have countless sub-areas of expertise?


I don't think other areas of engineering see the tooling options double every other year.


The sort of interview being discussed doesn't test knowledge of the latest tooling.


It would be nice. There are already too many devs and if we can lower that number by controlling the licensing our wages would also stay very high.


I personally think an industry-wide license would be an easier filter than requiring a degree: schools don't often teach what you need to know, and sitting for an exam is a lot easier for the "I've been in the industry XX years without a degree" crowd.

It would also be easier for the industry to settle on some filters in the exam to block people who just spam job postings. For example, open jobs for doctors are only posted on websites that are available to doctors. There is no way for the general public to spam job listings for doctors.


We need our own AMA.


I've found that asking them to review some obviously bad code with glaring errors and problems is more informative than asking them to solve some random DSA problem.

Candidates who can code well can point out code that has obvious problems. Just ask if this is good or bad, and if it is bad, how they could improve it. This demonstrates competency and doesn't make the interview seem like a grind but instead more like a conversation.


Just be careful with the “how to improve” part. In my experience as an interviewee it sometimes becomes a regular algorithmic interview. A couple of times I found some N+1 queries or inner loops and was asked to “fix it”, which might just turn into leetcode.

The best code review interviews are the ones where there is a healthy amount of actual code, with a handful of functions and classes, some badly named variables, bad comments, some misleading code paths, couple bad patterns, etc… the worst ones are a non-optimal solution and you’re asked to make it optimal. That’s just leetcode disguised as “code review”.


Leave it open-ended and include code with multiple levels of bad so it's not just a quiz. I used real C code that research scientists had given me. If they look at it and say there's no reason this should be in C and it should be in a dynlang instead, that is fine. If I hand them C code where the entire program is a 1000-line main function, with lots of repetition, hard-coded file names, and fixed-size string buffers, and all they tell me is the indentation is icky: that's a negative signal.
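
For a flavor of what I mean, here's a toy Python analogue (not the scientists' C code, just an illustration with made-up paths) with several levels of bad deliberately planted in it:

    # Deliberately flawed snippet for a review exercise. Planted problems range
    # from cosmetic (naming, repetition) to structural (hard-coded paths, files
    # never closed, bare except, mutable default argument). A good reviewer
    # ranks these; a weak one only complains about the formatting.
    def proc(x=[]):                            # mutable default argument
        f = open("/home/alice/data1.txt")      # hard-coded path, never closed
        for l in f:
            try:
                x.append(int(l))
            except:                            # bare except swallows everything
                pass
        f2 = open("/home/alice/data2.txt")     # copy-pasted instead of a helper
        for l in f2:
            try:
                x.append(int(l))
            except:
                pass
        return x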


I'm in ops and we've found that simple exercises are better at weeding people out than complex ones.


This reminds me of this discussion on cocktails and bartender skills from a while ago https://news.ycombinator.com/item?id=36492450

"The martini may be simple, but it is not easy to make an excellent one. It's a very solid test of a bartender's skill because, unlike many drinks, ingredients alone cannot carry the cocktail. A piña colada for example, is mostly about ingredients (are you using a good coconut cream? fresh pineapple?) For the martini the chilling and dilution need to be just right. This tests the bartender's most important skill: mixing. Proper mixing of the beverage is ultimately what makes a martini."

[..]

"martinis are shockingly easy to fuck up. and this conversation is exactly the reason why the martini is a good test of a bartender's capability. being a bartender is more than putting fixed quantities of ingredients in a glass. how do you know when your martini is properly diluted, either by shaking or stirring? a good bartender will know. a bad bartender will not. a terrible bartender won't even realize dilution is crucial."

I don't really drink much and never had a martini in my life, but I thought it was pretty interesting.


And chefs are supposedly asked to cook eggs.


True enough... Even in software, I was pretty amazed at how effective a filter this was: here's a CSV, use one of N languages to load the data, do a check, and output the valid entries to one file and the invalid entries to an error file. You can use any libraries you like; please create a GitHub repo and share it with $ACCOUNT for your solution.

I know not everyone works with CSV necessarily, but there are dozens of libraries for a lot of languages, even if N is limited to those supported in a given company/org. It should be less than an hour of work. Bonus points for any tests/automation, etc.
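
The whole thing really is about this much Python (the column names and the validity check below are invented; the real exercise left those to the candidate):

    import csv

    def split_valid_invalid(src, good_path, bad_path):
        """Write rows that pass the check to one file, the rest to an error file."""
        def is_valid(row):
            # hypothetical check: the required columns are non-empty
            return all(row.get(col, "").strip() for col in ("id", "email"))

        with open(src, newline="") as f, \
             open(good_path, "w", newline="") as good, \
             open(bad_path, "w", newline="") as bad:
            reader = csv.DictReader(f)
            w_good = csv.DictWriter(good, fieldnames=reader.fieldnames)
            w_bad = csv.DictWriter(bad, fieldnames=reader.fieldnames)
            w_good.writeheader()
            w_bad.writeheader()
            for row in reader:
                (w_good if is_valid(row) else w_bad).writerow(row)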


"Here is a bug report, and the patch to fix it, review the patch"

And the patch:

- does fix something but not the described bug.

- could do the same thing in a third of the added line count.

- has typos or other errors.


I use the "launch ramp" technique. Ask a series of prompts (instead of a single prompt with a long answer). Explain the prompts that will get progressively more complex, aka the ramp will gradually get steeper and get very steep later. I can stop the interview quickly if the candidate can not find simple mistakes and how to remedy them. I can also jump ahead to complex issues to engage highly qualified candidates.


I agree.

Last time I got hit with an interview question like that, my answer ultimately had to be "block the merge and counsel the person who wrote this about performance." I'm still not sure if that's the answer they were looking for (this was for a staff engineer position), but I'd stand by it 100% of the time.


> I've found that asking them to review some obviously bad code with glaring errors and problems is more informative than asking them to solve some random DSA problem.

I once had a coding interview like this, but the problem was that the code was so obviously bad, I couldn't even make sense of what the code was supposed to do if it were good. It felt like the interviewer had just come up with an example of bad code without any context of how the code would make sense if made "good". It was just totally artificial.

If someone had presented the bad code in some Stack Overflow question, I would have started by stepping back to ask, "What are you trying to do?" Except in this case, the interviewer wasn't actually trying to do anything except quiz me.

Identifying a bug in production code would be better, I think.


This is a good idea. This would also show someone's ability to read and contribute to existing code which is a large part of our day to day tasks. There are some that can only solve problems their way, which often means them trying to rewrite everything.


This is pretty interesting, I hadn't heard of this approach before. I'll have to give it a try some time.


We used four basic exercises during the first remote interview. Simple things like, "there is a number on each line of this text file. Find the sum of them."
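
For context, that first exercise is on the order of this sketch (assuming one integer per line, blank lines skipped):

    def sum_file(path):
        """Sum the number found on each line of a text file."""
        with open(path) as f:
            return sum(int(line) for line in f if line.strip())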

This was an effective method to screen out applicants who didn't have the basic coding skills to align with their stated resume experience. And there were a decent number of these.

We did further development project exercises later in the process that took about 15 hours. We paid the candidates for this time, even those that didn't pass. It was also an effective screening tool.

All our exercises were very "real world": in the candidate's own development environment, after having been given instructions on how to prepare. They also have access to whatever Internet resources they want while doing the exercises. If they can't do the exercises, they can't do the job.

I know there are mixed opinions on this and I feel for candidates who have to invest a ton of time in exercises like this. But I can't imagine trying to hire without visibility into how they execute relatively basic software development tasks.

I think employers can and should structure the process so the time investment is minimal upfront and only increases as both parties have gotten to know each other and want to proceed.


15 hours! You're hiring contractors not interviewing.


this seems completely reasonable compared to endless whiteboarding?


Depends on the pay. If more companies start doing this, then you can imagine having to slog through 10-20 of these as an applicant will increase the time to find a job significantly. Most people don't get hired off of their first successful round of interviews. If the company pays as much as the job salary itself then I am fine with this.


Adding up all the overhead and stuff that comes before and after, we're talking about an interview process that's 3 full work days. That seems a lot, even when you're paying people. People will either have to take days off their job or sacrifice weekends.


It is a time commitment. We did have four separate exercises and they could be taken individually. So someone could do them a few evenings in a row or all in one day. Whatever worked best for them.


I'd expect lots of employed people with families not to have 15 spare hours X the number of companies they are interviewing with


Debatable. But it was never client work.

Truth is, I'd much rather hire someone who is interested in working for us for 2-4 weeks as a contractor instead of the normal interview process. But that usually doesn't work for the candidate.


Paid how much? 15 hours is over a thousand dollars - probably closer to 2 or more


is that really a lot of money for a company? don't you often spend that much in having your developers interview candidates and white board them?


i am curious how much they are actually paying. i have no intrinsic qualm with it and agree that developer time is expensive


For those wondering, we paid $50 an hour. It wasn't quite wage equivalent, but we felt it was reasonable to consider some of that "gap" as investment on the part of the applicant.


Do they get paid the hourly rate of the position for those hours?


> like we are stuck in a cargo-cult mentality where we are just doing things because that’s what the big companies do, and if it works for them it must be what we need to do.

Absolutely. They do leet code -- we must as well. Google is laying people off -- so must we. Steve Jobs was an asshole, I also must be an asshole, that's how you grow a great company.

> The question, which I’ve heard is a Facebook favorite, was “convert a decimal number to base negative 2”.

Assuming I don't care as much about the interview and am just practicing, I would have asked "so how often do you folks convert numbers to base negative two?"

> but fuck that question and just waiting for you to eventually arrive at the little trick to make it work.

That sounds like a "coffin"-type problem as per https://arxiv.org/pdf/1110.1556. You have to know the trick, or you might spin your wheels for 40 minutes.


I have never heard of the negative base question (I would have never passed that without help): https://math.stackexchange.com/questions/216800/how-i-conver...

My worst interview questions that were silly:

• number theory for django dev: how many prime numbers are there.

• random brain teaser for django dev: infinite lasers pointed in space that turn at each other at the rotational speed of light: does the intersection travel faster than the speed of light?

And questions that were very fair that I bombed:

• MS Paint fill method on a whiteboard. I think I checked diagonals when I wasn't supposed to; I can't remember, but they were not happy with my answer.

• for a given phone number and a US dialpad, what are all the possible letter combinations for the number? In Python this is just the built-in cartesian product, but I still struggle to write it from scratch with a recursive function and get the accumulation correct.
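
For what it's worth, the two versions side by side (a sketch assuming the usual 2-9 dialpad mapping):

    from itertools import product

    DIALPAD = {"2": "abc", "3": "def", "4": "ghi", "5": "jkl",
               "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz"}

    def combos_product(number):
        """Cartesian product of each digit's letters via itertools."""
        groups = [DIALPAD[d] for d in number if d in DIALPAD]
        return ["".join(p) for p in product(*groups)] if groups else []

    def combos_recursive(number):
        """Same result, accumulating one digit at a time recursively."""
        digits = [d for d in number if d in DIALPAD]
        out = []

        def go(i, prefix):
            if i == len(digits):
                out.append(prefix)
                return
            for ch in DIALPAD[digits[i]]:
                go(i + 1, prefix + ch)

        if digits:
            go(0, "")
        return out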


Your last example reminds me of a particularly absurd one years ago. I got asked to write a function that calculates all the subsets of a given set. The problem wasn't the question, though. It was that the person asking me this didn't even know what "yield" was in Python.


> I got asked to write a function that calculates all the subsets of a given set.

I very much dislike the interview questions which roughly translate to "do you know combinatorics/<insert-some-other-area-of-math>?". Even back when I was personally good at those questions (straight out of University), they were less interesting interview questions.

I'm sure there's some jobs where it's relevant, but for the vast majority of software jobs... not so much. I get why it's done - they can be small, self-contained problems, which don't require a lot of external context. And they can be easier to come up with.

> The problem wasn't the question, though. It was that the person asking me this didn't even know what "yield" was in Python.

That is a problem too. As an interviewer, I'm okay with someone using a language or lang feature I don't know, so long as they can explain it. Some interviewers are much less flexible. Hell, sometimes I'll feign ignorance over some lang feature just to see what their explanation is like, because it's a measure of signal on how well they can teach a feature to a more junior dev.


> I have never heard of the negative base question

In what real world situation would you even use a new base system?

Seti and Aliens are the only one that comes to mind, encode a new numerical system based on some fundamental physical constant like the atomic weight of hydrogen.


I mean we use base-10 and base-2 both all the time. I'm using base-2 to talk to you over the internet right now!

> Seti and Aliens are the only one that comes to mind, encode a new numerical system based on some fundamental physical constant like the atomic weight of hydrogen.

That's base-1, aka counting


The paint fill one is funny because the answer they’re probably looking for (demonstrate you’re comfortable with recursion) is actually the naive answer (blows up in practice.)
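
The version that doesn't blow up is the same idea with an explicit queue instead of the call stack; a sketch, assuming a 4-connected fill with no diagonals:

    from collections import deque

    def flood_fill(grid, row, col, new_color):
        """Iterative BFS paint fill; avoids deep recursion on large regions."""
        old_color = grid[row][col]
        if old_color == new_color:
            return
        queue = deque([(row, col)])
        grid[row][col] = new_color
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):  # no diagonals
                nr, nc = r + dr, c + dc
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) \
                        and grid[nr][nc] == old_color:
                    grid[nr][nc] = new_color
                    queue.append((nr, nc))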


There's a rotational speed of light? Does it mean anything for light to rotate?


See, I can't even explain it back correctly. Kinda like this https://www.reddit.com/r/askscience/comments/3vnyty/if_i_swi...


The modern FAANG Frankenstein interview is a mess. It was so-so at Google 15-20 years ago, and so-so when FB initially adopted it, but most everyone cargo-culted it 10-15 years ago. It’s become “grind leetcode” which is clearly a failure mode.

The trouble is it’s a hard problem, and it usually gives *some* signal, so it’s sort of better than nothing? I guess? In cases where contract-to-hire makes sense for both the company and candidate I generally regard that as ideal, but that’s not every situation.

Someone will solve this, and that person will be very well-loved.


Google did the "how many gas stations in the us" interviews for years before finally realizing that doing well on questions like this had no correlation to success on the job. Now, in reaction to that failure, everyone is doing the LeetCode thing instead, and presumably will eventually realize that this doesn't correlate either (especially irrelevant in this CoPilot era when ability to memorize and code up an algorithm is becoming about as relevant as ability to drive a stick-shift car).

There's really no substitute for interviewing based on candidate's experience.

Apparently the new job market trend is applicants firing off dozens/hundreds of AI generated applications, which are then screened by an AI on the other end.


Haha I’m not sure I agree about how useful CoPilot is in practice (there are certain tasks where it shines but all of the LLMs print invalid code routinely).

With that said you made me laugh out loud because I have a friend who knew two people trying to LLM some minor contract and it just went on forever. I definitely think two LLMs talking to each other with two people copy pasting and not realizing it is worth like, a Rick and Morty episode or something.


Yeah, I've even seen people in top ML jobs glowingly describe the main value of CoPilot as (just) smart autocomplete, but copying/regurgitation of algorithms ought to be something that even today's LLMs should be capable of.

Of course there's always Google search too if you are just looking for an algorithm, but LLMs help in the discovery process since you can describe what you need without knowing the name of it.


Using LLMs to find a reasonable subset of algorithms for a problem sounds like a valid use case. You could even get a naive implementation of each and compare them to pick the right one.


If they take the LLM's advice, were they astute or just lucky? You'd have to ask the candidate to go into details to get any signal, and if the algorithm is hard enough for them to have to consult an LLM, they might not be able to do that.


I was referring to LLM use on the job (or hobby), not as any part of interviewing process.

The reality is that most programmers can go an entire career without being algorithmically challenged, and on the rare occasion you need to do/optimize something and don't know what the best options are, then you can either just ask a colleague, or ask online, or Google, or nowadays discuss it with an AI!


Why do you think they haven't adjusted again? Is it possible it is actually (loosely) correlated to job performance?

I interviewed at Google and all the questions were practical, challenging, and not found on leetcode.

I've seen all kinds of interviews in my twenty years of experience, and while all types can be done poorly, (including DS&A ones), having some live coding is one of the best signals you can get in a short amount of time.

I've been blown away by a candidate while they're talking about their experience, but then asking them to code something small, they utterly bomb.

Conversely, I've seen shy, humble candidates struggle to articulate their skills, and then crush harder and harder coding challenges.


> I've been blown away by a candidate while they're talking about their experience, but then asking them to code something small, they utterly bomb.

You're making the opposite point you think you are. How do you know that you observed an inability to code as opposed to interview performance anxiety?

I've been coding for 25 years, but have bombed my share of very simple white-boarding exercises because I panic and my brain absolutely stops working. If it's experience-based or even take home, I generally knock it out of the park.

White-board coding works for FAANG because they have 100,000 applicants at any given time and it doesn't matter if it has an exceptionally high false negative rate. It does not work so well for shops that are struggling to find candidates.


>You're making the opposite point you think you are. How do you know that you observed an inability to code as opposed to interview performance anxiety?

I've done lots of interviews, and I know that if someone has seen a question recently, or if someone is rusty in interviews, or one of a million things can affect someone's live coding ability.

So I'm not looking for speed, I don't give more 'points' to someone who is calm and collected vs. someone who takes twice as long cause they're nervous. They both "pass" in my book.

I also don't particularly care if someone starts intermixing java and C++ syntax or forgets the string library and says len(var) instead of var.len() for example.

I'm even fine with giving hints. But there are literally people who, even with hints, cannot write a basic recursive function or think to check for null references. That isn't being nervous. That's just...being bad at the craft.


> There's really no substitute for interviewing based on candidate's experience.

That always happens in an interview loop too though.


It absolutely does not always happen. I was asked to do interviews twice with one employer. Both times, I requested the interviewee's resume. Both times I was refused because they didn't want me to "bias" my interview.

So, yeah, go figure.


It’s not generally the responsibility of the person doing the technical screen to also obtain signal on past experience. At a company with a recruiting process, these fall on different people specialized in the specific interview types. I’ve done an average of 2-3 interviews a week over several employers for the last 12 years, so over a thousand interviews. I’ve also defined much of the tech interview loop for my current employer, and I usually interview from junior IC up to manager-of-manager roles.

I’ve never asked for candidate resumes because I don’t want to bias myself.

I’m not saying it will always, always happen, it’s just, it overwhelmingly will.


If you don't have a resume, do you still ask about the candidate's experience?

Many people can talk a good game and come across well but still be useless (e.g. hard fail on the simplest of white board tests).

I haven't done interviewing (on company's side of table) for a long time, but my main tack was always to go over projects on resume, focusing on last 5 years, and drill down to see if they could fluently talk about them, draw architecture diagrams, justify choices, etc, etc.

If you don't ask about past experience (= same as having a resume), then how do you even know what the person's experience consists of? Useless cog, or tech lead? Highly agentic or passive, etc.?


Maybe we were talking past each other a bit — for the specific technical portion of the interview where you are assessing coding ability (the thing you can leetcode to practice for), the resume doesn’t really add much value and can serve to bias the interviewer.

There is usually an experience-and-goals interview where a manager interviews the candidate to see if they would make a good fit for the team. That definitely involves a resume and a discussion of past work experience, with the objective of confirming whether they’re a cog or a tech lead :)

It can be important to split them up especially if you’re relying on more junior technical interviewers to validate coding skill. They won’t always know what they’re looking for yet along this axis.


> It’s become “grind leetcode” which is clearly a failure mode.

I have no insider knowledge, but I always wondered if those kinds of cultures had a hazing / blind-leading-the-blind aspect to them. I.e. the people who got hired were the ones who jumped through some arbitrary hoops, so they doubled down on those hoops being the right hoops to choose the best candidates.


You see the same effect in the cottage industry built around "cracking the code" and getting hired in these sort of companies. They produce marketing/video content that endlessly repeats "grind leetcode and 3 other simple tips" for getting hired, and then they insist that this is the exact and only reason they got hired (rather than plain luck or some reference), so everyone outside looking in emulates and internalizes this idea as the exact and only way hiring could possibly be effective and "fair". Eventually those practices just become the norm and people are blind to alternatives.

What was once considered a test of "intelligence" and skill in coding (whether a true assumption or not) has just become a test of "how much is a person willing to struggle to earn entry into our hallowed halls". Actual potential ability in the role is secondary to getting the role.


This is a great point. As companies get bigger you can find HR actually codifies the behavior as part of their efforts to reduce unconscious bias in the hiring process. So you end up with developers, who know this is a flawed system, perpetuating it because the rules require it.


Hazing is exactly what it is. It's a club and if you want to join the club you're going to have to go through the rituals. I hate participating in the charade but I do it with as much humanity and compassion as I can muster.


I work at a FAANG (and obviously, I'm not a company spokesperson, just sharing my own experience). Those who are passionate about interviewing internally all seem to agree on not asking leetcode questions. I know leetcode questions get asked anyway, but there's pretty clear internal guidance and training for interviewers saying not to use them.

At least part of the problem is that leetcode questions are easy to ask, and most interviewers don't want to go through the hassle of coming up with a question that scales well to the candidate's experience and knowledge.


During my ~7 years interviewing for a FAANG I basically always went off-script and asked something reasonable.


Do companies really want to solve it? Why hire so many MBAs when the nerds will wreck and devalue each other's contributions/potential?


Leetcode has the side-effect of filtering on “can code yes/no” which is a completely fair filter… I just wish it wasn’t difficult DSA problems we used.

The same screen can be done with much simpler problems. This light coding interview should also be more or less pass/fail. Do it before everything else to short circuit those who have no idea how to write a for loop.

Save the interesting insights for real problem solving, code reviews, systems design… DSA under pressure is not something that actually happens on the job IME.

Still, I’m conflicted, because a solid DSA understanding is incredibly helpful at times. At the very least, a solid understanding of the commonly used data structures, how to transform data structures, a good understanding of memory allocation with respect to code and runtime, reference vs copy… All things that if not understood, can cause serious problems.


> Still, I’m conflicted, because a solid DSA understanding is incredibly helpful at times. At the very least, a solid understanding of the commonly used data structures, how to transform data structures, a good understanding of memory allocation with respect to code and runtime, reference vs copy… All things that if not understood, can cause serious problems.

The same screen can be done with much simpler problems. Or direct questions.


> Q: Assume that you have an infinite stream of data that is coming in from multiple threads in an unordered fashion. Write the stream items to the console in order.

If I were the interviewer I would be interested in what clarifying questions the candidate would ask and/or how they would highlight parameters and assumptions needed because the question is vague to the point of being unanswerable otherwise.


As written this problem is impossible to solve.

Proof: suppose there is only 1 infinite stream which is always descending. You will never find a lowest value, so you cannot write the stream out "in order".


The question is not clear but I assume the ordering is meant to be on some kind of timestamp, so the streams would each be increasing


Depends. If there is a minimum value, every value will appear, and you don't want dups, then this is doable. Say the streams are of natural numbers starting with 1, and every natural number will appear within each stream. The obvious thing to do is to not even bother doing anything with the streams except to drain them, and just write the natural numbers in order to stdout! Yes, I'm chuckling.

It's a trivial toy problem. It's not a useful problem to solve. But the question may be there just to get you to think of just this, and to ask questions.


it's not impossible to solve, but there are situations like that where no solution works, which was a follow-up question


I came looking for how this could possibly be solved. Could you help me understand?

If the input is infinite and unordered and the task is to produce output in order, how do you know when it’s OK to start writing output?


Sorry, I was paraphrasing the question and not giving every detail that I received. They also stated that there are no gaps, the values increment by one, and start at zero.


How is that "in an unordered fashion"? It sounds pretty ordered to me!


The data itself is ordered (which is why the task is to print it in order), but the values can arrive in any order (i.e. the processing was split across multiple threads and each thread is posting back results from its work).


Ah, then that's the classic "merge k sorted streams" question. It's a good question and easy to solve in a coding interview; a good candidate should be able to solve it in about 30 minutes. My favorite solution goes something like "put values in a heap and then read them back out", because you only need to read 1 value from each stream at a time.
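
Roughly like this (a single-consumer sketch, assuming the clarified rules above: values start at zero, increment by one, no gaps; the worker threads would feed `incoming` through a queue):

    import heapq

    def print_in_order(incoming):
        """Buffer out-of-order values in a min-heap and emit each one as soon as
        it is the next value expected (0, 1, 2, ... with no gaps)."""
        heap = []
        expected = 0
        for value in incoming:
            heapq.heappush(heap, value)
            while heap and heap[0] == expected:
                print(heapq.heappop(heap))
                expected += 1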


Yep, that was my same approach as well.


That could be part of the question.

At one job interview I was deliberately given a problem which had no efficient solution. The idea was to find a heuristic which would get close to the optimal solution and run in a reasonable time.


I got one of those at a FAANG interview. I was a bit less experienced and less confident at the time so I left the interview thinking I was misunderstanding some fundamental CS laws.


This was an interesting observation:

> What I do know, however, is that for every 1-hour interview where I evaluated if someone knew their data structures, I could have just taught them.

I don't really hear much about training. I doubt it's because we don't do it, but maybe it's not an interesting topic for discussion.


I dropped out of school (English literature) because I needed to make money and pay bills due to a life change, and I applied at a place that does internet stuff to do tech support, because computers were a big part of my hobbies. I moved up over the years, learning as I went, and now I've occupied a few 'engineering' roles. I used to feel somewhat embarrassed to hold an engineering role with no engineering degree.

But I stopped worrying the more I worked with software engineers. They were just people, some sucked, some were great, their education almost never had anything to do with the bracket they fell into. The worst one I've ever met has a PhD. The best one I've met also has a PhD.

Then I found more people in these positions who have degrees or no degrees, and the people who work well, both with degrees and without, had something in common. They can more or less be taught to do anything, they can extrapolate and apply knowledge outside of the specific example initially given, and they understand the big picture/desired end results. I hate to say it so crassly, but they are good at identifying the trivial bullshit and addressing it or cutting through it rather than sitting in it. They "Get it." From the ideal to what the business demands, they just get it.

Training competent people is a boon. I don't know how you seek out interested/curious competency and then train it, but if we can figure that out, it'd be cool. I'm really tired of working with people who have degrees and jobs because their mothers told them it pays well. Computer Scientists with no interest or curiosity about computers.


Training and teaching is hard. "One hour of interviewing could be avoided with one hour of training" is wildly optimistic.

It also ignores the breadth of knowledge you likely want a candidate to have (and are just trying to sample at through a short few interviews). How many hours of training are you willing to sign up for? How confident are you that the knowledge will "stick" for any given candidate? Are you going to fall back to GPA and school prestige to measure how well they've retained knowledge in the past? That seems like a step in the wrong direction.


I don't think it's wildly optimistic, but perhaps we are thinking about it in different ways. I don't think I could teach someone how to implement each data structure in an hour, but I could easily go over maps/queues/stacks/hashtables and tell you when to use each in an hour. I know this because I do that very thing in less than an hour in code reviews.

I do agree that the "stickiness" is iffy, but it usually sticks pretty well. You then have a second problem that can be described as "when you have a hammer, everything is a nail" as they use their new fancy hashtable everywhere.


Yeah, but a data structure is definitely something one could grok in 1h.


Training is not billable. Managers decided they only like billable hours. Kind of like when you decide that you only like winning the race and you hate training.


Sounds a bit delusional to me that you can just "teach someone data structures in 1 hour".

Also the role specifics matter here. It might be ok to bring in a junior who needs lots of mentoring onboard. For a tech lead who is going to be driving the direction of a team, I'd expect them to be up to speed and autonomous very quickly.


I don't think "I can teach someone data structures in 1 hour" is a good faith interpretation of what OP meant. I would expect it's more like "I could teach someone about a data structure that could be used to solve this problem efficiently and simply in 1 hour" is closer to the mark.


It's not "teach someone data structures in 1 hour", but teach them the approach needed for that problem in 1 hour.

And it's a pretty insightful comment, because being able to be taught a practically useful algorithm, and use it on a question, in just an hour is a sign of a good candidate. Interviews should often be run just like that.


Thinking like an engineer, from the company's perspective, it's just a filter.

While it could be the case that candidate A, who passes the coding interview, makes a terrible employee, while candidate B, who does not pass the interview, would make a better one, it doesn't matter. As long as the pool of candidates who pass the coding interview fares better than the pool who does not, it will be a useful tool until someone comes up with a better one.

And the alternatives were worse. There was a time where unless you graduated from a top-flight school, or were top X% at one of the better state schools, or had some other "in" (e.g. knowing somebody) the cool kids at the time (Microsoft/Apple/Yahoo/Intel/etc.) wouldn't even talk to you


I think your analogy of pools is spot on.

The goal is to view people as different "grades", like a commodity. They aren't measuring directly for skill; instead they're measuring the odds you match some "grade" of software developer.

To your point, if memorizing the first 1k digits of pi correlated with a higher "grade", then companies would use that instead.


> There was a time where unless you graduated from a top-flight school, or were top X% at one of the better state schools, or had some other "in" (e.g. knowing somebody) the cool kids at the time (Microsoft/Apple/Yahoo/Intel/etc.) wouldn't even talk to you

I don't think this has changed.


If you want a more efficient way to practice, I’ve been working on https://deriveit.org/coding/roadmap#note-215. It’s a LeetCode site that’s

- organized intelligently

- has simpler explanations than you find online

We’re super proud of our content, and just recently 2 people have landed jobs at Amazon with us. People actually feel ready for interviews with us. Give it a go :).


No thanks, we don't need more leetcode-related stuff. It should just disappear.


This looks really great. I love the presentation style. Clean and simple. Nice work!


I have 30+ years of experience on my CV and I still am being asked to do coding interviews... I flatly refuse every time, because they have no connection to the actual job. The hiring process disregards experience and treats everyone in the same way. Ours is a stupid industry.


Have you ever been hired by a company you refused to do the coding interview for?


I've had people apologize to me for trying to make me do one. The interviewer was smart enough to realize that I was the senior in the room. I would have gotten that job but I declined it for another, more interesting one.

The mistake many companies make with attempting to hire senior developers is losing out on the opportunity to hire the good ones by assuming the candidate will fall on their knees to get the job and do the silly test and be subjected to some prolonged process. The better ones will simply not do that and lose interest the second you say "coding interview". They are only on the market for limited amount of time and all your competitors will be eager to hire them.

Hiring senior developers is mostly a sales job. You need to do your homework (i.e. read the CV, look at the Github) and really sell the notion of how amazing it would be for the candidate to work for you and what a great fit they are. I speak from experience, having hired and built a few teams. If there's a lot of doubt after you did your homework, the interview needs to be about building a case that the candidate still has some redeeming features. If there isn't, the interview needs to be about quickly confirming key points and then moving on to sales. If after all that you still doubt the person can code, then don't hire them.


Yes, more than once.


What phrasing would you use to refuse? I'm curious :D


"I just don't do online coding exercises. They typically bear no relevance to the project and if you expect me to be solving basic problem in CS while being watched and judged in real time when I join then I don't think I want to join. Also, if you cannot judge my experience based on my CV then I don't think you know who you are looking for."


Simple coding interviews are fine for filtering candidates.

Coding interviews with aha solutions or time bounds are just asking for people who memorize solutions. Forget about a problem solving engineering culture at these companies. Imo, the harder the leetcode nonsense, the worse the engineering culture.


At this point, I’ve had enough successful positions at hard jobs that I don’t question my competency when I fail a pop quiz.

I still don’t know what my favourite Rust crate is. I feel I should have one after being asked more than a couple of times.


Don’t overthink it, just talk about something you like about the language. The interviewer wants to hear relevant words that demonstrate you’ve actually worked in the space (and not just did a tutorial or read a blog post.)


"Favorite" just means one you use a lot.

It's like when people ask you what your favorite movie or album is. They're not asking you to literally move to a desert island with only that, it just means one you like that comes to mind easily enough.


You've actually been asked that...? That's a trip. "Whichever one solves the problem at hand."


Three times, yes.

I don’t have one, though.

Much like I don’t have a favourite number or colour any more. I have favourite sets of numbers and colour palettes!


I'd say regex, mostly for the well written blog posts.

https://blog.burntsushi.net/regex-internals/


Serde? Clap? It’s not the objectively best crate, just one you like a lot and think is well designed. Both of those are “wow, this is way better and yet simpler than every language I’ve used before” crates for me.


Sounds like they are trying to discover good ones


Here's what nobody wants to admit: Software development is probably 10% engineering, 20% science, 20% putting blocks into holes ("when MICROSERVICE apply KAFKA"), and 50% arts & crafts. And we're just testing for the blocks-in-holes part.


> and then finally an interview with David Fullerton and Joel Spolsky

Joel Spolsky wrote the book on interviewing a software engineer: https://www.joelonsoftware.com/2007/06/05/smart-and-gets-thi...

The book is a fun read, but it's easy to miss a very critical detail: a coding interview is supposed to demonstrate that a candidate is comfortable coding, not that a candidate can rote-memorize XYZ algorithm, or read your mind well enough to care about the corner cases that you care about.

I always give a lot of hints, and focus on that "we're having a discussion about code" when I interview a candidate. I don't expect a 100% right answer the first time, but I do expect a candidate to have a certain degree of intuition about how to program a computer.


I agree with the sentiment that algorithms-and-data-structures coding interview practices are a terrible measure of a software engineer candidate. There are so many other factors that make a good software engineer, and dismissing a candidate because they can't solve a single problem on the fly is wrong.

In my experience, I've never got an offer after bombing the coding portion of an interview even though I shined in system design and behavioral interview rounds.


We scrapped coding tests at our last round of hiring, but the interviewees were so bad that everyone’s time was wasted. We had to bring it back just to filter people out.


We talk about something like this now and then but the thing is, if fizzbuzz is still managing to eliminate more than half the candidates I don't think it's useless.
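
For anyone who hasn't seen it, the entire bar being discussed is roughly:

    def fizzbuzz(n=100):
        """Print 1..n, replacing multiples of 3 with Fizz, of 5 with Buzz, of both with FizzBuzz."""
        for i in range(1, n + 1):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)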


I’ve been having trouble finding the scientific papers behind this theory, but one I’ve latched onto is that “filtering against known strong signals is helpful” but also that “additional filtering beyond that is more harmful than random choice of the remaining pool”.

Basically, it can probably be shown that if you hire a completely random sample of resumes vs. a random sample of people who can produce a working fizz-buzz program, the candidates from the latter group will perform better. But if you then filter the fizz-buzz group by “can they solve this negative-base math problem?”, you’ll be removing a disproportionate number of candidates who would have turned out to be excellent for your organization, and your final “super-candidate” pool will actually be weaker (for your org / that position) than the average of all those who could solve fizz-buzz.

I think the research shows that you should filter based on what you know for sure provides a true signal, then select randomly from the pool which passed your known filters. If anyone can help me find anything relevant about this, I’d appreciate it.

Too many orgs treat hiring like the “secretary problem”, but that requires the assumption that you can grade everyone accurately on a continuous scale. We can’t yet do that with software engineers; there’s no way to say someone is “80% awesome” vs. “93% awesome”.


I don't agree. One part of coding tests is getting the right output; the other part is how they went about it, which is relevant even if they didn't solve the problem. You can tell a lot from the second part.

I'm not a huge fan of coding tests, and I have a lot of sympathy for those who refuse to do them, but it's a case of "It's not you, it's everyone else"


I believe it's much better to see a candidate write some code quickly, have them write unit tests for it, discover that some cases were not handled properly, and then fix them.

This is how 99% of developers work; I don't believe anyone who says they write their code perfectly, under stress, every single time.


That does loosely suggest that something of fizzbuzz-level triviality may be sufficient for first-round filtering.


That's my impression. Anything more complicated and you're starting to filter based on your appreciation of the candidate's choices, style, etc. I.e. things that can be worked on or fitted to the company's work style.


Hiring software engineers is like dating. If you find a suitable mate, maybe you don't commit too early because you think that there might be something better out there. Or, you monkey-branch to a better one. (How often do you have an amazing couple of rounds of technical interviews, only to be ghosted? For me: Too many times to count.) I find the same is true when hiring software engineers. The vast majority of "above average" developers around me in my career are wildly over-qualified for their CRUD work.


I'm a believer in Joel Spolsky's recruiting goal: "Smart and gets things done."[0]

Add "not a jerk" (which I find is part of "gets things done" on an ongoing basis) and everything else is either vanity or decision paralysis on the part of the interviewer.

[0]https://www.joelonsoftware.com/2007/06/05/smart-and-gets-thi...


The article is from someone whose final-boss interview was with... Joel Spolsky. And it was a pretty useless-in-the-real-world question he was given as a coding exercise (base -2 number conversion).


The base -2 question was asked by someone other than Joel, and definitely fits in the category of interviewer vanity questions.


Yeah, the interview before Joel was the base -2 conversion. Joel's interview was mostly a chat, and we mostly talked about my dog (and other things I'm sure, but I only remember talking about dogs).


I refuse to do code tests. I don't mind doing a small 2-3 hour project that I will walk the interviewer through to explain my reasoning for doing things the way that I did, but I cannot code in front of someone. I can't even type correctly on my keyboard if someone is looking over my shoulder.

I've never touched leetcode, and I interviewed with Nvidia a few years ago for a position I was an absolute shoo-in for; unfortunately they wanted me to do a live leetcode.


Code interviews are great:

- filter out low-IQs

- filter out people who are difficult to work with and refuse to take them

- filter out people that have no interest whatsoever in computer science


The way I handled interviews during my short stint as a manager was to just ask questions related to the task the person was going to do. No general "coding questions", no need for that, the specific topic they'll be working on.

"Have you heard of X? How would you deal with Y in situation Z? Your CV says you worked on A, have you also looked into B? Any guess why we went with B instead of A for C?"

They didn't need to give the exact answer we wanted, the questions they asked in response often told us more than the straight answers.

Everyone I hired is still at the company and one of them sped up our tool by an order of magnitude. No coding challenges needed.


For machine learning engineer roles a lot of companies are carrying over their leetcode questions from software engineering interviews. It makes it even more ridiculous. I’ve told recruiters that I have other interviews that don’t ask leetcode, and depending on how those go I’ll study leetcode.

I’m not a competitive programmer. I’m an engineer. If you want a competitive programmer go find a new grad who was in the competitive programming club in school. Assessing engineering skills with leetcode is like assessing bicyclists on their ability to swim.


What exactly is an alternative way to hire good candidates that any of you think is better than coding interviews?


You ask them about their experience. What projects they've worked on. What things they've created. Then you drill down into the details with them. What was hard? What was surprising? How did they solve it? How would they do it differently?

Or you do a mock design exercise with them and hear them talk through their process. Are they going to use a relational DB or NoSQL? Do they need a queue or an event stream? What language do they want to use? Why?

Or worst case scenario you give them a take home exercise (compensate them!) and then for the follow up you talk with them about their code and the compromises they had to make.

Or hell, if they like white-boarding and want to do it, then go for it. But a lot of people (like me) have a tendency to lock up in those situations.


The ways every other field on earth interviews people? Do Surgeons need to perform mock surgeries before they are hired? Do Accountants need to complete a test audit? Do Lawyers perform a mock trial?


> Do Surgeons need to perform mock surgeries before they are hired?

They must have a degree from an accredited institution before they can begin practicing, and part of earning the degree is operating on cadavers, so, yes.

> Do Accountants need to complete a test audit?

In order to be a Certified Public Accountant in the US, you must pass the Uniform Certified Public Accountant Examination, which has a section on auditing and includes "task-based simulations", which are:

"CPA Task-Based Simulations are scenario-based questions on the CPA Exam. They are a large part of what makes the CPA Exam so difficult. Each one will introduce a situation, provide data in the form of charts, memos, and emails, and require you to answer a series of questions."

So, I think yes?

> Do Lawyers perform a mock trial?

In order to be a lawyer in the US, you generally need to pass a state's bar examination. Most of those include "performance tests", which require the test-taker to simulate part of the job of being a lawyer. I don't think mock trials are part of that, but writing a legal brief or doing other typical lawyer work is.

My understanding is that going to trial is a small fraction of what most lawyers do and lawyers going in that direction will gain that experience as junior members of a law firm.


>They must have a degree from an accredited institution before they can begin practicing, and part of earning the degree is operating on cadavers, so, yes.

You know what I meant. You have merely created a series of strawmen. They don't need to perform a trial surgery every time they interview for a new job. The same goes for all my examples so nice try.


In every profession that is relatively high stakes, there is a pretty concrete chain of validation that the person can do the thing they claim to be able to do and that chain of validation does indeed rest on demonstrated performance at that capability.

Yes, a surgeon doesn't need to perform trial surgeries every time they change jobs. But, also, they only work inside certified, heavily regulated institutions that are able to vouch for their previous performance.

No such institutions exist for software engineering.


Many accountants are not Certified Public Accountants. Many engineers are not Professional Engineers. Many legal workers are not lawyers. And references in those fields are not more concrete.


Yea, OP's questions show that what we really need is an equivalent to the bar exam for software developers, if we want to move beyond these fizzbuzz interviews. Something that would at least assure a base level of programming skill. You'd still have to test for specific domain knowledge, but a comprehensive "programming bar" test could at least show someone met the minimum.


Lately, when I have to do a coding interview, I ask the candidate to test some code, not write some code. I learn a lot more about how they think that way, and if you have an argument as to why testing is not a relevant skill, I would like to hear it.

Usually I do something like this: the candidate picks a language (let's say Python) and I put up a function prototype that is maybe almost Pythonic, with a SphinxDoc docstring that is subtly not-quite-complete, and say the rest of the function is a black box that you need to test.
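
Something like this, for example (the name, parameters, and docstring here are invented for illustration; the docstring quietly leaves out empty input, max_points == 0, and scores above the maximum):

    def normalize_scores(scores, max_points, drop_lowest=False):
        """Scale raw quiz scores to percentages.

        :param scores: iterable of raw scores
        :param max_points: maximum possible raw score
        :param drop_lowest: if True, discard the lowest score first
        :returns: list of floats in the range 0-100
        """
        ...  # the implementation stays a black box to the candidate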

First off, do you have any questions about the definition? Or let's say you saw this prototype for a yet-to-be-implemented function in a design review, would you have any input or want clarifications?

Then, I would ask the candidate to present some test stimulus and expected results. Not code, just make a table with inputs and expected output in columns.

I find out very quickly if they can think about corner cases, think about possible exceptions, know which exception is appropriate under various circumstances, and occasionally (too rarely...) I will get some opinions about various test frameworks that they like or don't.

This technique will not sort people by how fast they can code up a tree balancer. But if, after hiring them, I were to ask them to implement a tree balancer, I can be pretty confident that it will work.


If you are testing for a skill you can train, you are doing it wrong.


Your comment contains some interesting assumptions. I don’t think we are aligned on how to surface thought processes.

Your comment, taken to extreme, would say there is no point in looking for software design skills at all, because we can “train” anyone with an internal coding boot camp. I doubt that is what you mean.


The whole nature versus nurture of skill comes into play.

On one hand anyone can learn anything with enough effort; and on the other any smart person can learn anything without trying much at all.


All the pre-interview questions and filtering have so little value. It is just a filter to see if people will put in a little extra effort for almost no reason. It's an extra step that makes it harder for someone to even make it to an interview.

I've focused on removing that part of the process for the engineers I work with and help people make it to an actual interview. If you have the skills you should have a fair shot and not get drowned out by the hundreds of other applicants.

If anyone is looking for a startup opportunity, I have a platform that removes that whole problem and gets you to an interview directly.

Here is our server. https://discord.gg/WKj3uz6sZZ


It was always just gatekeeping that sweet sweet VC money during the ZIRP era. Games to play and egos to inflate for middle management can-you-even-code-bro? bros for companies with growth stock modus operandi.

Everyone was asleep at the wheel anyways

https://news.ycombinator.com/item?id=40292924


Where I work, we have a really just absolutely radical hiring process.

We sit the candidate down, and present them with a task. Then we all sit down as a team and work the problem. The most recent one was building a game of marbles. None of us knew the rules of marbles, but the candidate knew how to take a vague task and work with the team to produce something functional.

Which is what the job is. We ask the candidate to show us that they can do the job and then hire whoever 1) did the best work and 2) vibed with the team.

Anyone who places real value on leetcode is not someone who should be managing programmers, because that's not the job. In precisely zero real-world situations does any programmer need to be able to write a red/black tree blindfolded on a whiteboard while standing on one leg and singing the national anthem. In the real world you just grab the algorithm out of a book or Stack Overflow.


Exactly. I get why interviewers want some sort of "can they code" filter. The test can be extremely simple though. If knowing data structures and algorithms by heart were so central to every developer's work, they wouldn't be interview questions.


I wonder if an interview process more like academic science would help.

It usually goes like this:

Candidates are roughly screened by their CV. A handful (in my experience roughly 6) are invited to hang out at the lab for most of the day.

They usually give a presentation on previous projects, and then chat with each member of the lab in turn, hearing about their work and asking questions etc.

Then you all go and have lunch together (usually without the boss). Later the candidate has a more formal panel interview with the group leader and some other faculty.

A few days later, they are told the outcome of the interview.

It may seem like a long process, but it all happens in one hit. The advantage is you get a much better feel of the candidate over this continuous process, than you would with shorter interview tasks.


The most difficult part is figuring out what they are measuring and how they want you to solve the problem; there are hundreds of ways to solve a problem. And once you have a solution, there is no time to make it efficient (say, O(n)) or do micro-optimizations. That's why I hate automated coding tests. If there is an actual person with you - talk to that person! Ask how they will measure your performance, and ask how they want the problem to be solved! Then explain how you are thinking - you can't do that with an automated code test.


After half a year of trying, I’ve just given up looking for a job. I’ve been surviving on contracts where I do the work of the people who pass the interviews but don’t know how to do the actual work.

It’s a strange world.


Everyone seems to agree live code interviews are both terrible and necessary.

I'm a self-taught coder without a degree. I guess it may be extra frustrating for graduated candidates where your hard-earned degree buys you no credibility for skill.

Kinda similar, after ten years of lead development coding every day in the enterprise on huge projects I still don't get a pass on the code interviews.

I will say, if you're trying to career pivot and apply for a management or product or sales engineering role etc., lots of technical experience does carry weight in interviews.


I'm currently going through these leetcode-style interviews and it's demoralizing studying for quizzes I know I'll never use on the job. I think knowing this makes it harder to motivate myself to study, which in turn means I fail more interviews. I think I'm just trying to get lucky at this point.

I know I can write code. I know I can work with the cloud. I know I can talk to stakeholders and gather requirements. But yet I am treated like a new grad in the interview process.


My feeling is you don't need a leetcode-style coding interview.

You do, however, need something practical to test that the candidate can think logically and actually program.


You cannot tell anything about a person by giving them a test, other than their test-taking ability (you are what you measure). You have to balance it against context and other information. If all you do is look at "the score", you're measuring the wrong thing. If your test leads to questions whose answers display competency and skill, you're measuring the right thing.


I secretly think that the current big-tech regime continues to use these because they limit job mobility. They have an intentionally high false negative rate, and because they're unrelated to actual engineering work, you don't want to have to study up and gamble on trying to pass them again.


My approach has been giving candidates simple real world problems, e.g. extract the URLs from this file given a spec and a description of a few language builtins. I'll throw a few curveballs in depending on how they do but my goal isn't to stump them.

I'm mostly looking to filter out the candidates that flat out can't code or describe their thought process while coding. You'd be surprised how many candidates I've interviewed pass the resume check, get to the interview and can't reason out a problem that could be solved with two for loops and an if statement.
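For calibration, a minimal sketch of the shape of problem I mean, under an invented spec (URLs are whitespace-separated tokens starting with http:// or https://; my actual wording and builtin hints vary by language):

    def extract_urls(path):
        """Return every URL-looking token found in the file at `path`."""
        urls = []
        with open(path) as f:
            for line in f:                      # first loop: lines of the file
                for token in line.split():      # second loop: whitespace-separated tokens
                    if token.startswith(("http://", "https://")):  # the one `if`
                        urls.append(token.rstrip(".,);"))          # drop trailing punctuation
        return urls

Nothing clever: reading input, looping, and making one decision per token.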


I think people get confused with "the best" rather than "good enough". You'll never find "the best", you will find a number of people who will suffice.

Interviewing should be more about avoiding bad candidates than finding the best candidate.

This guy fails coding interviews. Then he gives coding interviews, but the people he selects based on these interviews are a mixed bag. Because he's failing the interview from both directions.

I've given coding interviews, all the questions I've given have been "leetcode easy" level at worst. In person, I usually try to get the person to write up an implementation of Towers of Hanoi. One of the example recursion problems. The interview is not an adversarial process, it's a cooperative one.

I want to see them think and I want to see them come up with code on the fly. Because while we can look up things on the job, at some point, it also requires original thought.
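For reference, the answer I'm hoping the conversation converges on is the classic recursive solution; a minimal sketch (disk count and peg names are arbitrary):

    def hanoi(n, source, target, spare):
        """Print the moves that transfer n disks from source to target."""
        if n == 0:
            return
        hanoi(n - 1, source, spare, target)   # move the n-1 smaller disks out of the way
        print(f"move disk {n} from {source} to {target}")
        hanoi(n - 1, spare, target, source)   # stack them back on top of the big one

    hanoi(3, "A", "C", "B")

What I care about is whether they can find the recursive structure with some nudging, not whether they reproduce this verbatim.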


> Interviewing should be more about avoiding bad candidates than finding the best candidate.

> This guy fails coding interviews. Then he gives coding interviews, but the people he selects based on these interviews are a mixed bag. Because he's failing the interview from both directions.

Good points, but I should say that the people that didn't work out weren't always because of technical abilities. The company that had the worst success rate was fully remote, but I don't know if there's any interview process that can help with that.


But what happens when the interviewer lacks the knowledge and experience to detect a good candidate for the job? Is the interviewer just (a big just) looking for someone who looks right to them?


I wouldn't give much weight to somebody whose resume shows a work history of starting out after college as a "CTO" and then moving to two "Principal Engineer" positions...


Actually there's another job before that, I just don't list my full employment history. Before that was VP of Engineering. That's just my final title, however. I started that as a normal engineer. And then as a senior engineer before CTO.

Small companies, so not the same as being VPE or CTO at Amazon. You see those as title downgrades, but I don't. I have a lot of talent, so people want to promote me and put me in charge. I don't want to be a manager/leader currently, I want to be a strong IC.


Why not?

I would still talk to them; if they actually were a CTO for a period and then got hired as Principal Engineer, they might be extremely talented.

I find these meaningless reasons bullshit; just tell me that you spun the roulette wheel and 5 red came out when I needed 7 black to be talked to. At least then I don't start doubting myself just because someone didn't like my formatting or the choices I made after university.


I always thought giving a candidate some not very good code and asking them to code review it gives more of an insight into their ability than the leetcode style stuff
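For example, a hypothetical snippet of deliberately not-very-good Python you might hand over, with a few planted review targets (mutable default argument, a silently swallowed exception, a shadowed builtin, a file that is never closed):

    def load_users(path, cache=[]):           # mutable default argument persists across calls
        if cache:
            return cache
        try:
            f = open(path)                    # never closed; no `with`
            for line in f:
                id, name = line.split(",")    # shadows the builtin `id`; no strip(), name keeps the newline
                cache.append({"id": id, "name": name})
        except Exception:
            pass                              # swallows every error, including the file not existing
        return cache

A decent candidate spots most of these and, more importantly, can explain why each one bites in practice.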


Honestly, as somebody who is hiring senior devs I can't imagine not doing a coding interview.

Unfortunately, it is very hard to judge somebody's coding ability in discussion alone. You can sort of get an idea of whether they have experience, and whether they got lucky with being asked about topics they know (although you can help your luck by just knowing a lot of stuff).

I have seen a lot of candidates who were quite smooth talkers but could not code their way out of a paper bag. Mind you, I do not mean remembering some complex algorithm. The task is usually some relatively simple data manipulation, so that I can see the person approaching the problem, asking questions, getting involved in some discussion, etc.

The task is usually a bit ambiguous, and this is explicitly stated: the candidate is expected to get stuck, not know things, and have to talk to me to get the problem solved. You would be surprised how many candidates do not listen or can't follow simple advice even when I am almost giving them the solution on a platter.

The trouble is that there are a lot of interviewers, and many of them do not have a basic plan for what they want to achieve with the coding interview or what they want to learn about the candidate. Those interviews tend to suck.

What also sucks is candidates who come to the interview completely unprepared, unable to answer most of the questions or get any progress on the programming tasks and then spew misinformation on the Internet about how supposedly all coding interviews are stupid.

I guess if you can code you may have some misses, usually along with some hits. Sometimes you are out of luck. The point is not that you need to get every job that you apply to, it is enough to get one of them.

But if you can't code at all, or you apply to positions that are clearly out of your league, you get rejected in all these interviews. And then what do you do? Some people go write on the Internet about how all interviews are stupid, without ever considering how they contributed to all those failed interviews, how real life works (i.e. 90% of everything is crap, including 90% of interviews), and how the situation might look from the other side.


>Unfortunately, it is very hard to judge somebody's coding ability in discussion alone. You can sort of get the idea whether they have or don't have experience and whether they have luck being asked about topics they know (although you can help your luck just knowing a lot of stuff).

I disagree with your stated difficulty in judging coding ability by discussion alone. A skilled interviewer is easily able to ascertain breadth and depth of the candidate's experience provided the interviewer is also an expert and curious. And part of being a skilled interviewer is using all of the information at hand in their CV.

>The task is usually ambiguous a bit and this is explicitly stated so that the candidate is actually expected to get stuck, don't know things and have to talk to me to get the problem solved. You would be surprised how many candidates do not listen or can't follow simple advice even when I am almost giving them the solution on a platter.

Would instantly pass on working with you based on this alone. The intentional ambiguity is deceitful. The expected dependence and expected deference is so patronizing and manipulative. I don't need to have an ego measuring contest in an interview. You win, you're the smart, awesome guy. Go enjoy all that success, bro.

I've had too many interviews like this where the dudes on the other side of the table have internalized their superiority at proctoring their simple tasks. These dudes think they are the great gatekeepers of... something. Like, the candidate just wants to cut code so they can pay rent and eat. They don't really care, nor should they, about your product, or your customers. It's an exchange of labor for a pittance of the value produced. And the dudes on the other side of the table are gate keeping it.


> A skilled interviewer is easily able to ascertain breadth and depth of the candidate's experience [without a coding test]

And yet, many interviewers' experience shows that there are people that would pass a discussion-only interview and fail at basic coding tasks. There's plenty such people holding down jobs for a while, so even that may not be a sure indicator of skill.

> The intentional ambiguity is deceitful. The expected dependence and expected deference is so patronizing and manipulative.

I'm ~1.5 years into my first job as a Backend Dev so I can't speak much about the industry, but based on my experience and what I hear from others, asking questions and clarifying unclear requirements is part of the job description. I almost never get my tasks in a precisely defined way, and a lot of my job is gathering information and asking the right questions to build the right thing, often making my own choices and judgement calls. I assume that these skills are what the GP comment is trying to test.

> They don't really care, nor should they, about your product, or your customers.

You can hardly blame a company for preferring someone who does. Or at least, pretends to.


You are right on point.


Yeah, OP sounds dreadful to interview with. Especially the: omg, I basically gave them the solution!!

When they themselves didn't have to solve it, since they came up with it.


Actually, for the past two years I was using the coding task I got to do when I joined the company. This helps me appreciate how I felt when I saw it for the first time.


I can't really code my way out of a paper bag either with a stranger watching and questioning me while I'm doing it. I just get so nervous I can barely think straight (in general I get rather nervous being watched). One time they asked me to write a function I literally have on my public GitHub as part of a project (remove duplicate entries from an array without sorting; it's just a few lines and pretty straightforward) and I couldn't really do it in the interview setting.

Of course I wasn't present in your interviews and I'm not saying some of those people really couldn't code at all. I've worked (well, "worked") with some of them so I know they exist. But I'm not so sure all of those who flunked couldn't code at all. In my experience I'm not a hugely unique individual.

If you don't want to accommodate that then fair enough; it's your interview process. But just saying your perspective is perhaps not entirely correct.


Not realizing that the interviewee might be underperforming due to stress is quite telling about the interviewer's emotional maturity.


To bastardize a quote - ‘it’s a terrible system of government, but it’s the least terrible one we’ve been able to find’

Seriously, what are the alternatives?


- Don't let the 20-something non-engineers in HR be your first filter; the hiring eng manager reviews resumes. Let HR review the work history, references, and W2 stuff later.

- Simple, practical coding questions related to your company need (so not algos/leetcode for 90% SaaS CRUD app companies)

- Engineer or manager talk to candidate about past work and projects.

If you are a giant FAANG, okay, fine, that won't scale. But most companies smaller than that will do just fine without the assumption of "everyone's a faker! Gotta prove you wanna work here!". This has worked fine for me in hiring for the past 20 years.


Standardized licenses, per language / domain / etc, that expire and / or something analogous to professional engineering licensure in harder engineering disciplines.

Take the leetcode tests once every 5 years or so. Mentor for a few years with an experienced, and already licensed, engineer and get their approval. Then the interviews can skip the drudgery and can be higher level and higher quality.


Standardised on what exactly? There are so many methodologies and/or tools, with endless possible combinations, each of which can reach the desired result in terms of functionality and even quality. And enforcing just one means no possibility of progress.

In physical engineering, things are very much limited by physical laws, which don't change (and even if our understanding on them might, that happens rarely enough to adapt). In software engineering, the foundational principles can and do change many times in an average engineer's career.


> the foundational principles can and do change many times in an average engineer's career.

I'd like to hear more.


Not the poster, but I personally know folks who have gone

Assembly (for 68000-series Macs) -> C (PC on Windows) -> Java (hosted on prem) -> Cloud-hosted Java + JS/React.

Depending on your definition of ‘foundational principles’ either nothing has changed, or everything has. But one thing is clear, the day to day is completely different.


I'm talking about things like 8/16/32/64-bit processors, CISC vs RISC, from mainframes to personal computers, from isolated machines to ubiquity of modern Internet, quantum computing etc. Note that I'm talking about the fundamentals of software engineering, not software science.


DSA puzzles do not test software engineering. And they did not suggest eliminating interviews.


This is what Triplebyte pretended to be. Then companies who got your resume from Triplebyte just whiteboarded you again anyway.


> Standardized licenses

The licence you value wouldn't be the licence I value.


Passing one significant whiteboard interview once is all you need, that's your "licensure."

Flip a coin whether I'll ace or fail a random leetcode question, but after the one and only time my little monkey dance successfully got me into a "desirable" company, I started confidently rejecting other proposals to whiteboard me.


I've talked with applicants about their private projects. It gave me great insights. Also take home tasks with a discussion afterwards are a good thing and can be an alternative if candidates don't have private projects to talk about.


Oh... take home tasks.

Theoretically, I agree these are a great way of generating some signal where there's not enough to be found by other methods. But, in practice, what I've found is that most of these types of exercises are way overscoped, and even those which are not put a hugely disproportionate time burden on the candidate with minimal return for most of them. In other words, nobody wants to take the time to properly review these things.

I've had much better experiences when people give me code and ask me to review it rather than when they give me a "spec" and ask me to write code based on it. One big red flag for the latter type of assignment is the phrase "production ready," as that can mean so many different things it's a borderline meaningless criterion.

I could really go on and on about this, but I'm going to stop here.


I absolutely agree with you. There are many take home tasks that are way overblown and expect too much. But I am convinced that you can scope it for say 1-2 hours and I think that's a reasonable time investment for an application.

And yes, the interviewer has to review that stuff and ask good questions, but nobody said this was easy. For me it showed better outcomes than the usual live leetcode interviews (I did them too).


Coding interviews are a lossy process. Once you realize this the angst of rejection softens and it becomes an almost mechanical effort.


One thing I personally do with candidates nowadays is just...talk.

Ask them general stuff I'm genuinely interested in hearing their opinion on, but that very quickly reveals the person's proficiency without them ever writing a line of code.

Questions such as, for a web development job:

1) favorite way to manage git history

2) what do they like/dislike about different CSS authoring solutions

3) how they managed localization on their previous projects

4) candidate's opinions on different frameworks/libraries

5) candidate's debugging approaches

6) opinions on the latest updates to ECMAScript

And I can go on and on. As you simply chat and exchange ideas it is so blatantly obvious who's able and who's not. There are no right/wrong answers and it's totally fine to not have experience or knowledge about this or that but in one hour you can touch so many topics about your current projects and candidate's past that you can indirectly understand the level of seniority and proficiency of the candidate.

And anyway, all of this is useless, because the most brilliant and knowledgeable candidate could spend all day playing videogames and pretending to work anyway. So why would I put so much emphasis on those skills and that knowledge rather than on the individual and professional I have on the other side of the screen, which is way more relevant?

I need people I can *trust* as coworkers. People I can assign tasks and know they will be done or that the candidate will communicate.

I don't need puzzle solvers. I really don't. I couldn't care less.

At the end of the day you need to optimize the interview for what you need from your colleagues. I need trust, reliability, communication, professionalism, effort.

I can sense those from chatting and I can get a good idea. But by having you implement the Levenshtein distance I just get no intel.


Eh, if you hate coding interviews so much you can try something else, like welding.

Oh wait, welders have to prove they can weld before they get hired? [1]

---

The general issue with coding interviews is most companies don't validate that the interview is actually correlated with job performance. Of course a process where the blind leads the blind is going to have issues.

[1]: https://www.reddit.com/r/Welding/comments/26ppfb/what_to_exp...


Even with a skills-based test like welding, you run up against a limited number of real standards, unlike coding interviews: what materials are you joining, measurements, what process are you using, do we have particular requirements about penetration, non-destructive test results, tensile strength, etc. That's the problem with modern development not having a central certification or licensure or a standards board/bar; everyone tests something different even when they think they're testing the same thing, and it's a depressing mess.


> The general issue with coding interviews is most companies don't validate that the interview is actually correlated with job performance. Of course a process where the blind leads the blind is going to have issues.

Yes, I was looking for someone to finally say this!

Without the feedback, the process is much more like an initiation ceremony to "legitimise" the hiring of the new employee. You put them through a fairly arbitrary ordeal so they can finally be crowned as a proper employee.


Maaaaaybe, but I’m not ditching them. When I’m interviewing someone, I try to set them at ease. I’m not here to spot typos. I’m not trying to trick you. I want to see how you think about problem solving, and I’m cheering for you. I want you to be The One!

At a prior job I was the person who asked candidates to write fizzbuzz, and it was much more of a filter than I ever would have suspected. One senior engineer, from a company you know, with a master’s in compsci, couldn’t write a for-loop for the life of ‘em. Another QA engineer wanted to write tests for their implementation by capturing stdout and comparing it to a hardcoded string.

Me: What if you want to test the output of 1,000,000,005?

Them: We could store the test output as a file on disk instead of a string!

Me: Well, ok, suppose you only want to test that one specific value and not all the others?

Them: Ah, got it! We could discard all but the last line of stdout and just look at that.

Me: Can you think of a way to structure your code so that we could just calculate the output of one single number without all the numbers before it?

Them: Uhhh...

Those are 2 examples out of many. I honest to god don’t know how some people managed to build their resumes while not knowing how to do the simplest thing in their field imaginable. Imagine you were interviewing cardiologists and a solid 1/3 of them had never heard of blood. What? How? How did you get to this point?

And that’s why I’d never hire someone until I’ve collected evidence that they personally can turn ideas into code. If you can’t wrap your brain around fizzbuzz, you’re gonna have a hell of a time when a customer gives you a change request.
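The structure I'm nudging them toward in that exchange is nothing exotic: a pure function per value, with printing kept separate, so any single input is trivially testable. A minimal sketch (the specific test value is just the one from the conversation above):

    def fizzbuzz(n):
        """Return the FizzBuzz output for a single number."""
        if n % 15 == 0:
            return "FizzBuzz"
        if n % 3 == 0:
            return "Fizz"
        if n % 5 == 0:
            return "Buzz"
        return str(n)

    # Testing one value and printing the whole sequence are now separate concerns.
    assert fizzbuzz(1_000_000_005) == "FizzBuzz"   # divisible by both 3 and 5
    for i in range(1, 101):
        print(fizzbuzz(i))

If a candidate gets there, even with hints, that's a pass in my book.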


It's worth pointing out that there are all kinds of reasons why competent people may perform poorly at a given interview.


For sure. I do whatever I can to help them feel comfortable and emphasize the conversational style of the question. I’m not a compiler. I’m a person who wants to talk with them. Yet it’s still a stressful situation for them.

It’s not a good look for them to completely freeze up on something as trivial as fizzbuzz though. If they can’t talk through the simplest of programs, I worry how they’d react when production servers are down or we’re being attacked.


[flagged]


Holy personal attack, Batman! Did he kick your puppy or something?


Nah just everything wrong with the industry in a single post :) That being: A Discord mod who barely knows HTML/CSS in a management role slingin code tests.


I truly don't understand how you got that from what I wrote. I want the interviewee to succeed, so I'm a jerk?


It's against federal law to IQ test employees, so they ask brain teasers that are IQ test questions thinly disguised as "coding questions."

It's not a perfect system, but it works better than the alternatives.


It is not in fact against federal law to IQ test employees.


> I have, once again, failed an interview

Can we stop calling it "failing" when a company decides you're not a good fit? If you go on a date and don't get asked for a 2nd, did you fail the date? You're not just what they're looking for right now; not everything has to be a failure.

> for every 1-hour interview where I evaluated if someone knew their data structures, I could have just taught them

I'm sorry, what? He thinks it takes one hour to teach someone data structures? We sure are wasting our time with those multiple college courses and hundreds of textbooks on the subject then. We should just get this guy to spend one hour imparting all necessary knowledge into everyone.

> it’s often said that what’s more important is how you solve the problem, not that you solve the problem, but in practice human biases will work against you if you don’t get close to it working.

Human biases are much more likely to work against you if the interviewers have not spent time trying to come up with a consistent test that they give everyone. Does the author think human bias is absent from non-technical interviews? The less standard your hiring methods are, the more bias you'll have.

> I’ll just keep practicing for interviews until I successfully trick someone into thinking that I know how to code and then secretely become one of the best employees they have ever had.

I don't know who Darren Kopp is, so maybe I'm about to get an army of replies saying he's their coding role model, but this is an article about whether code interviews work or not, and he's basically saying "if they don't hire me, who will definitely become one of the best employees they've ever had, then their coding interview process must suck". The other possibility is that Darren Kopp just isn't actually tremendously more stellar than everyone else and coding interviews are actually kinda working. For his argument to work, he has to be one of the best possible candidates for every job he's applying for. I just kinda doubt that.


> Can we stop calling it "failing" when a company decides you're not a good fit? If you go on a date and don't get asked for a 2nd, did you fail the date? You're not just what they're looking for right now; not everything has to be a failure.

If you wanted to get the job, and you didn't get it, you "failed".

There's no need to sugar coat it.


> Can we stop calling it "failing" when a company decides you're not a good fit? If you go on a date and don't get asked for a 2nd, did you fail the date? You're not just what they're looking for right now; not everything has to be a failure.

Fair point.

> I'm sorry, what? He thinks it takes one hour to teach someone data structures? We sure are wasting our time with those multiple college courses and hundreds of textbooks on the subject then. We should just get this guy to spend one hour imparting all necessary knowledge into everyone.

There's a difference between understanding how to implement one and just knowing how to use one that's already been written by someone and when to use it.

> Human biases are much more likely to work against you if the interviewers have not spent time trying to come up with a consistent test that they give everyone. Does the author think human bias is absent from non-technical interviews? The less standard your hiring methods are, the more bias you'll have.

Fair, but alas I don't know the rigors they have put their hiring process through. If I were a betting man, I'd still say that if we measured all processes using programming tests, the majority of successful candidates had working output by the end.

> I don't know who Darren Kopp is, so maybe I'm about to get an army of replies saying he's their coding role model

I've been told this by everyone who knows me

> but this is an article about whether code interviews work or not, and he's basically saying "if they don't hire me, who will definitely become one of the best employees they've ever had, then their coding interview process must suck". The other possibility is that Darren Kopp just isn't actually tremendously more stellar than everyone else and coding interviews are actually kinda working. For his argument to work, he has to be one of the best possible candidates for every job he's applying for. I just kinda doubt that.

Actually the only conclusion I have after writing the post is that I am not talented at interviewing. I do believe coding interviews have flaws for how they are used, but that doesn't necessarily make them wrong or right. I agree with your first statement, maybe I'm just not a good fit.

Perhaps both of us are correct as I consider coding interviews more of an audition than anything else. If Tom Cruise auditions for a part and doesn't get it, does that mean he's a bad actor? Likely not.

(I'm Tom Cruise here, btw).


I think this interview gets a bad rap on HN. Much like the post author, I’ve given hundreds of coding interviews to candidates and taken many of them myself, sometimes passing and sometimes failing.

My takeaway is that if the goal of the coding interview is to “identify people who can code”, then this approach has high precision and poor recall.

In other words, most of the people who pass the interview actually can code (at a basic level—I’m not talking about high-level system design skills here). Very few people slip through this interview who cannot code. However, because of the poor recall, there are many people who actually code very well that the interview misses (a fact widely derided on here).

From the perspective of a big tech company, this situation is perfectly fine. There are so many candidates interviewing for any given role that accidentally rejecting a few exceptional candidates is quite an acceptable trade-off in order to prevent a bad hire. Most of the people who cannot code and are aware of their inability to code, yet still knowingly apply for a position that requires the ability to code tend to be BSer types that attempt to make their way into management positions as quickly as possible (and there are many of these people targeting big tech companies). It’s not that their inability to code hurts the company—it’s that they would prefer not to code at all and go straight to the politicking. The coding interview is simply a nuisance obstacle in the way of this objective. These types will often state that they refuse to interview at any company that requires a coding interview, though I admit there are still many BSers who are skilled enough to pick up some coding ability for just long enough to pass the interview.

At a small tech company or at a company that’s not a “brand name”, this approach obviously doesn’t work as well, because these companies don’t have nearly the onslaught of BSers applying to their open SWE positions. In this case, it might make sense to forgo the coding interview in favor of other approaches.

I like the coding interview because despite its flaws, it actually assesses some technical ability, even if it does so with low recall. This is very much unlike the behavioral interviews, which I think are mostly nonsense and are prime opportunities for BSers or mercenaries to slip into a company. In my experience, the behavioral interviews tend to disproportionally filter out people from unusual or underrepresented backgrounds that aren’t the right “cultural fit”. The language and terminology used in the interview, the conversational style and mannerisms, and the topics discussed all typically provide much more signal into the candidate’s position in the class hierarchy than how that person would do at the job. People like to think they can assess someone well by having a brief 1 hour conversation with them, but I just haven’t found that to be the case, because there are too many perverse incentives and biases involved.


I love coding interviews. Lets me show off my ChatGPT skills and ability to filter out the bs that llm spits out, and make it into a clean and coherent response in real time.


It easily filters out senior engineers, by design, who cannot do leetcode as fast as new graduates.

other than that, I agree it's dumb, really dumb.


[flagged]


For those of us already putting in 10-14 hours per day at our day jobs, or who work in regulated industries, this is a nonstarter. My experience is that the people who are really good are kept pretty busy by their employer


I avoid the issue by asking for “a page of code you want to have a conversation about”. If needed, I clarify it can be code they wrote, code they use, or just code they are curious about. Failing to bring something, anything, is obviously a problem (and happens occasionally). The discussion quickly illuminates where the candidate is in their career.


I would love to be the candidate in this interview. Are you hiring? :-)


> 10-14 hours per day

Oh heck nah.


So you don’t interview anyone whose work is company proprietary?


nope. tech is so fast moving if you're not at least experimenting with your own code then you're not going to be a good fit. we tend to also hire engineers who have contributed to open source.


I experiment, of course. At work.

And if you don't let people experiment during work, but only hire people that do, then that's a bad sign.


I got irrationally angry about this at a previous job. Everyone was all excited management was having us all do a "hackathon" and I am loud and grumpy about how this should just be part of the normal course of business.

We found solutions and tools that better solve problems but to this day have still not been implemented.

Not everything needs a novel solution but not making room for innovation because the company is operating as a feature factory is boring.


Most code is not written with the latest technologies or techniques though.

Why would being up to date matter?


I think you just gave the answer and then the question.


I do not understand your reply.


Good to know there are hiring managers out there who think like this. Thanks for the reply.

May I ask what industry you work in?


I just want to know the company name so I don't accidentally ever apply there.



The company that demands cutting-edge experience is a Strava-for-skateboarders app?

I thought he was going to say that it was something like a foundational R&D lab or cybersecurity firm


Yeah sounds like the measure of "cutting edge experience" here is defined as "has recently thrashed around in latest trendy JS framework"


No kidding. I'd be happy to walk through the stuff I have on github but I also don't want my coworkers measured by whether they do the same.

If anything, personal projects can end up being distractions from focusing singularly on work. Even though I'm entirely in favour of them, I also remember when I had a newborn at home and a family member in hospital, and such projects were not at all feasible.


There's also the issue where the stuff I put on GitHub is hacky throwaway stuff, not representative of my professional standards for a "real" job. It'd be like interviewing a chef and judging their abilities based off what they made themselves for dinner last night rather than what they cook in a restaurant.


>tech is so fast moving

I recently met a programmer in his 50s, still working on COBOL. Sure, you aren't gonna hire him, but do you think he has any worries about his job?


I had to learn COBOL last year. Not hard if you have read lots of books from the 70s and 80s and have a strong background in algorithms. It is expensive and annoying to maintain these systems, but until there's a business case to replace the underlying application, he's safe.


>tech is so fast moving if you're not at least experimenting with your own code then you're not going to be a good fit

Huh? What kind of stuff ya do?


I very rarely see anyone try anything different in a 9-5 setting. People leave university, join their first job, and spend 3 years learning how to develop in one way, and only one way.

At-work choices are rarely choices, because you just do the next thing the same way as you did the last thing. If you step out of line, management will reel you back in. A 'big' change is switching from Spring to Micronaut (or vice versa).

Some examples: I have pretty strong opinions about ORMs vs SQL. It would be interesting to discuss that on the job with someone who knows both well. But the guy you'll be talking to won't have tried SQL (beyond that academic thing they learnt at uni).

Discussions about languages and types? "Java is good because it has types and JS doesn't" was the level of discourse I got at my first serious job.

Having side projects at least indicates that they can try things out first-hand, own their own feedback loop, experience some surprises, and not just mindlessly cargo-cult "best-practices".


Sounds like the types of environments I’ve run into at either big tech cos or when you’re the tech guy in a non-tech company.

If you want to be able to experiment and change things, go to a startup.


You're missing out on lots of amazing people who treat programming as work and have other hobbies. Or people in temporarily different situations (like young kids). Or people with more important things happening in their life. You're effectively filtering for mostly people in their 20s.

I mean, it's your choice, but your loss too.


Citation needed

All the good programmers I know have side projects and live and breathe code. None of the mediocre/bad ones I know do.

Think of it this way: what's likely to make you a better programmer: spending half (0.5x) of every day on (a work-mandated subset of) code or spending most (1x) of every day on code? The answer is obviously the latter.

And it sounds weird, but most of my programming knowledge comes from outside of work, even the knowledge that I apply at work. Maybe it's because work naturally discourages exploration since you're focused on the company's priorities. For example, I'm never implementing a collection in C at work (I use `std::vector` and stuff). Yet doing that in my own time taught me why certain operations invalidate iterators -- a thing senior coworkers of mine didn't understand and which helped us catch a bug. :D


Sure, but a good employee is not just a good programmer. You don't want an obsessive, you want someone who is stable and has other things going on in his life.


Tangentially related. I was trying to describe to a friend of mine how being discerning after a few years of experience means I do less work (or toil) for a better outcome, and why it's worth the higher salary I command.

Young me would've enjoyed the process of building a thing, ignoring problems like maintenance burden, how fragile or brittle the new tool is, lessons learned from the old tool, etc. etc. (you know how it goes)

Current me will take a task and mull on it. No pen to paper for _days_, preferring a minor adaptation to the existing process, or discovering that the desire was misguided in the first place. The work not done and wisdom to not do it is worth its weight in gold.


Being really smart and inquisitive for 30 years... I've been lucky that people paid me a lot of money to do whatever it took to make lots of different products work, at levels from extremely low to high. Just because people don't post their work to public archives doesn't mean they aren't working and learning all day. So this filter tends to catch people who are at second-tier employers and are a bit bored.


> You're effectively filtering for mostly people in their 20s.

Or people who used to be in their 20s.


You also implicitly filter out people who were in their 20s when Github et al didn't exist.


Have you ever considered that people who have a single-minded focus on software might be solely good at software, to the detriment of other skills?

8 hours of software development is more than enough practice for a developer. Having an employee with a well balanced life is superior to one who is unable to detach from his work.


Seconded! I like how this checks for both actual work quality and cultural fit in one step.


I think I only know a single programmer who does side projects. Work gives me time to look at new stuff and pays for courses.

I mean I do have side projects, but those are all in a language I'm not specialized in and often single class small helper programs.


I'm often the only one when I work in smaller companies.

And the best developer I ever worked with had zero public repositories on GitHub. Now that gave me some perspective.


[flagged]


Others prefer to have hobbies outside of coding, and that's fine.


Yup, I used to do side stuff but now I mostly tinker with servos, 3d printing, and mechanic work on my car. I can only ingest so much of the same shit


The good programmers I know have hobbies outside of coding too (gardening, retro computing, skiing), but coding is still one of them. If it's none of them, that's a red flag because it's never the case for any of the good ones I know.


You’re missing out on all the people who channel that passion into doing their job so they don’t feel the need for side projects


I'd certainly rather hire that person than someone who pours all their passion into side projects and none of it into their work.


Sounds like a skill issue, mate, just sayin. Might want to grind leetcode and do some mock interviews to build those skills up.

Look, I know a lot of the job application process is bullshit. But much of the job is bullshit too. You still have to do one kind of BS in order to land the job and the other kind in order to function as part of a team in an organization. There's really no way around that.


Bad advice imo. Code tests became popular in like 2016, were hated by 2020, and are now basically out of fashion.

Also, Leetcode doesn't cover: Async programming, APIs, databases, UI, git, tests, design systems, any known framework or library, how to persist data, how to set up a web server, handle requests, serve a page, write CSS, SQL, use devops/cloud platforms, or basically anything an engineer does everyday. It's basically a sport that most people don't like.


DO YOU WANT THE JOB? Yes or no?

If yes, then you have to jump through whatever hoops the company makes you jump through. Otherwise, you can pass on that company... but most companies are still using whiteboard exercises and coding tests to gatekeep the hiring process and restrict hires to those who can... actually code, so if being made to do leetcode and whiteboard exercises under pressure in an interview environment is a hard pass for you, you will find your employment opportunities profoundly limited. Maybe you are in the right Silicon Valley circles, I dunno, maybe you hobnob with all the right people who can get you a position on your own recognizance because you met at RustConf or whatever. If so, your situation may be typical for a Hackernews but it is not typical for the employment market at large, even in software dev. Most of us have to jump through those hoops, or join the unemployment line.

You're right though -- Leetcode doesn't cover any of those things and neither does fizzbuzz. What it does do is establish a minimum threshold -- can this person solve a basic problem by writing a program? Can they think it through and provide a solution? If they can, then that's an encouraging sign that their prior experience is legit and they were probably exposed to all the other stuff on a previous worksite.

And yes, code tests were "hated" by 2020 -- by devs with blogs that bubble to the front page of Hackernews, but companies still employ them. The new hotness is AI prescreening: you submit video responses to questions and complete an auto-proctored (YC W'21) coding exercise, all of which is evaluated by ChatGippity. If computer says no, you don't make it to the round of live interviews. Devs hate all of this -- but management loves it because it's a cheap filter for whoever is willing to do whatever it takes to land the role. They have far more applicants than they do open roles, so whoever applies has to WORK to make the cut. What devs like or hate doesn't matter -- if you want the job, you comply with whatever policies and procedures the company has set in place for the application process.

Anyway, how do you propose it be done? How do you suggest companies screen for fraudulent applicants who can't code their way out of a soap bubble? Because they're out there... I've seen it. I've been part of the hiring process myself and seen with my own two eyes some of the stunts people will pull to land a job they are far, far from qualified for.


> restricted those who can... actually code

Haha I'm sure you'd love to believe it - let me ask you this: Why does AI easily solve all the Leetcode problems but still can't design a decent web page? API? Auth system?

Leetcode doesn't cover 99% of what a developer does - it's like taking fencing courses and claiming to be a military expert. Leetcode skips SQL, REST/HTTP, OOP and framework patterns, frameworks themselves (React etc.), serving a page, tests, working with files, doesn't even cover basic async functionality (core to writing high quality code), git, ORMs and NoSQL, doesn't even touch devops or UI. It's "add the numbers in this array" type problems - memorization techniques like math problems and other abstract academia.

I prefer people who build products you can see and use, and have other evidence of their work online like libraries they author or contribute to.

Btw, Leetcode is pretty out of fashion at this point for interviews.


Coding interviews are smart!

Use coding interviews to tell corporations that the Queer Autonomous Qommunist Revolution (QAQR) is coming for them.

The mascot of QAQR is a rubber ducky that we use to question the need for employment and bosses. It seizes all corporate source codes, and turns them into Open Source. It strikes fear and rebellion.

Use coding interviews to spread QAQR propaganda and memes! This is very human!



