Coding interviews are stupid (ish) (darrenkopp.com)
260 points by darrenkopp 39 days ago | 651 comments



Assuming that a company does not look for candidates who are naturally good at ICPC-style questions, or geniuses who can come up with amazing algorithms in a matter of minutes, there is actually a different way to do a coding interview: just give a high-level description of a sophisticated enough algorithm to the candidate and ask them to turn it into code. I think it strikes a good balance between CS depth and coding ability. This type of interview is similar to what engineers do at work too: we rarely invent new algorithms, but we do read white papers or blog entries or books to pick an algorithm to implement.

There are many variations in questions too: searching a tree or graph with some customized criteria, using a double buffer to implement a tree-style prefix scan, implementing integer multiplication with unlimited digits, some streaming algorithm, tree-walking to transform one tree into another, a simplified skip list; the options are unlimited. A good candidate tends to grasp the high-level concepts quickly (and they can ask questions), and is quick to convert intuition into working code. I find there is a strong positive correlation between performance at work and performance in such coding interviews.
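For concreteness, a minimal sketch of the first variant ("search a tree with some customized criteria"); the `Node` class and names here are made up for illustration, not from any real interview:

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Node:
    value: int
    left: "Optional[Node]" = None
    right: "Optional[Node]" = None

def collect(root: Optional[Node], pred: Callable[[int], bool]) -> list[int]:
    """Depth-first walk collecting values that satisfy a custom predicate."""
    if root is None:
        return []
    out, stack = [], [root]
    while stack:
        node = stack.pop()
        if pred(node.value):
            out.append(node.value)
        # Push right before left so the left subtree is visited first.
        if node.right:
            stack.append(node.right)
        if node.left:
            stack.append(node.left)
    return out

tree = Node(5, Node(3, Node(1)), Node(8, None, Node(10)))
collect(tree, lambda v: v % 2 == 0)  # keep only even values
```

The point isn't the specific traversal; it's whether the candidate can turn "walk this structure, keep what matches" into working code without hand-holding.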


I still don't get why such questions are even asked, as most jobs I've ever had never even remotely touched them, and I've worked across quite a few industries, technologies, and types of companies.

To me, the value of a software engineer is to ask questions, make hypotheses and be able to iterate quickly. Balancing trees, leetcode and other algorithmic stuff on the spot sounds like bringing the dreadful education system structure to the real world.

Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.


> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

This is why I've always been so confused. Why is the software engineering interview wildly different from the traditional engineering interview, where seniors sit down with candidates and discuss how to solve a relevant proxy for a problem the team is currently facing? (That has the side benefit of making the interview process potentially fruitful even if you don't go with that candidate, though this can be, and sometimes is, abused.) I mean... we all speak the same language, and it isn't standard English... right?


> Why is the software engineering interview wildly different from the traditional engineering interview

I have my personal theory.

1) Top companies receive far more applications than they have positions open. Thus they standardised around very technical interviews as a way to eliminate false positives. I think these companies know this method produces several false negatives, but the ratio between the two (eliminating candidates who wouldn't make it vs. missing out on great candidates) is wide enough that it's fine. It does lead to some absurd results, though (such as a person interviewing to maintain a library being deemed unqualified despite being its very author).

2) Most of these top companies grew at such rates, hiring so aggressively from top colleges, that eventually the interview process was built by relatively fresh grads for other fresh grads.

3) Many companies thought that replicating the brilliant results of these unicorns meant copying them. So you get OKR nonsense and interviews like these.


Yup. And 3) is particularly interesting. Lots of companies actually need to hire people who can get things done and who can build user-friendly software, yet they thought they needed to hire people who could turn any O(N^2) algorithm into O(N) or O(N log N).
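For what it's worth, the canonical transformation those interviews prize looks something like this (a toy example of my own, not anyone's production code): checking whether any two numbers in a list sum to a target, first the quadratic way, then in one pass with a set:

```python
def has_pair_quadratic(nums: list[int], target: int) -> bool:
    """O(N^2): try every pair."""
    return any(nums[i] + nums[j] == target
               for i in range(len(nums))
               for j in range(i + 1, len(nums)))

def has_pair_linear(nums: list[int], target: int) -> bool:
    """O(N): remember what we've seen and look up the complement."""
    seen: set[int] = set()
    for n in nums:
        if target - n in seen:
            return True
        seen.add(n)
    return False
```

Useful to know, sure; but whether this skill is what most products actually need is exactly the question.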

And even for Google, leetcode has become noise because people simply cram it. When Microsoft started using leetcode-style interviews, there were no interview-prep sites, and later there was at most Cracking the Coding Interview. So the people who aced the interviews were either naturally talented or so geeky that they devoured math and puzzle books. Unfortunately, we have lost such signals nowadays.


> yet they thought they needed to hire people who could turn any O(N^2) algorithm into O(N) or O(N log N)

And the great irony is that most software is slow as shit and resource intensive. Knowing worst-case performance is good, but what about the mean? Or what you actually expect users to be doing? These can completely change the desired algorithm.

But then there's the long-running joke: "10 years of hardware advancements have been completely undone by 10 years of advancements in software."

Because people now rely on the hardware rather than trying to make the software more optimal. It amazes me that even gaming companies do this! The root of the issue is pushing things out quickly, so a lot of software is really just a Lovecraftian monster made of spaghetti and duct tape. And for what?

Apple released the M4 today, and who's going to use that power? Why did it take Apple years to develop a fucking PDF reader that I can edit documents in? Why is it still a pain to open a PDF on my MacBook and edit it on my iPad? It constantly fails and is unreliable, disconnecting despite the devices being <2ft from one another. Why can't I use an iPad Pro as my glorified SSH machine? That's why I have a laptop: so I can log in to another machine and code there. The other things I need are LaTeX, Word, and a browser.

I know I'm ranting a bit, but I feel like we in computer science have lost the hacker mentality that made the field so great in the first place (and that brought about so many innovations). It just feels like there's too much momentum now and no one is __allowed__ to innovate.

To bring it back to interviewing signals, I do think the rant relates. This same degradation makes it harder to recognize the in-group when there's so much pressure to sound like a textbook. But I guess this is why so many ML enthusiasts compare LLMs to humans: because we want humans to be machines.


Many software programs fail to achieve ultimate efficiency either because the software engineers are unable to do so, or because external factors prevent them from achieving it. I believe that in most cases, it is the latter.


I'd like to think it's the latter, because it makes us look better, but I've seen how a lot of people code... I mean, GPT doesn't produce shit code just because it can't reason; it'll only ever be as good as the data it was trained on. I teach, and boy, can I tell you that people do not sit down and take the time to learn. I guess this is inevitable when there's so much money involved. But this makes the interview easy, since passion is clear. I can take someone passionate and make them better than me, but I can't make someone who's in it for the money even okay. You're hiring someone long term, so I'd rather have someone who's always going to grow than someone who will stay static, even if the former is initially worse.

IME the most underrated optimization tool is the delete command. People don't realize that it's something you should frequently do. Delete a function, file, or even a code base. Some things just need to be rewritten. Hell, most things I write are written several times. You do it for an essay or any writing, why is code different?

Yeah, we have "move fast and break things" but we also have "clean up, everybody do their share." If your manager is pushing you around, ignore them. Manage your manager. You clean your room don't you? If most people's code was a house it'd be infested with termites and mold. It's not healthy. It wants to die. Stop trying to resuscitate it and let it die. Give birth to something new and more beautiful.

In part I think managers are to blame because they don't have a good understanding, but engineers are also to blame for enabling the behavior and not managing their managers (you need each other, but they need you more).

I'll even note that we jump into huge code bases all the time, especially when starting out. Rewriting is a great way to learn that code! (Be careful pushing upstream though and make sure you communicate!!!) Even if you never push it's often faster in the long run. Sure, you can duct tape shit together but patch work is patch work, not a long term solution (or even moderate).

And dear God, open source developers, take your issues seriously. I know there's a lot of dumb ones, but a lot of people are trying to help and wanting to contribute. Every issue isn't a mark of failure, it's a mark of success because people are using your work. If they're having a hard time understanding the documentation, that's okay, your docs can be improved. If they want to do something your program can't, that's okay and you can admit that and even ask for help (don't fucking tell them it does and move on. No one's code is perfect, and your ego is getting in the way of your ego. You think you're so smart you're preventing yourself from proving how smart you are or getting smarter!). Close stale likely resolved issues (with a message like "reopen if you still have issues") but dear god, don't just respond and close an issue right away. Your users aren't door to door salesmen or Jehovah's Witnesses. A little kindness goes a long way.


> And the great irony is that most software is slow as shit and resource intensive

You really need those 100x faster algorithms when everything is a web or Electron app.


I’d add another factor to #1: this feels objective and unbiased. That’s at least partially true compared with other approaches like the nebulous “culture fit”, but that impression is partly a blind spot: the people working there are almost certainly the type who do well with that style, and it can be hard to recognize that other people are uncomfortable with something you find natural.


I would say that it makes the interview process more consistent and documented, and less subject to individual bias. However there's definitely going to be some bias at the institutional level considering that some people are just not good at certain types of interview questions. Algorithm and data structures questions favor people who recently graduated or are good at studying. Behavioral interviews favor people who are good at telling stories. Etc.


Yes, to be clear I’m not saying it’s terrible - only that it’s not as objective as people who like it tend to think. In addition to the bias you mentioned, the big skew is that it selects for people who do well on that kind of question in an unusual environment under stress, which is rarely representative of the actual job. That’s survivable for Google – although their lost decade suggests they shouldn’t be happy with it – but it can be really bad for smaller companies without their inertia.


Yeah I buy this theory.

The problem I have with it is that for this to be a reasonably effective strategy you should change the arbitrary metric every few years because otherwise it is likely to be hacked and has the potential to turn into a negative signal rather than positive. Essentially your false positives can dominate by "studying to the test" rather than "studying".

I'd say the same is true for college admissions too... because let's be honest, I highly doubt a randomly selected high school student is going to be significantly more or less successful than the current process. I'd imagine the simple act of applying is a strong enough natural filter to make this hypothesis much stronger (in practice, but see my prior argument)

People (and machines) are just fucking good at metric hacking. We're all familiar with Goodhart's Law, right?


I think (but cannot prove) that along the way, it was decided to explicitly measure ability to 'study to the test'. My theory goes that certain trendsetting companies decided that ability to 'grind at arbitrary technical thing' measures on-job adaptability. And then many other companies followed suit as a cargo cult thing.

If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation? Surely the skill of programming ability a) varies over an employee's tenure at a firm and b) is a strong predictor of employee impact over the near term. So I surmise that such companies don't believe this, and that therefore LeetCode serves some other purpose, in some semi-deliberate way.


I do code interviews because most candidates cannot declare a class or variable in a programming language of their choice.

I give a very basic business problem with no connection to any useful algorithm, and explicitly state that there are no gotchas: we know all the inputs, and here’s what they are.

Almost everyone fails this interview, because somehow there are a lot of smooth tech talkers who couldn’t program to save their lives.


I think I have a much lazier explanation. Leetcode-style questions were a good way to test expertise in the past, but once everyone follows suit, the test becomes ineffective. What's the saying? When everyone is talking about a stock, it's time to sell. Same thing.


> If it were otherwise, and those trendsetting companies actually believed LeetCode tested programming ability, then why isn't LeetCode used in ongoing employee evaluation?

Probably recent job performance is a stronger predictor of near future job performance.


So, having done interviews: just because the latter may be more common does not mean the hordes of people throwing made-up spaghetti resumes at the wall have gone away. Our industry has a great strength in that you don't need official credentialing to show that you can do something. At the same time, it is hard to verify what people say in their resumes; they might be lying in the worst case, but sometimes they legitimately believe they are at the level they are interviewing for. It was bad before the interest rate hikes; I cannot imagine the situation now that hiring has significantly slowed and a lot more people are fighting for fewer jobs.

I did interviews for a senior engineer role and had people fail to find the second biggest number in a list, in a programming language of their own choosing. It had a depressingly high failure rate.
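For the record, the expected shape of an answer is a single-pass scan, sketched here in Python (candidates could of course use any language; edge-case handling is my own assumption):

```python
def second_biggest(nums: list[int]) -> int:
    """Largest and runner-up in one pass; assumes len(nums) >= 2."""
    first = second = float("-inf")
    for n in nums:
        if n > first:
            first, second = n, first
        elif first > n > second:
            second = n
    return second
```

A dozen lines, no tricks; which is what makes the failure rate depressing.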


I had a candidate claiming over ten years of experience who couldn’t sum an array of ints in any language of his choosing.

This wasn’t an off-by-one error or unhandled overflow; he couldn’t get started at all.


Ten years of experience at one of those places where every keystroke outside powerpoint is offshored. Why would they know how to sum ints? Some people do start their careers as what could best be described as software architecture assistants. They never touched a brick in their lives, to go with the architecture image.


I have junior and senior students who struggle with fizzbuzz... But damn, are they not allowed to even do a lazy, inefficient `sorted(mylist)[-2]` if they forgot about for loops? That's the most efficient in terms of number of characters, right? Haha.
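(For anyone who hasn't seen it, fizzbuzz in its entirety is about this much code; a sketch in Python, returning a list so it's easy to check:)

```python
def fizzbuzz(n: int) -> list[str]:
    """Classic fizzbuzz for 1..n: multiples of 3 -> Fizz, 5 -> Buzz, both -> FizzBuzz."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print("\n".join(fizzbuzz(15)))
```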

But I still think you can reasonably weed these people out without these whiteboard problems. For exactly the same reasons engineers and scientists can. And let's be honest, for the most part, your resume should really be GitHub. I know so much more about a person by walking through their GitHub than by their resume.


Using GitHub is discriminatory against people who don’t code on the weekends outside of their jobs, and most people’s job related code would be under NDA and not postable on Github.

To be a capital E Engineer you have to pass a licensing exam. This filter obviously is not going to catch everything but it does raise the bar a little bit.

—-

As far as the root question goes, they are allowed to propose that, and then i can try and tease out of them why they think that is the best and if something is better. But you would be surprised at the creative ways people manage to not iterate through a full loop once.


You're right. But a lot of people that are good coders code for fun. But you're also right that not all those people push their code into public repositories. The same is true for mechanical engineers. They're almost always makers. Fixing stuff at home or doing projects for fun. Not always, but there's a strong correlation.

But getting people to explain projects they did and challenges they faced can still be done. We do it with people who have worked on classified stuff all the time. If you're an expert it's hard for people to bullshit you about your domain expertise. Leet code is no different. It doesn't test if you really know the stuff, it tests how well you can memorize and do work that is marginally beneficial in order to make your boss happy. Maybe that's what you want. But it won't get you the best engineers.


Leet code, in the interviews that I do, is not the only thing I do.

But when I am asked to do a one hour part of an interview for four interview loops a week, all the preps and debriefings, and also do all my normal day-to-day deliverables, we need some fast filters for the obvious bullshitters. The interviewing volume is very high and there is a lot of noise.


Hiring people who code at all is discrimination against people who played video games instead of learning to code.


“Codes on their spare time” is not part of the job description, but “codes at all” is.

There are plenty of reasons not to code on spare time. If anything the people who are most likely to do that are often also the people who coding interviews are supposed to be privileging, fresh single college grads.

I don’t know how people would square the statements “take-home assignments are unpaid labor and unfair to people with time commitments” and then do a 180 and say “people should have an up-to-date fresh github that they work on in their spare time.”


If it would take the candidate "spending every waking moment of their lives coding" to have one or two small coding projects after a half decade plus in the field, that's a signal.

If you went to college but never made anything, that's a signal.

If you didn't go to college, and never made anything, just shut up and make something.


In a half decade plus some people pick up other commitments that are not side projects, like a pet, a child, sports, hiking, etc.

At the end of the day, it isn’t really relevant to the employer what is done in spare time off the job when they get hired, so it’s not like I should privilege their spare time projects over people who don’t do that, particularly if people don’t want to code off the clock. There are plenty of good engineers who do not code off the clock, and there are plenty of bad engineers who do.

Also, more often than not, coding off the clock for the sake of having a portfolio is not really indicative of anything. There aren’t, for example, review processes or anything like that in a one person side project, and unless I spend significantly more hours background checking who’s to say the side project is not plagiarized? People already lie on their resumes today.


In the time you took writing this comment you could've gotten the repo created and the title and description filled out. Writing text in a public readme.md would serve you better than sending it to me.


I have side projects, but I don't expect every candidate to, nor do I expect every candidate to be a religious reader of every HN comment thread.


I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

I think a side project opens up the opportunity to skip that for a project presentation. This is a lot more in line with real life as you would typically code and later present that work to others. You would defend it to some degree, why you made choice A vs choice B. If you created it, you'll be able to do that.

Doesn't need to be a huge thing. Just show you can do anything at all really at the junior level. Intermediates can show mastery in a framework with something with slightly more complexity than a "hello world".


> I'm not saying it should be mandatory, but they would have to show mastery some other way. Whiteboard? Live coding? Project?

godelski said your resume should really be GitHub. You could have said this instead of sarcasm.


Typically the people without side projects also make excuses to not do those either.

If I had a company I'd offer looking over an existing project or a project where you create a side project of your choice without any further direction.

So not mandatory but the easiest way to go probably. Once you apply to my company you'll have one to show for next time at least.

(If you want to write the project out on the whiteboard instead I guess feel free, that seems hard though.)


Many people do not have side projects. Few people working as software engineers were not tested in some way.

I think it's more useful and more fair to give candidates some direction when I request something. What scope is enough? What tests are enough? We define side project differently or you would expect much more time from candidates than I do.


> What scope is enough? What tests are enough?

How each candidate answers each question themselves tells you about them.


I used to think so. But real tasks have acceptance criteria. Seeing how candidates work with loose criteria has told me more than telling them in effect to read my mind.


It's so clear that people on this site have no friends, family, or hobbies. I also don't think you realize how ableist your comment is.

Some people have more trouble completing tasks for reasons that have nothing to do with talent or intelligence.


Which means that 1) is the right answer.


> Why is the software engineering interview wildly different from the traditional engineering interview

One angle is that SWE is one of the very few professions where you don't need a formal degree to have a career. It's also a common hobby among a sizable population.

I think this is truly great. A holdout breathing hole where people can have lucrative careers without convincing and paying off a ton of gatekeepers!

But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

In our industry, you kinda have to start from scratch with every person.


> But I also think that when you hire in other industries, you can get much more mileage from looking at the candidate's formal degrees and certifications.

> In our industry, you kinda have to start from scratch with every person.

Not really - in software, people leave a bigger and more easily traceable track record than in any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to. A lot of it is directly visible on the Internet. In other engineering fields, you have to trust what the applicant says in his or her resume, and at most you can call the previous companies he or she worked at for references. In software, a lot of the trail is online or easy to verify, and you can still call.

Even for totally new graduates, it is better in software: it's much easier for a software undergrad to work part-time, or on a hobby project, or contribute to open source and produce something before they graduate, so you can assess their skills. It's much harder for a mechanical or civil engineer to do that, so there you have to rely solely on the relevant university/college and the candidate's grades.


> Not really - in software, people leave a bigger and more easily traceable track record than in any other engineering field. From previous work projects/experience to open source projects/experience, from personal projects to the communities a person belongs to.

That only applies to software people who either (a) are getting paid to work on open source or (b) have enough spare time to work on open source as a hobby after hours. Option (b), in particular, usually implies having no children or other familial responsibilities.


And (b) also implies that programming is a hobby. For a lot of people it is just their work. We cannot filter on that, as not many would be left to hire.


Nnno. You start from their experience and give the benefit of the doubt. As someone who’s been in software for 12 years, I don’t want to talk about writing algorithms. I want to talk about how to motivate 150 engineers to actually write code correctly or inspire a technical initiative.


To have good comparison/calibration between candidates, you should be asking the same question each time, so it can't be about the "problem the team is currently undergoing", because that's going to be something different every week/month.

In general however, of course, there is/should be a round of interview that covers architecture/system design. It's just that the coding interview is a different interview type, which gives a different kind of signal, which is still important. It doesn't replace architecture interview, it complements it.


> because that's going to be something different every week/month.

Why's that a problem? What you're going to be doing on the job is going to change at the exact same rate. But people also tend to talk about recent problems and those may be even a month old. Honestly, the questions are about seeing how the person would approach it. It is not about solving them, because you're going to be doing things you don't know the answers to beforehand anyways.

> It's just that the coding interview is a different interview type

For what reason? "Because"?


> Why's that a problem?

The first half of the sentence you're responding to answers this question already. Because you can't compare candidates fairly if you ask everyone a different question. Is a candidate who aced an easy question better or worse than a candidate who struggled with a difficult question?

> For what reason? "Because"?

What are you asking? Why is an interview where you ask about high level design different from an interview where you ask to write code? Isn't that like asking why an apple is different from an orange? They just are, by definition.


Mechanical engineering interviews seem to do the same as software: "Engineers always ask about beam bending, stress strain curves, and conservation of work. Know the theory and any technical questions are easy."

Basically an equivalent of simple algorithmic questions. Not "real" because it's impossible to share enough context of a real problem in an interview to make it practical. Short, testing principles, but most importantly basic thinking and problem solving facilities.


> Mechanical engineering interviews seem to do the same as software:

I've been an engineer in the past (physics undergrad -> aerospace job -> grad school/ml). I have never seen or heard of an engineer being expected to solve math equations on a whiteboard during an interview. It is expected that you already know these things. Honestly, it is expected that you have a reference to these equations and you'll have memorized what you do most.

As an example, I got a call when I was finishing my undergrad for a job from Raytheon. I was supposedly the only undergrad being interviewed but first interview was a phone interview. I got asked an optics question and I said to the interviewer "you mind if I grab my book? I have it right next to me and I bookmarked that equation thinking you might ask and I'm blanking on the coefficients (explain form of equation while opening book)". He was super cool with that and at the end of the interview said I was on his short list.

I see no problem with this method. We live in the age of the internet. You shouldn't be memorizing a bunch of stuff purposefully, you should be memorizing by accident (aka through routine usage). You should know the abstractions and core concepts but the details are not worth knowing off the top of your head (obviously you should have known at some point) unless you are actively using them.


I've had a coding interview (screen share, not whiteboard) fail where the main criticism was that one routine detail I took a while to get right could have been googled faster. In hindsight I still doubt that, given all the semi-related tangents you end up following from Google, but that was their expectation: look up the right piece of example code and recognize the missing bit (or get it right immediately).

For a proper engineering question (as in not software), I'd expect the expected answer to be naming the reference book where you'd look up the formula. Last thing you want is someone overconfident in their from memory version of physics.


> Last thing you want is someone overconfident in their from memory version of physics.

Honestly, having been in both worlds, there's not too much of a difference. Physics is harder, but with coding you have more things to juggle in your brain. So I really don't think it's an issue to offload infrequent "equations"[0] to a book/Google/whatever.

[0] And equations could be taken out of quotes considering that math and code are the same thing.


I had a senior engineer chastise me once for NOT using the lookup tables.

"How do you know your memory was infallible at that moment? Would you stake other people's lives on that memory?"

So what you did on that phone interview was probably the biggest green-flag they'd seen all day.


We live in the age of ChatGPT. It might actually be time to assess how candidates use it during interviews. What prompts they write, how they refine their prompts, how they use the answers, whether they take them at face value, etc.


Sure, and we live in the age of calculators. Just because we have calculators doesn't mean we should ban them on math tests. It means you adapt and test for the more important stuff. You remove the rote mundane aspect and focus on the abstract and nuance.

You still can't get GPT to understand and give nuanced responses without significant prompt engineering (usually requiring someone who understands the nuance of the specific problem). So... I'm not concerned. If you're getting GPT to pass your interviews, then you should change your interviews. LLMs are useful tools, but compression machines aren't nuanced thinking machines, even if they can masquerade as such in fun examples.

Essentially ask yourself this: why in my example was the engineer not only okay with me grabbing my book but happy? Understand that and you'll understand my point.

Edit: I see you're the founder of Archipelago AI. I happen to be an ML researcher. We both know that there's lots of snakeoil in this field. Are you telling me you can't frequently sniff that out? Rabbit? Devon? Humane Pin? I have receipts for calling several of these out at launch. (I haven't looked more than your profile, should I look at your company?)


I'm actually not talking about interviewees (ab)using ChatGPT to pass interviews and interviewers trying to catch that or work around that. I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

> I see you're the founder of Archipelago AI.

I don't know where you got that from, but I'm not.


> I'm talking about testing candidates' use of ChatGPT as one of the skills they have.

The same way? I guess I'm confused why this is any different. You ask them? Assuming you have expertise in this, then you do that. If you don't, you ask them to maybe demonstrate it and try to listen while they explain. I'll give you a strong hint here: people that know their shit talk about nuance. They might be shy and not give it to you right away or might think they're "showing off" or something else, but it is not too hard to get experts to excitedly talk about things they're experts in. Look for that.

> I don't know where you got that from, but I'm not.

Oops, somehow I clicked esafak's profile instead. My bad


You might as well ask how they use book libraries and web search.


I'm a chemist by education, so all my college friends are chemists.

Being asked a theoretical chemistry question at a job interview would be...odd.

You can be asked about your proficiency with some lab equipment, your experience with various procedures and what not.

But the very thought of being asked theoretical questions is beyond ridiculous.


Why, don't they get imposters? You sure run into people who can't code in coding interviews.


Because to be a chemist you need to graduate in chemistry.

What would be the point of asking theoretical questions?

There's just no way in hell people can remember even 10% of what they studied in college. Book knowledge isn't really the goal; rather, the point is to teach you how to learn and master the topics.


Because to actually have those types of conversations you have to have legitimate experience. To be a bit flippant, here's a relevant xkcd[0]. To be less so, "in groups" are pretty good at detecting others in their groups. I mean, can't you talk with someone about <insert anything where you have domain expertise, including hobbies> and figure out whether they're also a domain expert? It's because people "in-group" understand the nuance of the subject matter.

[0] https://xkcd.com/451/


Doesn’t that comic more closely hew to the idea that some fields are complete bullshit?


That's one interpretation. But that interpretation is still dependent upon intra-group recognition. The joke relies on the intra-group recognition __being__ the act of bullshitting.


Hmm… I have a twist on this. Chemistry is a really big field.

My degree is in computational/theoretical chemistry. Even before I went into software engineering, it would have been really odd for me to be asked questions about wet chemistry.

Admittedly it would have been odd to be quizzed on theory out of the blue as well.

What would not have been odd was to give a job talk and be asked questions based on that talk; in my case this would have included aspects of theory relevant to the simulation work and analysis I presented.


And software and computing isn’t a big field? Ever heard of EE?


Half a dozen years ago in a conference talk, Joel Spolsky claimed credit for inventing these sorts of whiteboard interviews (with his Guerilla Guide to Interviewing), and that it had broken software engineering hiring.

https://thenewstack.io/joel-spolsky-on-stack-overflow-inclus...


FTA:

“I think you need a better system, and I think it’s probably going to be more like an apprenticeship or an internship, where you bring people on with a much easier filter at the beginning. And you hire them kind of on an experimental basis or on a training basis, and then you have to sort of see what they can do in the first month or two.”

Well, if he fucked it up, I don’t see any reason why his ideas can’t also fix it.


Unfortunately this only works for interns and new grads. Nobody experienced wants to take a job on an experimental basis.


Fortunately, people with experience have resumes, and it's easier to tell if they're BSing them.

Fuck man, people do this with engineers who work on classified projects. You all are over thinking it. You're trying to hyper optimize for a function that is incredibly noisy and does not have optimal solutions.


And how would it scale to a number of candidates greater than one? A classroom full of competing peers? That's what talent shows are for.


Aren't probationary periods pretty standard, in many/all industries and countries too not just software?


Yes and such a system makes hiring so much easier because mistakes cost much less. But the US ties things like healthcare to employment so a company that has a reputation for firing people after hiring them (however legitimate) would probably be one people would avoid. In Sweden, for example, I’ve found interviews so much more reasonable. Then again, I had healthcare there regardless of employment.


Oh, I see. I'm in the UK, so more like Sweden; no experience of having healthcare tied to employment (other than optional private healthcare as a perk).


> Why is the software engineering interview wildly different from the traditional engineering interview

I am guessing here, but wouldn't a candidate for a traditional engineering role normally hold a college degree in a relevant field, so that part of quality assurance is expected to have been done by the college?


Candidates for software engineering roles normally hold a degree in a relevant field.


> if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Being able to evaluate a person is a difficult soft skill to learn. An interviewer cannot learn or improve it overnight, or even over months or years. This is basically being good at reading a person. Not to mention the issue of bias, which is highly subjective.

If an interviewer isn't good at this, the solution would still be to supplement your evaluation with a coding interview.


I've only ever had a single whiteboard interview in my career, and it was a single interviewer who preferred them (I accepted the job), but I have also walked through the backdoor via recommendations for all but 1 of my employers in ~20 years in the industry. From embedded in radio and television broadcasting, to medical robotics, to AAA games, with some excursions into web development. Every other interview at a company I accepted an offer for was a conversation with engineers about my experience and some hypotheticals.


If you talk shop with a mechanic, they're going to know pretty quickly if you actually know what you're talking about. In my experience, the same applies in our field.


> I still don't get why such questions are even asked

The thesis is not that these exercises are representative of work but rather predictive of performance.

Sales has a similar phenomenon with sports. While there is no athleticism involved in selling, many believe history in competitive sports to be a positive predictor of sales success.

---

You can reasonably argue whether leetcode accomplishes this well or poorly, but...

Always remember that the purpose of an interview (for an employer) is to predict performance. So you are looking for the resume screening+interview process that most accurately assesses that.


> Sales has a similar phenomenon with sports. While there is no athleticism involved in selling, many believe history in competitive sports to be a positive predictor of sales success.

This is interesting. I had never heard this before. Is there any research on this? For that matter, is there any actual research showing that leetcode-style interviews actually do predict performance? If so, do they do so any better or differently than an IQ test?


I recall reading that leetcode-style questions are basically just a stand-in for an IQ test, except with plausible deniability for being directly applicable to the job.


I saw this claim. I never saw data.


So you're saying that someone wasting their time studying leetcode to pass a stupid game is a good indicator?

I would almost believe the opposite: if you actually pass those tests with flying colours, it shows me that you believe you needed to do that to be hired, while someone who's actually experienced would never in a million years step down so low.


I believe skill at “leetcode problems” is predictive of general programming skill. Someone who can solve leetcode problems can almost certainly learn css. But, clearly from reading comments here, not the other way around.

Personally I love leetcode style problems. They’re fun. And useful - I use this stuff in my job constantly.


I would be scared you are overengineering and optimising things though. I have seen people implementing complex paradigms and weird optimisations instead of writing simple code just to make sure they are perfectly optimising.

E.g. optimising client-side code where N is likely never to be above 300: instead of a few simple lines, they write a complicated solution to make sure it can scale to billions and beyond.

I would take any problem-solving energy and spend it on side projects instead of doing leetcode. I do like those exercises, but I enjoy building new things more, which gives me practical experience, which I think is more important.


Skill gives you the gift of choice. You know how to write it either way around and it’s up to you to decide. Being able to correctly decide when to hack something inefficient together and when to optimise is another skill issue. It’d make a good interview question, that.


Yeah, but leetcode does not necessarily give you that skill or even prove it. And a talented problem solver would be able to find optimal and practical solutions when they are required and are not premature even without doing leetcode.

You might get false positives as well. E.g. you get people who are tunnel visioned on leet code, cracking the coding interview and other common system design books, they know all the answers, but then they completely lack common sense day to day and it can be hard to test for that if you are solely focusing on leetcode.


Of course. I’ve interviewed over 400 people in my career and I’ve never directly asked someone if they’ve done leetcode problems. I don’t care about actual leetcode. Looking at someone’s progress on leetcode as a replacement for an interview would be a terrible idea.

As an interviewer, I care about their skills - technical skills (like debugging ability and CS knowledge), social skills (can we talk about our work together?) and judgement. Your ability to understand data structures and algorithms is signal, but there is a lot more to a good candidate than knowing how to make a bit of code run fast. Knowing when to make code fast is, as you say, just as important.


It's a good indicator that someone is willing to jump through arbitrary hoops to get promoted at a big corp: write all the paperwork as design docs, satisfy all the promo requirements, and make sure all the weird business requirements of corpo customers are met. If you are not willing to "step down so low", you're a perfect example of the someone they want to filter out.


I didn't state my personal opinion above, but yes, I have seen leetcode aptitude positively correlate with day-to-day problem solving.


If your company uses leetcode to filter out employees then you are a leetcode team with your internal levels and ranks, not at all representative of the whole population of skilled IT people.


How would you, or anyone, know if your career is representative of the field?

I’m sure plenty of people spend their career never learning or using data structures and algorithms knowledge. But I suspect plenty of people spend their career using this stuff all the time. Eg people who work in databases, compilers, operating systems, video games, ai, crypto currencies, and so on.


I have seen Goodhart's law.


> The thesis is not that these exercises are representative of work but rather predictive of performance.

Well said. Just like in college, calculus (not analysis) and organic chemistry are used as filter courses. Of course, why the two courses, especially organic chemistry, are so hard for American students is another topic. I personally think it shows the failure of the US education system.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

OK take it as read that your posturing has succeeded and we all agree that you're a brilliant interpersonal genius and the rest of us are all useless chumps. What then? The rest of us still need to interview and make hiring recommendations. Or are you suggesting that employers should fire anyone who lacks your magical talent?


Why would you read that?

I just mean that if you put 2 experienced people talking about a topic they both know, it should be pretty easy for both (or at least the interviewer) to get a rough understanding of the level of the interviewee.


> I just mean that if you put 2 experienced people talking about a topic they both know, it should be pretty easy for both (or at least the interviewer) to get a rough understanding of the level of the interviewee.

Well, in my experience it's not, at least not in a "hostile" context. Most technical people are used to assuming good faith in technical conversations, and there are some very smooth bullshitters around; it's easy to construct a verbal facade that only falls apart once you ask someone to actually code.


Anyone who can't see through the 'verbal facade' is not that good. They may be very smart, but not at a bonafide genius level.


It's not. You will find plenty of examples in this very thread.


Personally I think such questions have three values:

- Future proof. Unless I work for an outsourcing company, sooner or later I will want to push the envelope, or so I hope I do. And to push the envelope, one needs good CS fundamentals (maybe there are some exceptions in some specialized fields). Think about React. It's a JS framework, yet to invent it one needs to understand at least compilers and graphs.

- Geekiness/talent filter. The same reason that the nascent Google and Microsoft and any elite companies like Jane Street asked Putnam questions, Martin Gardner questions, ICPC questions, and clever probability puzzles. Whether it's a good idea is debatable, but at least those companies want to have such type of people and they were hugely successful. Note the word filter: people who pass the interview may not be good, but failing the interview means the candidate may not be smart or geeky enough. Again, I'm not endorsing the idea but just exploring the motivation of the interview policies.

- Information density. Assuming a company does want to time-box a coding interview in an hour, it will be hard to come up with a realistic question without spending too much time on the context. On the other hand, it's relatively easy to come up with a data structure/algorithm question that packs in enough abstractions and inspection points to examine one's programming skill.


> Think about React. It's a JS framework, yet to invent it one needs to understand at least compiler and graph.

Are you hiring people to create new JS frameworks, or to use an existing one?


Well, I guess the example is more confusing than clarifying. I used that as a case of pushing the envelope. When Facebook needed a better solution for their feeds, they invented React, and I was saying that one would need to know compilers to build JSX and graphs to build the optimized virtual DOM. Yes, nowadays we just hire React users, but my point is that in the future we may have another moment where we need to invent something, and as a company founder I sure would like to hire the author of React or the like.


To use an existing one.

And tools are best used when they are understood.


Hiring, like many things - including engineering - is about tradeoffs.

Is someone more knowledgeable "better"? Sure. But sometimes it's about getting a specific thing done, and if you're hiring someone to improve a web thing that uses React, maybe you don't need someone who understands compilers and the kernel and how the underlying hardware works and all that. Maybe you can spend a bit less and get someone who will do a perfectly adequate job.


I don't know anything about quantum mechanics. Does that mean I don't understand how computers work?


The common advice is to understand one level above and one level below where you operate.

For example:

If you build computers, you should understand how the individual components are implemented.

If you build computer components, you should understand materials science and electromagnetism.

If you study electromagnetism, you should understand quantum mechanics and relativity.


How many React developers understand compilers & graphs? I'd wager it's way below 10%.


Also, why do you need knowledge of compilers and graphs to create React?

Firstly, React is not compiled. Secondly, the graph or tree or whichever aspect can come naturally when you come up with the idea of how it would be best to maintain JS apps.

In fact, the most important experience was probably grinding out tons of unmaintainable JS apps to come up with something like that.


I wouldn't argue with that.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Actually most of them, including the really inexperienced juniors, have 'figured' you out in less than 15 minutes, or at least they have decided whether to hire you or not in 15 minutes. But they have to put on a charade of being fair.

Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.


Got any evidence for the following claim:

“ Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.”


Isn't that the definition of affirmative action? Which most companies claim to do (e.g. Google [1]).

Admittedly Google also claims not to discriminate on the basis of protected characteristics, which is somewhat contradictory to the definition of affirmative action as positive discrimination [2].

[1] https://www.google.com/about/careers/applications/eeo/ [2] https://en.wikipedia.org/wiki/Affirmative_action


The Wikipedia article did not define affirmative action as positive discrimination.


“Affirmative action (also sometimes called reservations, alternative access, positive discrimination or positive action in various countries' laws and policies)…”

It calls them the same thing


No. Positive discrimination is a form of affirmative action in various countries' laws and policies. Positive action is another. Positive action is not positive discrimination. Paragraph 2 mentions merely targeted encouragement for increased participation, even if you did not know what positive action meant.


The text attests to both our interpretations.

1) By saying x is sometimes called y under some circumstances, the text implies x (Affirmative action) is equal to or a subset of y (Positive discrimination)

2) The second paragraph also suggests examples of affirmative action that wouldn’t constitute positive discrimination


The text would support your interpretation if the article ended after the 1st sentence. And the 1st sentence did not mention positive action.


He's close. The real bias is against extremely good looking guys named Rene who are also way smarter and charismatic than everyone else. Terrible bias.


I dunno. "Rene" sounds a lot like "renege." Which probably means you'll sign on to a project and abandon it halfway through or something. Very sus.

(I wish I could say my early interviewing rubrics were any better than this. We have come a long way as a people, we Silly Valley programmers.)


That kinda makes sense, unless you want Rene to be your replacement.


> Also a 'white' older male is the least preferred even if he is smarter compared to all females and the minorities that are being interviewed as long as they are not terrible. Biases galore.

I will agree that ageism is a thing, but 90% of all of my coworkers (who were software engineers) have been white males, so I cannot at all agree with this take otherwise.


> Also if a senior person can't in 30/45 min of talking with someone figure out the general experience level then the problem is them, really.

Or it's not their call, theirs is just to work with them after.


>I still don't get why such questions are even asked as most jobs I've ever had not even remotely touched those and I've touched quite a few industries, technologies and types of companies.

I've had to work on tree traversal stuff multiple times in my life, anything low level GUI related will work with trees a ton.

I've also had to work with hash tables directly, and with memory caching layers.

I really should learn to write a proper parser, as I've had to write parsers multiple times now and they are always an ugly hack job.
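As a rough sketch of the kind of tree traversal this comment is describing (the `Node` class and widget names here are invented for illustration, not taken from any particular GUI toolkit):

```python
# Hypothetical widget tree, the sort of structure GUI and DOM-like code
# walks constantly.
class Node:
    def __init__(self, name, children=None):
        self.name = name
        self.children = children or []

def find_all(root, predicate):
    """Depth-first search collecting every node matching `predicate`."""
    found = []
    stack = [root]
    while stack:
        node = stack.pop()
        if predicate(node):
            found.append(node)
        # Reverse so children are visited left-to-right (preorder)
        stack.extend(reversed(node.children))
    return found

ui = Node("window", [
    Node("panel", [Node("button"), Node("label")]),
    Node("button"),
])
buttons = find_all(ui, lambda n: n.name == "button")
assert len(buttons) == 2
```

The "customized criteria" variations mentioned in the top comment are mostly just different `predicate` functions or different visit orders over the same skeleton.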


Yep. In a project I’m working on at the moment (collab text editing), I’ve implemented 2 different b-trees, a skip list, 2 custom file formats and a dozen or so algorithms which do various graph traversals.

I get that this is uncommon, but if you scratch beneath the surface, most software (browsers, databases, compilers, OSes) is full of this stuff.

Even while I was consulting stuff like this would come up. At one company we were using a custom graphql wrapper around a CMS, and it was missing some functions we needed. The wrapper was implemented like a compiler from the cms’s data format to a set of query functions. Fixing it to do what we needed it to do was really hard and broke my brain a bit. But I did it. And I wouldn’t have been able to without understanding compilers and algorithms.

You can spend your whole career walking the beaten path adding features to apps and websites, and never traversing a tree at all. There’s lots of work like that out there. But if you ever want to go deeper, you’ve gotta understand data structures and algorithms. I know not everyone is suited to it, and that’s fine. But there’s definitely a reason big tech asks about this stuff.


> But if you ever want to go deeper, you’ve gotta understand data structures and algorithms.

I don't think this is quite right. I think it's more like:

If you ever want to go deeper, you've gotta be able to recognize when the problem you're solving fits a pattern for which good data structures and/or algorithms exist, and you've gotta be able to find, understand, and apply good reference material.

Solving this "knowing what you don't know" problem is the best and most important role of formal education, in my opinion. It's not as important to know a topic as it is to know that it exists, and some of the basic terminology necessary to get started researching it further.


Yeah I think that’s what I mean by “understand data structures and algorithms”. Or, I think your description is exactly what a useful working understanding looks like. You should know broadly what’s out there so if a problem comes up, you know where to look. (Would a hash table help? A priority queue? etc). And you should be skilled enough such that if you decide to use a red-black tree, you can find a good library or implement it yourself - with access to the whole internet as reference material. (And test it).

Nobody expects you to memorise a text book. But if an API gives you a list of items and you want to count the occurrences of each item, you should be able to figure out how to do that. And ideally in less than O(n^2) time if necessary. It’s surprising how many otherwise productive coworkers I’ve had who struggle with stuff like that.
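For concreteness, the occurrence-counting example above can be done in a single pass with a hash table (a minimal sketch; the data is made up):

```python
from collections import Counter

# Hypothetical list of items returned by some API
items = ["apple", "banana", "apple", "cherry", "banana", "apple"]

# Quadratic-ish approach: re-scan the whole list for every distinct item
slow = {x: items.count(x) for x in set(items)}

# Linear approach: one pass, hash table keyed by item
fast = Counter(items)

assert dict(fast) == slow
assert fast["apple"] == 3
```

For a list from an API with tens of thousands of elements, the difference between the two is the difference between instant and noticeably slow.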


But this isn't what the leetcode interview tests for. Reversing a binary tree, or figuring out how to arrange the parking lot to fit the most cars or whatever isn't a working understanding, it's essentially memorization. Being able to memorize something like that takes intelligence and dedication, so it does a pretty good job selecting for that, but it also filters out a lot of people who for good and valid reasons don't want to spend hours and hours studying. Not even doctors/lawyers do this: they do it exactly once and never again.


> isn't a working understanding, it's essentially memorization

Boring questions that you've seen before are memorization. But there's thousands of interesting questions out there that will never show up on leetcode.

Two examples from times I've been interviewed:

- One time my interviewer gave me about 15 lines of C code that used multiple threads and asked me if it was threadsafe (it wasn't). Then he gave me some threading primitive I hadn't seen before and asked me to fix it. Well, I had no idea what the threading primitive was so I was a bit stuffed. I asked him to explain it and he did, and then I (successfully) used it to solve the problem. He wanted to hire me, saying the fact that I hadn't seen that primitive before and still managed to figure out the answer within the interview impressed him more than anything else.

- Another time I was asked some more classic algorithm problem, where (in hindsight) the answer was clearly to use a priority queue. I didn't think of that, and under pressure I came up with some weird alternative in the interview. The interviewer messaged me privately after the interview - he'd gone back to his desk and spent the next hour trying to figure out if my harebrained idea would work, and he was as surprised as I was to realise it would. I told him I'd realised a priority queue was a good approach as soon as I walked out the door of the interview. I was offered the job.

I've never "crammed leetcode problems" in my life. I don't think that's what any interviewers are looking for. They're looking for people who can think on their feet and use DSA to solve real problems. AFAIK algorithm puzzle interviews predate leetcode. Algorithms have been used since the very early days of Google and (I think) Microsoft.
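The first anecdote can be illustrated with a deliberately simplified Python analogue (this is not the interview's actual C code; the counter, thread count, and lock here are all hypothetical):

```python
import threading

# A shared counter incremented by several threads. `count += 1` is a
# read-modify-write, so without a lock concurrent increments can
# interleave and lose updates.
count = 0
lock = threading.Lock()

def increment(n):
    global count
    for _ in range(n):
        with lock:  # the fix: serialize the read-modify-write
            count += 1

threads = [threading.Thread(target=increment, args=(50_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

assert count == 200_000  # with the lock, no updates are lost
```

Spotting that the unguarded version is racy, and knowing which primitive repairs it, is exactly the "think on your feet" skill the interviewer was probing for.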


There's not a lot of difference between the algorithm interview and the leetcode interview--leetcode is just a story problem around some kind of basic algorithmic problem.

I've done multiple interviews where they use some site and you're expected to plow through 10 questions or whatever in 60 minutes. You can subscribe to the site to practice. Gazillions of employers use this site or one like it.


I agree that I think this is what most experienced people mean when they think of understanding data structures and algorithms.

The problem is that this kind of understanding is very rarely what coding interviews check for. They either don't check for this at all - instead just making sure people can write simple code while reasoning through a simple problem under time pressure - or they check for whether people memorized a textbook, looking for a specific non-obvious data structure and algorithm.

What I try to do, because I think it almost ticks all the boxes without asking for memorization, is ask questions that start with the simple "can you program at all" part, with follow up parts that end up in a place where we can have a conversation (without implementation) about how different tradeoffs could be improved, which often leads to discussing what useful prior art might exist.

Unfortunately I think this still has very high false negative issues. I've worked with people who prove to be perfectly capable of noticing when an appropriate data structure will make a big difference in their actual work, without that coming out in their interview.


My recommendation is to have a lot of different stuff in an interview, so you aren't making a final judgement on someone over any individual part of the interview. That means the candidate can stuff up one or more parts of the interview, and you can still get good signal.

For example, do all the things you suggest. Get them to write some simple code. Talk to them about an algorithm problem. Also, give them some simpleish code with some failing unit tests and ask them to debug the code. (This requires some prep, but it's a fabulous assessment to do.) Ask them about their prior work. Just do everything you can think of, and don't give any specific part of the interview too much time or attention.

In my experience, this gives candidates a lot more opportunities to impress me. I don't really care if one person on our team is particularly weak on data structures. It's kinda better if we have someone who's an absolute gun at debugging, and someone else who's amazing at data structure work. That creates a better team than if everyone is impressive in the same way.


I mean, I agree, but this sounds like it could easily be a 3 hour interview :)


I think about time complexity and DSA all the time when programming. My personal view is that the people who claim it is unnecessary don't understand it and probably would be better off if they did.

I've seen lots of code that would be better if the author knew some basics. For example a report that took over half an hour to generate, I made a one-line change and cut the time to a few minutes - pretty sure I could have made it practically instant if I had taken the time to go through all of it.

And it's not like I'm some genius, I just understand the stuff I've been taught. Pretty sure most of my peers are supposed to have learned the same stuff, I think they just didn't really understand it.


In my experience, whether this is top of mind has a lot more to do with what people work on and with what tools than with level of understanding. For instance, in your example:

> For example a report that took over half an hour to generate, I made a one-line change and cut the time to a few minutes

In essentially all the work I've done in my career, this would be the result of expertise in SQL and the relational model, not in data structures and algorithms. I don't recall ever working on reporting code that isn't a dumb pipe between a SQL query and a mature library for writing CSV (or parquet or whatever). Sure, there are tons of data structures and algorithms on both the database server and client side, but that's not what I'm working on.

And I think this is pretty typical for people who mostly build "applications", that expertise in tools is more of a value-add than expertise in data structures and algorithms.

But having said that, I do agree with you that everyone benefits from these kinds of "fundamentals". Not just this, but also other fundamentals like computer hardware and systems, networking, etc. I think fundamentals are very useful, while also thinking that many people are good at their jobs without them.


In my case the processing was happening in our backend. I can't remember exactly why it couldn't be SQL; actually, it's possible it could have been. But changing it to SQL would have been a bigger change, and this wasn't really the task I was working on; I just happened across it while doing something else.

I have also seen and fixed similar travesties where someone iterates through a huge list making one query per element, when it was fairly trivial to rewrite it as a SQL join.

Point is just that understanding what you're doing is valuable, and in my mind DSA is a fundamental part of understanding what you're doing. Anyway I think we agree :)


Unsurprisingly we've now reached the perennial "is premature optimization actually premature" of it all :)

Would it have been better for the person who originally wrote that just-iterate-the-list implementation to have been thinking about data structures and algorithms that would perform better? Opinions on this vary, but I tend to come down on the side of: Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.

My assumption when I come across something that turns out to be a performance bottleneck that is easy to fix with a better data structure or algorithm, is that the person who wrote that was consciously doing a simple implementation to start, in lieu of profiling to see where the actual bottlenecks are.

But I also understand the perspective of "just do simple performance enhancements up front and you won't have to spend so much time profiling to find bottlenecks down the line". I think both philosophies are valid. (But from time to time I do come across unnecessarily complicated implementations of things in code paths that have absolutely no performance implications, and wish people wouldn't have done that.)


> Optimize for human productivity (for both the writer and the many future readers) first, then profile, then optimize any bottlenecks.

I don't agree. The problem with this approach is that there are some optimisations which require changes to how data flows through your system. These sorts of refactorings are much more difficult to do after the fact, because they change what is happening at the abstraction/system boundaries.

Personally, my approach is something like this: optimise first for velocity, usually writing as little code as possible to get something usable on the screen. Let the code be ugly. Then show people and iterate as you feel out what a better version of the thing you made might look like, both internally (in code) and externally (what you show to humans or to other software systems). Then rewrite it piece by piece in a way that's actually maintainable and fast (based on your requirements).


I did mention that people disagree on this :)

I've moved both ways along the continuum between these perspectives at different times. I don't think there is a single correct answer. I'm at a different place than you on it currently, but who knows where I'll be in a year.


Totally fair :) I have the same relationship with static typing. Right now I couldn't imagine doing serious work in a dynamically typed language, but who knows what I'll think in a year too. From where I'm standing now, it could be ghastly.


Actually I think you're misunderstanding me. I'm not saying you should profile and optimize all the code you write, I'm saying that a basic understanding of algorithms, data structures and complexity analysis allows you to write better code without any extra tools. I didn't profile this report to find out why it took 30+ minutes to run. I just happened across some code, read it, saw that it was essentially two nested loops iterating through two huge (50-80k elements each) lists matching items by name, changed it to use a dictionary instead of the inner loop and that was that.

It's a trivial change; it wouldn't have taken any longer to write it this way the first time around. There's no excuse for doing this; it's just a dev who doesn't understand what they're doing.

That's my point. Understanding these fundamentals allows you to avoid these types of pitfalls and understand when it's okay to write something inefficient and when it isn't.
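The change being described can be sketched roughly like this (the list shapes and the `name` key are made up for illustration):

```python
def match_by_name(items_a, items_b):
    # O(n*m) version: nested loops, scanning items_b for every element of items_a
    # matches = [(a, b) for a in items_a for b in items_b
    #            if a["name"] == b["name"]]

    # O(n+m) version: build a dictionary once, then do constant-time lookups
    by_name = {b["name"]: b for b in items_b}
    return [(a, by_name[a["name"]]) for a in items_a if a["name"] in by_name]
```

With two 50-80k-element lists that is the difference between billions of comparisons and a couple hundred thousand dictionary operations.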


If I want to go deeper, I'll have to understand the trees _at that time_.

Not now, when I'm just re-exporting ESM node packages as CJS so our legacy system can work with them


Traversing trees recursively is so trivial. I have to do this kind of stuff all the time. Just last week actually (in some frontend code no less).

Graph search and B-trees I haven't done professionally since I left college though. But it is still good to know the theory when dealing with databases.

A lot of these algorithms are more about knowing their characteristics than knowing how to implement them. For example, cryptographic algorithms can be complex, but having a good lib and knowing each crypto algorithm's characteristics is usually good enough for almost everyone.


> I've had to work on tree traversal stuff multiple times in my life, anything low level GUI related will work with trees a ton.

How many times did you have to write tree balancing code with no reference materials?


Bingo. You forgot to add "with someone literally looking over your shoulder," though.

I've written AVL trees, B-trees, red black trees, and a bunch of other things people have named here. But, right now, without looking at any references, I couldn't even tell you how to balance an AVL tree, much less sit down and write out code for it.


That's why these interviews select for recent grads. Or leet code studiers.

Yes we've all done this in university. We've learned the theory. We had to write an implementation of this or that algorithm in whatever language the university made us use.

And we also know that great minds took a long time to come up with these in the first place. These "basic algorithms" are not something you think up in 5 minutes after first learning that computers exist or that some problem exists.

Bin packing algorithms are another such thing. Sure, ask me interview "questions" like "please prove whether P=NP".

Eff off Mr. or Mrs. interviewer!


The exact same number of times I've been asked that during an interview: 0!

I do ask tree traversal questions when interviewing because I've had to traverse a lot of trees, so I think being able to do an in-order traversal of an already-sorted binary tree (which is only a handful of lines of code) is fair game.
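For context, the "handful of lines" in question really is about this much; a minimal sketch using a generator (the `Node` class is my own scaffolding):

```python
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    """Yield the values of a binary search tree in sorted order."""
    if node is None:
        return
    yield from in_order(node.left)
    yield node.value
    yield from in_order(node.right)
```

The recursion mirrors the definition: everything in the left subtree, then the node, then everything in the right subtree.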


Just the once? [maths joke]


In-order traversal is simpler and more practical than the questions xandrius asked about.


I disagree with some of this.

At some of my past jobs and the current one, this kind of algorithmic knowledge was important to build features that were differentiators in the market. As much as people love to pretend, not every single possible solution is in a library. Sometimes you're the one building the library.

It doesn't have to be leetcode, but candidates should at least be able to produce some code that doesn't come from the README of their favourite framework.

Also, talking for 30/45 mins can be enough, but it produces false positives when you have people coaching candidates. I've had people ace interviews so completely that they felt like the perfect candidate. Well, it was rehearsed. When I asked a fizz-buzz-type question they completely messed it up.


> this kind of algorithmic knowledge was important to build features that were differentiators in the market

I agree with this, but with the caveat that it's extremely rare to come up with a truly novel algorithm in a production environment. Those almost always come out of academia or R&D departments.

So is it important that people remember how to implement algorithms from scratch? Or is it important that they know when to identify when an existing algorithm would be useful?

For instance, if I see that something looks suspiciously like a Stable Marriage Problem, do I need to remember how to implement the Gale–Shapley algorithm? Or is just the ability to recognize the problem as something that has a particular solution 90% of the way there? I would argue the latter.

That said, I'm not sure how to test this in an interview setting.
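Not to undercut the point, since recognition really is most of the battle, but for reference Gale–Shapley itself is short once you know it exists. A minimal sketch (the dict-of-preference-lists data shape is my own choice):

```python
def gale_shapley(proposer_prefs, acceptor_prefs):
    """Stable matching: proposers propose in preference order;
    acceptors tentatively accept, trading up when a better offer arrives."""
    # rank[a][p] = position of proposer p in acceptor a's list (lower = preferred)
    rank = {a: {p: i for i, p in enumerate(prefs)}
            for a, prefs in acceptor_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}  # index of next proposal to make
    free = list(proposer_prefs)                   # proposers without a match
    engaged = {}                                  # acceptor -> proposer
    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        if a not in engaged:
            engaged[a] = p
        elif rank[a][p] < rank[a][engaged[a]]:
            free.append(engaged[a])               # dumped partner re-enters pool
            engaged[a] = p
        else:
            free.append(p)                        # rejected; will try next choice
    return engaged
```

Which is arguably the interesting interview conversation: not reciting this from memory, but recognizing that a matching problem has this shape at all.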


You don't need something "truly novel" to have a market advantage. Far from it.

You just need something that is not ready made.

Remember that not every algorithm is neatly package-able, or has a cool name like "Stable Marriage Problem" or "Levenshtein Distance".

Also: maybe your advantage is implementing some of those in a new language. Maybe your advantage is implementing one of those in SQL! Maybe you have a tree-like or graph-like structure, and you need to do something with it without having to export into some format expected by certain packages. Knowing what to implement is important too.

Also, those interviews are often testing for something even simpler than those fancy algorithms.

Nobody uses Fizz Buzz or Recursive Fibonacci daily, but you might need to implement something that is beyond creating views/models/controllers in your favourite MVC framework.

This is what the test is checking for: coding beyond what's on the README of your framework.


Unless you have data to show that your "different way" produces better results, your idea sounds exactly the same as every other idea, which is basically garbage.

Lots of people have lots of ideas on what makes a great interview question, but none of them are backed by data. LeetCode-style algo questions USED to be an indicator of intelligence as per Google's investigation, but now it's so heavily gamed that I doubt there's any signal behind it anymore.

Someone needs to spend time and effort and money and actually do random controlled tests to see if people who pass a particular type of test actually make good employees. But so far no one has done this, except Google as far as I've seen. But even now, I think there's no evidence that even algorithm questions are any indicator, given what I've seen.


Don't discard good anecdotal evidence because there's no proper randomized controlled trial. If you're experienced/senior in the industry you've probably worked closely with dozens of developers and less closely with hundreds; you've seen many projects and designs, and you've interviewed and been involved in the hiring of dozens of people.

This random controlled trial you're talking about is so hard to do because there are infinite confounders. How do you even measure success? After how long? How do you separate politics from technical ability?

In reality people above a certain bar are able to make some contribution to most projects. You can't and don't need to staff every project with superstars. The superstars are rare but are not hard to identify. They'll have the track record, reputation and the skills.

Another thing to consider is that the best outcome of interviews depends on the quality of the pipeline into those interviews. If you can't get good employees to apply you won't get good employees hired. Pay well and build a company that people want to work for, and then your hiring is going to be easier.


I guess we could gather some reasonable evidence about hiring approaches from HN hirers: what percentage of hires using your favourite technique/interview question do you think worked out as a success?

My rate is around 50% good, 50% I wish I hadn't hired.

Anyone else want to fess up?


I've had a much better rate than that. What does "wish you hadn't hired" mean, exactly? Did you completely misjudge them, or did you have an idea of what you were getting and still hired them? To be way off on 50% of interviews feels like way too high a failure rate.


> Unless you have data to show that your "different way" produces better results, your idea sounds exactly the same as every other idea, which is basically garbage.

Do you have evidence that the standard coding interview works? (There's evidence that it doesn't)

I'm with you that the claim might be too strong to say "this is the way" but that's because I'm of the (very strong) opinion that interviewing is an extremely fuzzy process and there are no clear cut metrics to measure one's abilities in a significantly meaningful way. a̶l̶l̶ ̶m̶o̶d̶e̶l̶s̶ ̶a̶r̶e̶ ̶w̶r̶o̶n̶g̶ ̶b̶u̶t̶ ̶s̶o̶m̶e̶ ̶a̶r̶e̶ ̶u̶s̶e̶f̶u̶l̶ There are useful interviewing methods but certainly not "the best" method. Trying to just mark checkboxes only leads to mediocre results. The reason we're generally okay with this is because we more often than not don't need rockstars and it doesn't make sense to put a lot of time and energy into this process when we get sufficient results through lazy methods.

FWIW, a typical (non-software) engineering job really just involves high level discussions like the OP suggests but even without the implementation. It is generally about seeing how the person thinks/problem solves and looking at problems they solved in the past. It isn't resource intensive and good enough. Because truth is, you don't know what someone is like as an employee until they are an employee (and even then this is fuzzy)


> Do you have evidence that the standard coding interview works?

No, I think that they are all garbage.

The only way to really hire is by having a vibe check to see if they are someone the team wants to work with, making sure that the person seems competent and has a reasonable chance of being very productive, and then hiring them quick. Give them a month and if they don't seem like a good fit, then fire them, with 2 months severance.

This is the only way I've seen that will produce a great team quickly, by hiring quickly and firing quickly. This is similar to what Netflix does but they also pay top of market which not too many companies can afford, but it produces the best results.


Seems we're on the same page: https://news.ycombinator.com/item?id=40291828

> Give them a month and if they don't seem like a good fit, then fire them, with 2 months severance.

And this! I think this is quite effective and efficient. You aren't wasting anyone's time and making sure the person has adequate time to find their next income source. It immediately makes me respect you and I think would build high employee loyalty.


As long as you want to only hire people who don't have an existing job or competing offers.


But we're talking about "seems competent and has a reasonable chance of being very productive". That's why you're interviewing in the first place. If you don't ask them to demonstrate their competence in any way how do you know they're competent? If they can't "produce" anything during an interview (even something trivial) how do you know they'll be productive? Assuming a complete stranger with no references you can trust.

A month might be too short a time for certain roles. I think people would be hesitant to take a risk with you if they know this is your policy. I'd say that at any point where someone is clearly not a fit they should be let go. You might know after a month (if they're terrible), you might know after six months, or they might progress initially but stall. With new grads, for example, it's going to take a little longer in general since they have a pretty long growth trajectory.


> LeetCode-style algo questions USED to be an indicator of intelligence as per Google's investigation, but now it's so heavily gamed that I doubt there's any signal behind it anymore.

Where can I find the data on this?


All fair points, and I agree with you on rigorous experiments. For now we are talking about only intuitions.


I have a different take on this. I ask a fairly simple coding question and have them write code. No fancy algorithms to memorize or competition style coding.

I want to see if they can think in code. If they can translate something simple to written code. There's always a chance to have a conversation as well to probe different aspects of their understanding.

A few of those by a few people and I think you get a pretty good read.

Mix in some questions about their work and/or some general knowledge. I've also given people code to read (real world code) and explain to me.

For Algorithms and Data Structures I can look at their grades if they're fresh out of school. But we all know most of our work is not that (someone's gonna show up on the thread and say differently - I'm sure ;) ). If you can think in code and have the right mental models implementing an algorithm from a book isn't hard if you really have to.


I used that approach too for a beginning frontend dev: see if they can get a solution to straight-forward problems, and ask them to explain it afterwards. They got all the time they needed, and a clean laptop with wifi. Two candidates came out well, two others didn't. That was a bit shocking, given the low bar.

But it won't cut it for a senior. Such people should be able to answer a wide range of questions, including some algorithm and data structure stuff. Not that they have to be able to code a min-max-heap from scratch, but they have to know what's out there, how to use it, and how to keep the overview of an entire project. That's not going to be evident from a coding interview.

So, horses for courses.


I agree. For more senior people I'd mix in systems, architecture, design trade-offs and such. I'd want to see they have actually delivered something in their career and dig into the details.


And even for Google, leetcode has become noise because people simply cram the questions. When Microsoft started using leetcode-style interviews, there were no interview-prep sites. So people who aced the interviews were either naturally talented or so geeky that they devoured math and puzzle books. Unfortunately, we have lost such signals nowadays.


Your "solution" is just another coding interview.

A typical engineer doesn't begin their day thinking "I should implement a streaming algorithm" out of nowhere (and if they do, they can always consult a reference). They analyze a (typically underspecified) problem, figure out the constraints, and then start thinking about a solution. Note that both processes, analysis and solution, can take hours, days, weeks or months!

Coding interviews have everything backwards by providing synthetic solutions to mostly out-of-context problems.

For example, interviewers could:

- Share some portion of a codebase with the candidate prior to the interview

- Go over the code with the candidate

That's it. You will know really fast who's senior and who's never written software. It can't be faked. And it can be adapted to almost any position (backend, frontend, devops, architecture, whatever).


My approach is to ask the candidate to show me some code they've written ( could be for class project, but one requirement for me to interview them is that they have a GitHub) and explain the choices they made or other anecdotes that surround the project.


Good idea. But the focus on algorithms is maybe overkill for at least 80% to 90% of companies. Perhaps laying out some subset of the business requirements or problems the company is working on and asking the candidate to turn that into working code is more suitable.


What is ICPC?


International Collegiate Programming Contest


Thanks


My attitude towards code interviews is to politely decline them and wish people the best of luck hiring a junior developer, because that's obviously what they are looking for. I'm closing in on 50, so not that junior anymore. If anyone has any doubts about my coding abilities after reading my CV, browsing my GitHub repos, and talking to me, then it's not going to work and we can both save ourselves some time. I've stopped responding to recruiters as I rarely see any evidence of them even being capable of doing these simple things.

I've hired and interviewed a lot of people over the years. Mostly without the help of recruiters. I like to think I'm pretty good at that, actually. I always look for people that are eager to learn. I care more about what they don't know than what they can regurgitate because they did lots of silly online prep. If people are young and fresh out of school, my assumption is they are going to have to learn a lot in a hurry. So I look for people that are curious, eager, and have an open mind. The best way to do that is to get them slightly out of their comfort zone and talking about how they would tackle things. The signs I look for here are enthusiasm, ability to surprise with good answers, etc.

This generally does not involve code interviews; though I might ask some targeted questions about how they would deal with certain things in languages, frameworks, etc. they list as being proficient in. I've of course worked for some people that insisted on me organizing code interviews and it's a chore and IMHO you learn absolutely nothing that isn't obvious if you just casually talk to people for 30 minutes or so. I usually argue against such things and prefer just talking to people directly.


I totally understand your point of view, but looking at it from the other side it’s not as simple. We had candidates with 10+ years of experience in their resume, talking to them it seemed they know what they’re doing, they showed some of the code they’ve supposedly written. Then they got hired and it turned out they can’t code - their PRs are below junior level, constant bugs, communication is abysmal, they overshoot their estimations by 3-4x, etc.

This happened to us twice, and after we introduced a simple live coding session to our interviews, where you have to implement a simple real-world web component (for a frontend position), the problem of bad hires suddenly disappeared almost entirely (you can't judge someone's character during a short interview, but that's another issue).


We recently had a candidate with 7 years of experience at a FAANG company fail a really basic coding interview, and we don't even make you write code.

I have no idea what these people expect. We ask coding questions, we have basic programming skills as a requirement, what do they think will happen if they get the job? Do they expect to just magically learn the required skills or hope that nobody will notice that they're not able to close their tasks?


Do you think working at a FANG would make you a better developer? I'm sorry, but I think that's a bad expectation; hire from a startup if you want great developers, since their span of control and influence is much wider. Most of us at FANGs are pigeonholed into a very narrow topic, have to work with really annoying, slow, and complex internal build systems, and write very little actual code. Also, unless you have global-scale problems, a lot of the skills developed working at that level aren't useful to smaller orgs and will just drive up costs. Don't look to FANGs for guidance is my advice.


> Do you think working at a FANG would make you a better developer?

No, but I do expect any reasonably competently run company to figure out that the person they hired isn't actually able to do their job, especially after 7 years. I was also under the impression that this company actually did coding challenges as part of their interview process, yet this person managed to slip through their interview process. On the other hand our approach to just talk about programming in general terms seems to spot this type of candidate pretty quickly.

One of our most used questions is to ask people to tell us about something they don't like in a language or tool (one the candidate is very familiar with or enjoys using). It's pretty hard to imagine even a senior Java developer who doesn't have some pet peeve about the language.


> No, but I do expect any reasonably competently run company to figure out that the person they hired isn't actually able to do their job, especially after 7 years.

Oh, they did. This person probably did not get paid as much as their peers. A great deal of compensation is discretionary.

Whether any decision-maker actually stood to benefit from removing this person from their position is another matter, however. Firings seem very uncommon, and layoffs depend on business conditions.


It's less that working at a FANG produces better devs and more that FANG can afford to be picky and so having worked there is a sign you passed a high hurdle once.

Kinda the same idea as having Harvard or Stanford as your alma mater. Most schools will teach you most of the same stuff, but those universities only take the "best". If your idea of "best" is similar, you'd take the fact that Harvard also liked the person as a good signal.


Isn't Harvard 10% the best students on scholarships and 90% children of rich and connected parents? Not total idiots, but still.


> Do you think working at a FANG would make you a better developer?

Yes...? I mean look at the complexity of the products these companies build?

I would expect if you've contributed on say Chromium or some EC2 networking layer that you're quite competent...is that somehow unreasonable?

These companies tend to pay multiples of typical "enterprise" CRUD developers because the work is that much more complex.


15 years ago fang devs were some of the best. Now it feels like all of fang is crumbling under tech debt and enshittifying executives, like AOL did.

You love to see it!


> Do they expect to just magically learn the required skills or hope that nobody will notice that they're not able to close their tasks?

It obviously worked for the candidate you mention, didn't it? Spent 7 years at FAANG, didn't know how to code.


That is true, but how?


There are lots of roles at big companies that don't involve actually writing code. The bigger the company, the more communication and organizational overhead there is, and the more not-actually-coding work there is. Even within roles that have a "software engineering" title.

Hell, I barely touch any code in my current company. The only reason I can still code is because I love it, started programming at about 11 years old, and still do it in my spare time. If it was "just a job" for me, then no doubt the skill would atrophy.


Some people genuinely freeze up or overthink to their detriment in an interview setting, despite being competent otherwise.

Happened to me a couple of times back in university when I had to get an internship for one of the semesters.


This. I freeze up when put on the spot; otherwise I can actually code, but it doesn't seem like it when doing coding interviews.


I interview best when I'm relaxed and it doesn't feel like there's a lot riding on the interview. What has worked for me is to interview early (when I feel "maybe I should leave my current job", rather than "I have to leave this awful job ASAP or I'll lose my mind"). Even then, I've had bad interviews - where it's like forgetting your own phone number or PIN; It's hard to recover from your brain short-circuiting early on, I suspect my interviewers may have thought I'm a fraud too, but such is life.

Interviewing has plenty of randomness - on a few occasions, the stars were aligned and I solved sequences of very challenging technical questions much quicker than the interviewers had planned, which gave the impression that I'm some sort of genius. That performance was not representative of my usual capabilities, but I didn't tell them that (:


Some people freeze up when they have to do their jobs...


Out of curiosity, is it possible he lied on his resume about being a FAANG SWE?


The IT job market is a two-sided lemon market: people may lie on their CV, but companies can be very abusive as well. If you don't do what OP suggests, people will not respect you. I've had five developers join my interview call to see if I can implement a palindrome checker. This is proof that none of them bothered to read my CV and check out my GitHub projects. This is not respectful, and you have to stand your ground against such nonsense. Let them know this is not what you expect from people working for you and that you expect better from them in the future. If they don't like that, you won't have lost anything.
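For what it's worth, the question itself is a few lines in most languages, which is part of why five observers feels so disproportionate. One common version (the case/punctuation normalization is my own assumption about the task):

```python
def is_palindrome(s):
    # Normalize: ignore case and non-alphanumeric characters,
    # so "A man, a plan, a canal: Panama" counts as a palindrome.
    cleaned = [c.lower() for c in s if c.isalnum()]
    return cleaned == cleaned[::-1]
```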


Was it the task itself you feel was disrespectful, or the way they ran it? Anyone can put up a fake/inflated CV, so it's not really proof they didn't bother to read it; GitHub can be easier or harder to fake/inflate depending on the projects. Doing something like a quick palindrome checker to show "Hey, yep - I'm real, not some random inflated applicant you probably got 10 applications of alongside mine" seems more than reasonable/respectful (by both sides). I can also easily see how 5 people can hop on, completely run that task the wrong way, and come off as know-it-all douchebags or the like, but that'd be a different, independent issue.


Why even show up with 5 people?


Same experience, same conclusion. I used to hire without coding interviews; I don't anymore.

Although I see it as a way for the candidate and me to talk about code in general and assess their level, not as a simple barrier.

I hired a candidate that completely failed a code interview because he was super nervous, but just talking about the problem made me quite sure he was actually good.


Just by introducing a coding interview, how would you know they will not overshoot estimations by 3-4x? It's not one person's job; normally estimates are done by the entire team.


So have the candidate do a small project and submit a PR.


The problem there is that the amount of time being wasted is asymmetrical since the employer isn't present, so the typical experience from the candidate's perspective is spending an evening working on a project and then getting no feedback and a canned rejection letter.


It doesn't need to take an evening; just something simple can be a great indicator.


Do that constantly and you get free employees.

Just kidding, but you should value your applicants' time. You could just as well show them bad code and ask them what they notice when they look at it. If they are good they'll point out the problem(s); if they are bad you will probably be able to figure that out from asking them alone.


FWIW I always ask some really super simple coding questions in interviews, even of really senior people with apparently stellar CVs. Let them pick their language or use pseudocode or whatever.

It's surprising how many 'senior' engineers are actually BSers who will slow you down or derail you completely rather than speed you up, and you need to spot them, because they will excel at getting through the non-technical interview filtering!

Also, I'm interested in how you explore things and explain things. I'm not actually interested in acquiring an implementation of FizzBuzz or whatever. I just want you to show me that you 'get' it and then we can get on to the interesting stuff like 'tell me about your last project' etc.

So don't be too hasty to think the people doing technical interviews are idiots thinking devs are interchangeable cogs etc.


> Also, I'm interested in how you explore things and explain things. I'm not actually interested in acquiring an implementation of FizzBuzz or whatever. I just want you to show me that you 'get' it and then we can get on to the interesting stuff like 'tell me about your last project' etc.

Sounds like you'd more valuably/realistically get that from the discussion of a previous project though? How it works, or something interesting they had to figure out, etc.?

> don't be too hasty to think the people doing technical interviews are idiots

I don't think anyone has a problem with technical interviews? At least I agree that's not reasonable. That doesn't have to mean 'coding' though, you can ask about how they'd approach a particular problem, quiz on some fundamental knowledge in a fizzbuzz sort of way, etc.

You can more easily tune it to the kind of candidate you're looking for then too, for example I've asked how they'd tackle improving the performance of a particular SQL query that's been identified as too slow. There's a tonne of possible answers to that ranging from naïve/they don't really know, through pragmatic good fit responses, to way overkill I hope they understand we don't need them doing that here/not operating at that scale etc. - and it's fairly open-ended in what you can discuss driven by what they volunteer and know about. (Which is another good thing IMO, I don't like being on either side of interviewer quizzing and candidate umming and ahhing not really knowing! Both more comfortable and more beneficial to have a discussion about whatever is known IMO; to quickly move along the perimeter if you hit the edge of that.)


> Sounds like you'd more valuably/realistically get that from the discussion of a previous project though? How it works, or something interesting they had to figure out, etc.?

You may be underestimating people's ability to bullshit their way through this sort of discussion.

It's harder to bullshit your way past a blank file in a code editor.

I'd wager that something like FizzBuzz will eliminate 90% of the chaff. Yes, it's laughably simple. No, it's not so simple that it won't stump a significant number of folks who've coasted for years.
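For anyone who hasn't seen it, the canonical FizzBuzz fits in a dozen lines, which is exactly why it works as a floor rather than a real test:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The point being made in the thread is not that this is hard, but that a surprising fraction of candidates with plausible CVs cannot produce it.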


FizzBuzz is an incredible filter for devs. Any kind of async JavaScript that needs to do something with the data after it fetches it has also been a winner. Lately the biggest weeding tool has been asking candidates to fetch some JSON from a bucket, then visualize that data. I'd say that maybe half the people manage to fetch the data in 30 minutes, and of that, maybe 10% get the data fetched within 10. The other 20 minutes is trying to help them recognize that their data is initially being rendered with "undefined" and they have to update the viz.


But are you filtering for realistic performance, or for performance under time & observer pressure?

I recently did a toy-problem sort of thing, a bit more complex than FizzBuzz but similar, and it's nothing I can't do; it passed the visible test cases (on HackerRank, if you're familiar; I hadn't used it before, but I assume it's generally similar) etc., but it was clumsy and far from representative of what I'd actually write on the job, I think, even if I still only spent an hour on it, just from not having that pressure. Plus I suppose I would likely have seen the problem before in planning, and realistically it would be something within the context I was familiar with from working on it every day, not a toy problem. (This wasn't it, but imagine something like calculating the possible states of a chess board after each player makes another move. Bit simpler though.)


I think you're completely ignoring the reality of frauds. Jeff Atwood was writing about this over a decade ago with "FizzBuzz".

There are many people who spend more effort creating the illusion of competence on paper and on the job, getting harder to detect the higher they go.

We as a profession (software engineers) have continually resisted broad unified certification like other engineers which could be a replacement for code interviews to assess competence, but would have other drawbacks.

So we are stuck with code interviews to ferret out BSers. And even then it sometimes fails. But it appears to be the best tool we have, because there is nowhere to hide. Don't take it so personally.


> There are many people who spend more effort creating the illusion of competence on paper and on the job, getting harder to detect the higher they go.

If you can navigate a software engineering position, purely undetected, by bullshitting it, I would say you can be a very good manager. You can probably handle high level concepts without knowing the implementation details.


This is basically what happened, the industry turned manager-heavy and expelled a lot of talent, replacing lifelong developers with bootcamp devs and other non-tech background people. It's kinda messed up because the lifelong developer types got called nerds growing up, had to learn what they know in the face of bullying and computers being very uncool, only to be basically replaced by those who made fun of them when it finally became "cool" and lucrative to be a developer.

Because it's all managers and no talent now, there's like an Interview Industrial Complex that emerged, where most teams spend the majority of time and energy just interviewing thousands of people and hiring/firing (via manufactured drama) while they never really build anything - it's all these managers know how to do because there are so few real developers left.

Some of the best developers I know of (of libs I use, etc.) outright refuse to work in the infantile conditions of the modern corporate setting anyway. The lucky ones have found other revenue streams and spend their coding energy on open source or personal ventures.

I talked to a young founder the other day - maybe 10 years younger than me, in his 20s - who said multiple times he was "retired", he kept waiting for some kind of validation on my face I guess but I just don't find it impressive. I lost respect actually, having heard that. In his mind he thinks he's a baller, in my mind he's a lazy egomaniac who knows 4 total things - I wouldn't even let this kid mow my lawn.

Smart, talented people just aren't valued anymore - it's more about prestige and authority now. But maybe not forever, they're certainly leaving themselves wide open at the advent of this LLM thing. Would love nothing more than the big tech ship to sink and get displaced by smaller, smarter companies.


> Smart, talented people just aren't valued anymore - it's more about prestige and authority now.

At the core of any corporation that isn't in the process of rapidly dying, between all the middle management and socialising and meetings with pretty graphs and interoffice politics, there needs to be someone that does some actual work.

This is where the nerd fits in a large corporation. That person is irreplaceable, and the layers around them recognise this (or else the company implodes). They may posture, but if you push, they will jump through hoops for you. Flex your muscles. You have more power than you think.


...which is completely unhelpful if you're not looking to fill a manager position.


> I think you're completely ignoring the reality of frauds.

Or maybe their strategy still catches all of the frauds and it has therefore never been a problem to them?

I have to agree with their take, and just asking a bunch of technical questions -even without any code- is good enough to filter out the obvious incompetents.


>even then it sometimes fails

How can a code interview fail? Hidden earpiece?


People spending inordinate amounts of time memorizing solutions to common problems. This is admittedly partly the fault of HR not ensuring that interviewers have a good pool of problems to choose from and twists to put on things, but it's a constant cat and mouse game with various websites aggregating interview questions from companies.


It's pretty easy to spot when the candidate goes from canned answer to actually having to think. "Thank you, that looks good; now I need it to also do this extra thing" - keep tweaking the question until they have to think.


I’ve had reports of a Zoom interview where an earpiece fell out and my interviewers could audibly hear advice on how to answer the question. Kind of dumb; how hard is mixing the audio?


Had a candidate copy and paste the answer wholesale.


Hard disagree. On my teams, we're going to give you a problem with some slight ambiguities to see how you handle that. We're going to see what kinds of questions you ask and how you respond to feedback. We want to hear you walk through your thought process. The more senior the position, the more important all of this becomes. Getting the "correct" solution is, at best, 50% of the goal with the interview.


This is not the approach that the majority of coding interviews take. In my experience, it has been disinterested interviewers who are blatantly pretending to understand what they are asking. Any attempt to engage in the type of discussion you aim for is met with dead ends, because that's not what the Googled answer contains.

Your approach is a major outlier. You probably have many good candidates turning away, and for good reason, they have no idea that you are different to everyone else. Find a different way to do this, there are several other approaches.


There are very few options outside of coding interviews, and getting fewer by the day.

It's almost like everyone big and small is standardizing on this model. Feels like one of those mandatory courses you took in college: the teacher and all the students knew it was bullshit yet you needed to perform the parroting at some adequate level to pass.

I have 21 years of varied experience in software engineering, yet a recent "technical" phone interview was a kid asking me to "balance a B-tree"; I can tell his expectation was for me to start reciting some CS algorithm BS, probably that's what everyone else is doing. I politely declined and that was the end of the interviewing process with that company.


Classic mistake of overthinking it and failing to realize what interviewer really wants - which is to make sure the candidate can actually write code, like at all. The question itself doesn't really matter as much as it's just a pretext. I actually asked a variation of this question for many years at Google and it was clear within first 5 mins who has been writing code day-to-day and who's been mostly "brining key stakeholders into conversations at appropriate time".


Exactly this. In the interviews I give I care about whether the candidate can write code, yes, but also talk and think about code.

The conversation is the most important part of the interview, and the thinking (and communication) is the most important thing I'm trying to judge after basic skills.

Like you said, you can get a good sense within the first few lines of pseudocode if someone's at least competent at writing code. But that's just one motivation behind coding questions.

It's also very difficult to talk about code, algorithms, and solving problems without a concrete problem and concrete code in front of the candidate and interviewer. So both the question and the code the candidate writes are mainly context for the conversation where I try to see how the candidate thinks.

These kinds of articles make me sad because I (and many other interviewers I've worked with) try to make it clear that this isn't a test - we don't care so much about being "right" or "wrong", and there shouldn't be any tricks or "a ha" moments.

We explain the goals and what we're looking for right up front. And I would hope most interviewers do the same, but I guess not. So there's this persistent myth among both interviewers and candidates that coding questions are about getting a right answer.

That's a shame because coding questions get such a bad rap, but I'm not really aware of a better options. Take-home problems and looking at GitHub are unfair to many people. A well-run technical interview should give lots of people a chance to succeed.


You are the exception I think. Most interviewers care about the correct answer. Get it and maybe get the job. Fail and definitely don't get the job.

If the interviewer said at the beginning, "I don't expect you to solve this problem in the 40 minute nor to have an optimal solution. I just want to watch you write some code and hear the problems you foresee and how you'd solve them" then maybe I could relax and do that. But, generally the pressure is on "get this right in 40 minutes or you're rejected"


This is actually why I dislike these "coding interviews are useless" type articles. The issue has as much or more to do with bad interviewers than it does with the fact that it's a coding interview.

When I'm tasked with interviewing candidates and evaluating these basic algorithmic and coding skills, I have a 5-part problem (each part building on the previous, and only revealed when the prior one is complete) that is basically impossible to finish in the time allotted. I tell the candidate ahead of time that it's an ongoing problem that's designed to not be completable in the time: we're going to work through this problem and see how far we get. I've passed candidates who "failed" the actual problem, when the conversation and coding that were shown still gave me a good understanding of their capabilities.


Coding tests are an awful place to test someone’s conversational skills. I don’t talk while I code. You don’t either. Honestly I can’t even remember the last time I talked to anyone about the code itself outside of a PR. People talk about architecture and database migrations and why their containers aren’t behaving locally. Nobody ever tests for that stuff.


> I don’t talk while I code. You don’t either.

That's quite an assumption. You've never heard of pair programming? You've never asked for help on a bug in your code? You've never talked through alternate approaches to a piece of code with a coworker? You've never hashed out an interface or a method signature or some pseudocode while talking through the problem? You've never walked through a calculation with an SME? All of these are "code and talk at the same time" exercises.

If I'm being brutally honest, I have a deep-seated suspicion that everyone who says they can't talk and code at the same time also just cannot code at all. I don't know you, of course, and I'd love to be proven wrong. My sample size is small, but the few people I've met who cannot talk-and-code also simply could not code.


Here's my brutally honest take on pair programming: Usually 1 person wants to do it more than the other, and usually that person is being unnecessarily assertive.

The only scenario I think pair programming is socially acceptable to force on developers is a senior type onboarding a new developer out of necessity - might screen share and direct them around some places to show the ropes.

Of course if you love to hang out with someone else while you write code for some reason - more power to you, have fun. For me it's a private thing, even after 20+ years. If anything the LLM is a much more useful sidekick to figure things out.

> Can't talk and code means can't code at all

I disagree with that, I can't even have lyrics in my music really if I'm working on something super hard especially outside my normal wheelhouse. It would at least be disruptive.

The last time "hanging out and coding" was a thing was learning it for the first time - I used to hang out with friends as a kid and we would all try to figure out what Visual Basic was lol and I remember hanging with a friend learning JavaScript during the early web days, drinking coffee through the night, good times.

These days it would feel forced and can't imagine why anyone would regularly pair program, especially now with LLMs.


Lots of people in tech have never heard of pair programming, because it's an absurd idea. This site isn't just silicon valley. This is a tiny fraction of the tech universe.


Personally I only ask super easy questions because you should at least be able to talk about something trivial. Yet unfortunately the question “find the second largest number in an array of numbers” has a high failure rate as the first question, because there are a lot of people lying on their resume just throwing spaghetti at the walls.
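
For what it's worth, the answer I'm hoping for is a single pass tracking the top two; a sketch (assumes at least two elements, and a duplicate of the maximum counts as a valid second largest here):

```javascript
// Second largest in one linear pass: keep the two biggest values seen so far.
function secondLargest(nums) {
  let first = -Infinity;
  let second = -Infinity;
  for (const n of nums) {
    if (n > first) {
      second = first; // previous max becomes runner-up
      first = n;
    } else if (n > second) {
      second = n;
    }
  }
  return second;
}
```

Sorting and taking the second element is also acceptable; what's not acceptable is being unable to produce anything at all.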


That's too easy man: arr.sort((o, O) => O - o)[1]

One for you: Write a recursive function that finds the second largest number in an array of numbers.



If you're anything like me then you think a lot while you code, like inner monologue thinking, and that's what the interviewer is testing for.


This provided me with a fascinating, albeit somewhat familiar, piece of insight: I don't really ever hear my inner monologue. I'm not sure I have one! I'm either typing out my thoughts as I have them or speaking them as I have them.

I struggle in coding interviews precisely because of this: either I end up vocalizing my emotions and insecurities instead of coding, or I end up coding instead of talking about what I'm trying to accomplish. Often I will see many alternate pathways branching out before me, but if I try to start talking about them, I am no longer coding, and so my brain context-switches to "social and emotional."

Probably something I could get better at with practice, but I honestly end up commenting on places like HN simply because it "allows" me to think. If I could have a coding interview in the form of realtime text chat + code, that would be ideal for me.

I guess I have seen companies do things like "contribute to this open source project and work through an MR." I do find that quite appealing as an interview process.


Interestingly, it turns out a large number of people have no inner monologue. [1]

Some studies indicate that it's as low as 30% of people who do (so 70% don't have an inner monologue), while others show the opposite, implying around 75% of people have some amount of inner monologue while 25% do not. It's a difficult subject to test and study since we don't have direct access to people's minds and asking someone what they're thinking about literally forces their thoughts through the filter of language.

[1] https://science.howstuffworks.com/life/inside-the-mind/human...


This is a fairly common condition, along the same lines as aphantasia (lack of inner-picture, rather than inner-voice). I do not believe there is any “cure” for it.


I don’t have a sense of a persistent inner voice either, but I can verbalise my thoughts just fine. It feels to me like the part of my brain that turns thoughts into English sentences automatically goes to sleep when it’s not being used. And that brain power can be used for something else.

When I’m doing particularly hard programming work, I can’t have words said near me. Even music with lyrics messes me up. I think because it wakes up my “thoughts to English” pathway and that gets in the way of my “thoughts to code” pathway.

Anyway, I don’t want to be cured. There’s nothing wrong with my mind. If anything I feel sorry for people who can’t turn off their inner dialogue, because it means they can never use those neurons for other tasks - like maths or programming.

Personally I can talk while programming if an interviewer wants that, but I’ll be a bit dumber at the keyboard than if I sat in silence.


i am the same except i use music (lyrics and all) to drown everything else out. i don't notice the lyrics at all, but my brain is processing them, because every now and then i suddenly break out of focus and am like “wait, what the hell did that song just say?” usually on comedy songs


And vocalizing this inner dialogue comes naturally to you I suppose.


If you can't explain your thoughts out loud then you're going to have a difficult time working anywhere.


For me, it's not at all "I'm incapable of explaining this to you given a few minutes to collect my thoughts." and very much "You either get Coding Mode, or you get Conversation Mode, but not only do you never get both at the same time, I need time to context switch from one to the other.".

The second paragraph in this comment is very, very, very close to how my brain operates: <https://news.ycombinator.com/item?id=40290732>.

And I've been programming professionally for quite a long while now, so this quirk of mine doesn't seem to have made it difficult for me to work at programming shops.


As I've gotten older I've discovered that I really can't judge normalcy based on what I find natural to me. Every brain is different and abilities range. Some people visualize things in their head; some people have to talk out their thoughts; some people are more optimistic. If we are only grading on one very particular brain type then we are missing out on the opportunities for diverse thought patterns producing something even better than expected.


It's not a test of someone's conversational skills, its a test of their technical communication. Conversational skills get tested in the small talk and introduction phases at the start and end of the interview.


I consider talking while I code to be a somewhat useful skill. I'm totally capable of verbalizing my thought process as I work through a problem.


What's the quality of output like? Have any links?


> I don’t talk while I code. You don’t either.

Speak for yourself


I, personally, cannot _think_ and _talk_ at the same time. It's just a stream of half-sentences, many of which my brain has already moved on from because what I originally thought won't work.

After writing this article it became very apparent to me that I'm complete garbage at interviews, but I'll outperform and exceed at the actual job function.


In my work, if you literally cannot write any code while also discussing the code, and if you literally cannot express thoughts while also thinking them, then you actually won't exceed at the actual job function, at all. You're not the only programmer on the team. I don't know why people think communication skills are not required for programmers. You won't be coding the correct thing unless you can talk about what you're doing.

And that's all I ever ask in an interview. Ask questions and talk about what you're doing. The worst hires I've ever seen were all the ones who never asked questions and never talked about what they were working on. Half sentences are fine; moving away from the keyboard while we talk is fine; being unable to talk and think at the same time probably is not.


> In my work, if you literally cannot write any code while also discussing the code, and if you literally cannot express thoughts while also thinking them, then you actually won't exceed at the actual job function, at all.

Followed by

> You're not the only programmer on the team.

It sounds like you're implying some connection between the two, whereas most successful teams don't require the behavior your team is demanding. Including the ones with good communication skills.

I can write code well. I can discuss it well. I simply don't need to do both at the same time. Unless people are in a pair programming session, they don't need to openly discuss the code while they're thinking about them and writing them. They can discuss the problem before and after. Why do they need to discuss it while coding?

It's like telling journalists or authors "Hey, if you can't discuss your story with the editor while you are authoring it then you can't succeed here."


>I don't know why people think communication skills are not required for programmers

That so significantly fails to resemble the claims being made that it strains credulity that it could be a good faith interpretation of the conversation.

>You won't be coding the correct thing unless you can talk about what you're doing.

Maybe, but that has no bearing on whether they need to be done at the same time, which they do not in just about any work environment. I guess there's probably somewhere that does mandatory pair programming for everything, but I've certainly never seen it.


The vast majority of engineering design happens async - by typically a single engineer, puzzling/experimenting over possible solutions and then creating a design doc. Discussion then happens synchronously. Solving a complex design problem on the spot is not the norm.

I personally find system design interviews pretty tough - it's not a mode of operation I ever experience on the job. To solve them at Big Tech, you pretty much have to memorize many design patterns and be able to regurgitate them on the spot. Like algo questions, it's testing your ability to work hard to prepare more than anything else.

Not to say this doesn't have value as a filter, it's just not testing the thing you think it is.


If I'm not cut out to work in your environment, that's fine. I do disagree with your other conclusions, however. I'm not bad at communication, I'm bad at verbal communication while simultaneously trying to solve a problem. I'm excellent at problem solving and simultaneously chatting in something like slack, however.


Shit, I can’t take notes on a meeting and also participate—like, at all. Decent odds I’ll reach the end and struggle to give you even the gist of what happened, without reading my own notes. If I’m trying to take notes and someone addresses me I’ll be all kinds of confused about what the context is.

And that’s English and mostly just writing what people are talking about, not thinking up novel things to write.


Adding another comment here, because this is part of the reason why I wrote this article.

> These kinds of articles make me sad because I (and many other interviewers I've worked with) try to make it clear that this isn't a test - we don't care so much about being "right" or "wrong", and there shouldn't be any tricks or "a ha" moments.
>
> We explain the goals and what we're looking for right up front. And I would hope most interviewers do the same, but I guess not. So there's this persistent myth among both interviewers and candidates that coding questions are about getting a right answer.

I understand all of those things. I've written the same before[1]. However, as clear as your instructions are and as well-meaning as you may be, it may not help. I can logically understand every word you say, but as soon as that question rolls out, I will be dealing with stress hormones and 30 years of learned behaviors from thousands of experiences, whether I choose to or not.

So while I applaud your methodology and wholeheartedly agree, just telling people that doesn't guarantee that it's not still an issue because humans are complex organisms.

[1]: https://darrenkopp.com/posts/2016/02/25/always-learn-somethi...


This is pretty much how it worked at the robotics company I worked at.

We would give them a whiteboard problem, but:

a) it was a simple, stupid problem in C (C++ was our implementation language, so thinking at byte level was an important skill)

b) we were very generous about minor mistakes like missing semicolons, etc.

c) we were very generous about "points for effort"; if they didn't make it through the problem but we saw that they were on the right track we might pass them. Total frauds outed themselves very early; they would produce between jack and squat in terms of actual code (a lot of bloviation though).

But again, most companies aren't that company, or your company. For most screening coding exercises, a correct answer (and even something like optimal algorithm complexity) is a must to pass the candidate.


I would like you to be the interviewer of all my future jobs, please.


The normal way to phrase that is "are you hiring?" :)


Meh. I’ve met dozens like you. People who swear they just want to see “how you think.”

Then, in the post interview roundup we talk about the candidates and you’re a bit disheartened that they didn’t complete the exercise, so you give a pass even when every other person in the room gives a thumbs up.

Nah. Re evaluate yourself and your biases.


My favorite question to ask these "see how you think" types when it's my turn is:

What is the most impressive thing you've ever done?

Just watch how they struggle with this one. I've never met anyone slinging code tests who is so curious about "how we think" who has ever made anything interesting, whether it's code, design, music, a company, anything. It's just a pretentious statement made by gatekeeping noobs who love to interview, nothing more.

I tried to save your comment lol


You are not alone! This is also how I run coding interviews - I have absolutely passed candidates who did not actually complete the described problem. I always inform my candidates that we want to have a conversation about the problem and its solution. I also deliberately pick questions that are going to need thought and some design, specifically to spark that conversation - talking about reversing a list gets boring pretty fast.

The issues people have with coding interviews are more about the interviewers than the questions, honestly.


I love the typo at the end. I have definitely worked with some stakeholders that I would have loved to stuff in a jar of pickling spices and leave on a shelf for several years to ferment.


Im dislexik


Ha! I thought it was Shakespearean!

When you feed people confabulated information, it acts as an encompassing medium, a reality buffer, effectively making them more inert, less engaged, in actual reality. Brined.


>Classic mistake of overthinking it and failing to realize what interviewer really wants - which is to make sure the candidate can actually write code, like at all.

What about the lad who develops Homebrew, who got rejected from Google because he wasn't able to invert a binary tree on the spot? Many Googlers use his software internally and externally. If the purpose of the interview is to make sure he can code, why did he fail?

https://www.quora.com/Whats-the-logic-behind-Google-rejectin...

He seems to have a great attitude and would fit right in but it's clear Google is optimising to keep people out rather than find great software Devs.


You should really read your own link - the part where he admits he made that whole story up. I'm gonna guess failing some basic coding is not what tanked him. He still thinks they owe him the job though because he's kinda famous and did a popular project in a language that Google doesn't use, so that's cute.

I also dont see how this one anecdote (even if it was true) invalidates anything I said above. You’re gonna have false negatives in any system you chose unless you just wave everyone in
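
For reference, the usual reading of "invert a binary tree" is just mirroring it, which is a few lines; a sketch (plain objects with `left`/`right` fields, my representation, not Google's actual question setup):

```javascript
// Mirror a binary tree: swap left and right children recursively.
function invert(node) {
  if (!node) return null;
  const left = invert(node.left);   // invert left subtree first
  node.left = invert(node.right);   // inverted right subtree becomes left
  node.right = left;                // inverted left subtree becomes right
  return node;
}
```

Which is part of why "couldn't do it at all" never rang true as the whole story.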


>You should really read your own link - the part where he admits he made that whole story up.

Where does it say that?


Why does "Many Googlers use his software internally and externally" mean he would be a good hire for Google?

From all his public complaining about failing an interview it seems Google did the right thing not hiring him, he has a massive ego and it's very possible that "writing homebrew" is less useful to google than "inverting a binary tree"


It means that he created a tool, with his skills and capabilities, that is a force multiplier for other Google engineers. This is a straight up undeniable example that his capabilities _already_ brought value to Google and their stacked deck of genius non-egotistical binary tree inverters.

There's not a more pragmatic measure of whether somebody can code than a track record of a successful code project used by other coders.


Here's another way of phrasing it -- if Linus Torvalds went for an interview with Google would he have to invert a binary tree and if he failed to do so (maybe he misconstrues the question and messes up or him being Linus and just refusing) would that be a good reason to reject him? Linus also can be equally or far more egotistical than Max Howell.


I find the idea that just because someone is an excellent software engineer they are therefore guaranteed to be a good fit for a particular role at Google a bit weird

I'd say that if Linus applied to be a software engineer at Google they should be prepared to invert binary trees or do $generic_leetcode type things because that's the expectation for that role

If they applied to be Google Fellow or some other lofty position then I wouldn't expect them to need to do any coding at all in the interview


>If they applied to be Google Fellow or some other lofty position then I wouldn't expect them to need to do any coding at all in the interview

So the higher the role in Google the less the requirements?


Not even Max Howell thinks Max Howell has a great attitude. He's often a dick in his own words. Maybe Google would have found a different job for him if he wasn't.

He said 90% of Google engineers used Homebrew. Google engineers said it wasn't true.[1]

He said Homebrew using OS libraries saved a lot of pain. He presented it as an example of why Google should have hired him. Actually it caused enough pain that Homebrew stopped doing it.

[1] https://news.ycombinator.com/item?id=23844936


My go-to first screening question, regardless of where I work, is to present some JSON from a public API relevant to the domain (nothing too crazy, maybe 2 or 3 levels of nesting max), then ask the candidate to do a filter plus a sum or max - then for bonus, analysis or refactoring of their choice.

Way too often, this results in a painful slog, lots of hints from me, and, nowadays, snippets of AI generated code copied in that actually solve part of the problem but that they don't read and therefore end up breaking.
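
Roughly the shape of the exercise; a sketch with made-up field names, not from any real API:

```javascript
// Made-up payload, 2 levels of nesting, like a trimmed public API response.
const payload = {
  results: [
    { name: 'a', stats: { score: 10, active: true } },
    { name: 'b', stats: { score: 7, active: false } },
    { name: 'c', stats: { score: 25, active: true } },
  ],
};

// Filter + aggregate: total score across active items.
const activeTotal = payload.results
  .filter((r) => r.stats.active)
  .reduce((sum, r) => sum + r.stats.score, 0);

// Max score overall.
const maxScore = Math.max(...payload.results.map((r) => r.stats.score));
```

That's the entire ask; anyone working in the language daily should produce something equivalent in a few minutes.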


If the interviewer wants to know whether someone can write code at all, then why do they expect much more than that in an interview?

I believe there is no single answer to what "the interviewer wants."

Often they don't even know themselves; they just try to mimic what's expected of them, or what they've seen in some random situation they came across. Sometimes, at least.

---

"brining key stakeholders into conversations at appropriate time"

This is some very good quality euphemism here, two thumbs up! : ))


The problem is that the candidate has been assured that they will be asked 'leet' code questions where solving the problem isn't enough, they will also be asked about O notation and how the code can be optimized and whether to use memoization or recursion. This is what the books will tell you, this is what YouTube will tell you, this is what 'helpful' recruiters will tell you.

And IME this is what most interviewers have been taught. They've got a list of sample questions, and they've been told that if they give the knapsack problem and the interviewee doesn't immediately call out 'dynamic programming' then the interviewee is a pass.

If you only want to see working code than you are the exception rather than the rule.


I will ask all of those questions, but I don't expect perfect answers. You should at least know what big O is. I would really like it if you can tell an O(n^2) algorithm from a linear one (that is often really important in real-world code). I would like you to consider different ways you can optimize the code.

I don't expect you to quickly crank out a novel optimal algorithm. But I like to see that you can think about how to solve problems programmatically. I would like to see that you can identify differences between algorithms and what tradeoffs there are. Considering different approaches that may be faster also shows that you didn't just memorize an algorithm from somewhere, but that you took the time to actually understand the problem and build a mental model in your head.

I have given people great reviews when they couldn't come up with a working algorithm, because they clearly knew how to think like a programmer and considered all of the important things.
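The O(n^2)-versus-linear distinction the parent describes can be illustrated with a classic example: deduplication using list membership versus a set. This is a generic sketch, not tied to any particular interview question.

```python
def dedup_quadratic(xs):
    # `x not in seen` scans the whole list: O(n) per check,
    # so n iterations make the function O(n^2) overall.
    seen, out = [], []
    for x in xs:
        if x not in seen:
            seen.append(x)
            out.append(x)
    return out

def dedup_linear(xs):
    # Set membership is O(1) on average, so this is O(n) overall.
    seen, out = set(), []
    for x in xs:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

print(dedup_linear([3, 1, 3, 2, 1]))  # [3, 1, 2]
```

Both functions return the same result; the point is that a candidate who can explain why the first one falls over at a million elements, while the second doesn't, has demonstrated exactly the kind of thinking being asked for.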


As an electrical engineer who has basically always worked as an embedded programmer, I don't know what big O is. But, I'm probably not applying for your open positions.


It definitely depends on the domain. In embedded programming you often know that n is small for your algorithms. If you are making a desktop app or web service, it is very important to prepare for people using your service in unexpected ways, and a superlinear algorithm can result in terrible performance or downtime if not managed correctly.


That's a pretty far cry from

> which is to make sure the candidate can actually write code, like at all

It's also still a terrible approach and only gets leet code crammers.


Are you suggesting that only leet code crammers can tell the difference between a quadratic and a linear algorithm?


You're all not getting it.

They only want to see somebody who can produce working code and a glimpse of their thought process. But out of hundreds of mediocre examples, the better coder will have a "better thought process."

Same goes for dating. Of course people will swear up and down they "only consider personality." Turns out, they've met 10 other people with a better personality than you.

Just because they're "only looking for x" doesn't mean they'll accept anybody that clears the bar.

The ultimate read between the lines though is that "oh I'm only looking for xyz, nothing superhuman" in a process where you have 10,000 competitors and applicants will still require high performance on your part. It's just a nicety, a meaningless phrase.


Yes. I've filtered out countless candidates who can't code like this. It's amazing how people will fire off applications to jobs they can't do. You might doubt this because you wouldn't do it, but just wait until you get to sift through a fresh wave of applicants.

Did everyone I hired get "perfect marks" on the test? Of course not! It's not about that. It's about seeing how they react to a problem, watching them break it down, ask questions, and, ultimately, get on with it. If the job is to sweep floors, you need to be able to hold a broom. It's as simple as that.


Yes, unless you’re in a niche area or only hiring through a network, you’re going to sift through mountains of unqualified candidates because those are mostly the ones on the market (see Market for Lemons[0])

[0] - https://en.m.wikipedia.org/wiki/The_Market_for_Lemons


I disagree; I think it's a failure of recruiting if you got someone into an interview who can't code "at all". That must mean nobody at the company is technical "at all" either, if candidates are getting that far along without anyone having any clue whether they can even write code.

I don't have this problem - I can easily tell who is good and who is not good by looking at their stuff online, which repos they are contributing to and what their contribution is. I look at personal projects - I can easily tell what parts they wrote vs didn't because it's usually specific to the project.

I can tell from their blog posts and comments, especially GitHub comments - I can even see if they're pushing features at 11PM on a Friday if the obsession piece is crucial to the hire.

People who say all their work is hidden under NDA or they're new grads and haven't done anything yet - sorry, if there's nothing to view online I just wouldn't qualify you to interview.

Though I have given out 50+ code tests in my career because I had to, I would never choose to do this; if given the chance when hiring someone, I would never give them a code test. I think it's an amateur move and wastes everyone's time.

At its best (as in the case of CTCI interviews) it's an exclusivity filter for academics who memorized the optimal data structures and algorithms as taught in school, but the candidate might not have any of the skills needed to build app features, perform DevOps, etc., or even operate a terminal. CTCI doesn't cover anything async, nothing about UIs, APIs, databases, services, git, design, file formats, etc.; it's purely academic sport. And like I said, a good developer's work should be highly visible anyway, so skip the random code test.

I would spend the recruiting effort finding specific developers using specific technologies that align with the role, and making them excited about the opportunity, rather than canvassing 1,000 code tests out to anyone who applies.


> That must mean nobody at the company is technical "at all" either then, if you are getting people that far along without having any clue if they even write code at all.

Lots of people skate by in technical roles by barely doing anything technical. Lots of people also overinflate their achievements in resumes and conversations, a.k.a. lying. How is a non-technical recruiter supposed to evaluate their coding chops?

> People who say all their work is hidden under NDA or they're new grads and haven't done anything yet - sorry, if there's nothing to view online I just wouldn't qualify you to interview.

Lol, great method! I have a better one: just hire ACM winners. No need to test them.


> People lie

Obviously taken into account. I still have no problem whatsoever identifying great developers on GitHub, and I'm sure many other developers who actually code often could too. You have bigger problems if you can't tell whether someone is lying about their abilities when all their work is visible to you. You should be able to easily tell what is theirs vs. not.

> Lol, great method!

Yes people inflate their own egos and abilities - especially those who spend all their time interviewing others instead of building.

I prefer demonstrable experience for an engineer over standardized tests, which tell you nothing about any real-world experience: app architecture, async programming, APIs, UI, DOM, git, unit/e2e tests, any known framework or library, etc. A person who knows all of those but hasn't memorized CTCI is a lot more useful than a CTCI memorizer with no evidence of work ever performed.


> when all their work is visible to you

Doesn't really happen reliably in the real world.

> You have bigger problems if you can't tell if someone is lying or not ...

Whether or not I have bigger problems is independent from whether I need to recruit more developers.


It's like a designer with no portfolio. A lot of them do exist actually, not saying they don't, but I would never hire one.


> While you're spending all your time and the company's money conducting maniacal quizzes and tests your competition is building features and teams and making progress

If your research on your candidates' profiles is of the same quality as you demonstrated nosing around my profile, I don't think any of your competition should be worried at all ;-)


No way to know for sure, but I'd guess the percentage of employed developers writing blogs and maintaining personal projects is in the single digits. For developers with more than 20 years of experience, I'd be surprised if it's even 1%. I've got more experience than that, which means I have an extensive network of former colleagues at a similar experience level, and I don't know a single person who blogs about tech or works on personal projects. We all have busy family lives and plenty of money, so there's absolutely no need to spend any personal time doing work-related stuff.


Not to mention that GitHub profile evaluation would be much easier to game in the age of LLMs if it were to become mainstream. You can already kind of see the effects of this if you maintain any popular project: people send you mindless PRs fixing grammar and whatnot.

