With programmers, the single easiest way to identify good candidates (in my experience) is sheer interest in what they do and desire to learn. This is a learn-every-day field, and if you're interested in what you're doing, you're going to do a lot better at it. It's hard to apply yourself mentally to something you don't have a good level of interest in. Given that it's a learn-every-day field, people with that level of interest will realistically be able to learn anything they need to solve the problem you're hiring them to solve.
The only real differentiating factor is your tolerance for ramp up time. I expect a programmer to be able to pick up a new language or database within a couple of weeks (tops) in most cases. If I'm hiring full time, that's something I'll tolerate. If I'm hiring a contractor, I'm going to be uneasy about paying high hourly rates for him to learn the job.
The single most effective way that I've found to interview for "interest" is to just get them talking about something they've done before and ask them to go deep into the details. You get everything you need from watching somebody talk, with a smile on their face, about how they solved some problem in a creative way that shows some pride. It doesn't really matter what the problem was: a business problem, a code problem, or a hardware problem. The important thing is the level of attention to detail in addressing it.
I've been using this technique for about 8 years now, and while I don't make it the sole criterion for hiring, every person I've ever hired who has passed that part has ended up in my "great hire" category.
> The single most effective way that I've found to interview for "interest" is to just get them talking about something they've done before and ask them to go deep into the details. You get everything you need from watching somebody talk, with a smile on their face, about how they solved some problem in a creative way that makes them show some pride.
Sorry, but you are being scammed. You are selecting for sales skills, not technical skills. Read what you've written. You're judging someone based on how they present themselves, not on anything about technical ability.
I have interviewed many people who are great at doing this impassioned sales speech, but are terrible at actual engineering. And in fact, many good engineers are not good at selling themselves in this way - they have spent most of their lives deep in code, not working on their social skills like normal people.
Why not actually test them similar to what they're going to be doing? If it's a sales or advocate role at your tech consultancy, well then sure, your approach works.
> Sorry, but you are being scammed. You are selecting for sales skills, not technical skills. Read what you've written.
You're asking the parent commenter to set aside eight years of personal hiring experience, over which time "every person I've ever hired who has passed that part has ended up in my 'great hire' category" (assuming, that is, that you've read what he/she's written).
Can a candidate lacking the technical skills go deep into details the way the parent commenter describes?
Yup. I've seen candidates talk a deep technical game and turn out unable to write long term sustainable (well-organized) code.
Here's the fundamental asymmetry: what makes a good programmer great is their ability to build a cathedral brick by brick, alongside the architect and the other builders, despite the external circumstances, accounting for complications along the way (ignoring the connotations of waterfall design that "cathedral" might invoke).
Building a complex system with many components takes weeks, even months, whether or not you're talking about MVPs or service-oriented architectures, and whether or not you're iterating rapidly.
A weak engineer will be fine for a few weeks, but the complexity will catch up to them. A strong engineer pushes that bar much, much further out. Possibly indefinitely.
Problem is, how do you tell that you're talking to that kind of person when you have only a few hours? It's an extremely difficult task because you must evaluate a person for a > 6 month job in under 6 hours. No matter what you do, you're going to have trouble.
> Can a candidate lacking the technical skills go deep into details the way the parent commenter describes?
I withdraw this somewhat rhetorical question. I should never have asked it because it's not the point (and it invites answers out of the original context that overlook "the way the parent commenter describes").
Here we have two ways of evaluating technical candidates, just two of perhaps many: One seems to be more arbitrary and unreliable than many have assumed, and the other is very time-consuming, doesn't reduce to graphable data and requires a really good interviewer.
Please just recognize that having candidates jump through technical hoops is not as solid and objective a method of evaluation as it might seem to you, and could even filter out certain kinds of great candidates. Where at all possible, please try to treat the human being under evaluation more like a human being and less like a horse.
That doesn't seem to correspond to the attitudes of salesmen vs engineers. I'm not sure what salesmen you're thinking of, but few of the people in sales I've talked to can have an in-depth discussion about how React.createClass vs class X extends React.Component will affect "this" context, and how the super(props) pattern works in the latter. Furthermore, lots of engineers who work with any given language/framework can't explain it either, and many of them have basically been copy-pasting until stuff worked their whole career.
I've found that most people who have poor communication skills are not good engineers either, and the two might actually be related.
I've seen this same thing frequently when evaluating security engineers. They often can talk the talk, but when given a simple task to perform more often than not will fail.
Coding exercises might not be a great measure, but as far as I have experienced, they are better than other items out there.
Do you mean that over your 15 years you've hired people more often than we'd think who later turned out to have been fooling you during the interview with their sales skills? That sounds terrible and I'm sorry to hear that.
On the other hand, if you mean that you've discovered during your interviews that candidates more often than we'd think had been trying to fool you -- and that's why you didn't hire them -- well then that doesn't sound much different from the above person's eight years of experience.
I don't think anyone here would argue that there isn't a significant number of candidates for technical positions who BS about their technical skills.
I've hired, at several places, more than a few people who were fooling me with their sales skills. More often I work at a place where we do multiple interviews -- some "technical" in the sense that it's talking about tech, and some "technical" in the specific sense of asking a programming question with a verifiable answer (i.e. run the program, see if it works.)
Frequently people do well on the technical-not-actually-coding portions and very badly on the please-write-code portions. That is, if we didn't ask the specific coding questions, we'd be hiring people those questions disqualify.
At those places we less frequently hire people that just can't code -- though I've been around for that too, usually when we look at a resume and say, "oh, we don't need a coding test for him, he's been writing important code for years in a way where he couldn't hide lack of ability."
It would appear that a resume is a poor way to verify that.
It's not the sole criterion, just a part of the process that I've generally found to be associated with people who excel. We are still going to have them interview with some type of subject matter expert related to the position, but all jobs are not created equal.
Positions for things like security, as another commenter mentioned, are very much experience-driven. Java is another, simply because it has a very high learning curve to become really adept with it. It's doable, but I'd extend the expected ramp-up time to longer than 2 weeks.
There's nuance to everything. I didn't mean it to come across as if that was the only factor in hiring.
Interest is not enough, unfortunately. There are plenty of engineers who are attracted to the challenge and excitement of building new things, but have no appreciation for The Right Way to build things.
Great engineers think beyond "how" and ask the "should" questions as well. Mediocre engineers glue things together in a haphazard way with little thought about what's the best way to write things. Caring about maintainability, comprehensibility and extensibility is partially a function of experience, but I've met plenty of experienced engineers who still write bad code and design systems poorly. It has no correlation to how much of a tinkerer and curious person they are, in my experience.
I apply almost the same approach as OP. You can get a relatively good picture of a candidate's engineering skills and thinking ability via "go[ing] deep into the details", compared to asking algorithm questions.
I also ask candidates to code on a realistic problem. It doesn't involve any "fancy" algorithms or "tricks". What I want to see is their coding style, attention to detail, and of course whether the candidate is comfortable coding.
This approach has been working well for senior candidates. It is much harder to interview new graduates.
> I also ask candidates to code on a realistic problem. It doesn't involve any "fancy" algorithms or "tricks". What I want to see are the coding style, attention to details, and of course, if the candidate is comfortable at coding.
I was reading a Quora answer about why so many developers fail the FizzBuzz test, and the author differentiates between programming and engineering. He says that programming is aimed at making the machine work, whereas engineering is more about the artistic aspect of building software (which is highly dependent on experience and tenacity to get the job done). People want to hire engineers, but they test for programmers.
This got me thinking about a common pattern I have been observing. Most of the algorithms I implement in real-world code are very basic. Most of the work I do revolves around making my code easily understandable to other people (and sophisticated algorithms prevent that), so that people can easily modify it, repurpose it, or expand it.
In the real world, if I were ever asked to implement FizzBuzz, I wouldn't even open my editor until I understood what it is going to be used for, what future features it will have, and what existing code it will work with. Once I understood all that, I might never write the code which most people write (or I myself write) during a FizzBuzz interview.
Maybe it's time to create a software engineering FizzBuzz rather than a programming FizzBuzz.
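For contrast, the "programming FizzBuzz" that the Quora answer says many developers fail is only a few lines. A minimal Python version (one of many possible) looks like:

```python
def fizzbuzz(n):
    """Return the classic FizzBuzz sequence for 1..n as strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(5))  # ['1', '2', 'Fizz', '4', 'Buzz']
```

The engineering version of the question, as the comment suggests, would start before any of this code: who consumes the output, how the rules might change, and where the number-to-word mapping should live.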
The current common interview practice is more like a programming contest. The candidate has to figure out a problem and finish it within ~30-35 minutes. Very competitive, but with little engineering involved.
I like the "Software engineering fizz buzz" idea. For backend engineers, I usually ask the candidate to implement a commonly used API in a language she/he is most proficient at.
I envision the "Software Engineering Test" as a twenty-questions-style exercise.
The interviewer gives a simple user story to the candidate: "As a user I want to be able to add two numbers".
After the candidate implements that: "As a user I want to be able to subtract two numbers". At this point, watch how the candidate continues. If he implements a class Calc with two methods, Add and Subtract, give him 5 points; if he just writes another standalone method, zero points.
Then ask him to add multiplication functionality. If at this point he encapsulates the code as a class, good; if not, make him do it and let him choose the encapsulation (with no bonus points).
Now the point of this exercise is how good a job the developer does of figuring out what the client wants, especially when the client himself doesn't know what he wants.
Once the Calculator class is implemented, the next user story is to "create an interest rate calculation method which offers different interest rates based on credit rating". The idea is that the new requirement is a client asking for something quite weird. You can't JUST add this method to the Calculator class. This is the point where the candidate shows how good he is at being an artist with the code.
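A hypothetical sketch of how the exercise's code might evolve, in Python. The class shape paraphrases the comment's rubric, and the interest-rate tiers are invented purely for illustration:

```python
class Calc:
    """Grows one user story at a time: add, then subtract, then multiply."""

    def add(self, a, b):
        return a + b

    def subtract(self, a, b):
        return a - b

    def multiply(self, a, b):
        return a * b


def interest_rate(credit_rating):
    """The 'weird' story: arguably belongs next to Calc, not inside it.

    Tier values here are made up for the example.
    """
    tiers = {"A": 0.03, "B": 0.05, "C": 0.08}
    return tiers.get(credit_rating, 0.12)  # default rate for unknown ratings
```

The interest-rate story is the trap: bolting it onto Calc couples an arithmetic utility to a banking domain, and noticing that is exactly the design judgment the exercise tries to surface.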
It looks like a test for how good you are at over-engineering things. Making classes for data-processing functions is OOP done wrong, in my opinion. Simple functions or even a lambda seem more appropriate; the ability to make it simple seems less impressive, but it takes experience. Plus, what if your language is functional and doesn't even have classes?
> It looks like a test for how good you are at over-engineering things.
Sorry, I typed it all on mobile, so laziness got to me. But it seems you missed the point of the test by a fair margin. The test lets you judge over-engineering too. The whole idea is to give them scenarios like this, suited to your values and business needs. Every language has abstractions, and the decision is figuring out good abstraction vs bad abstraction vs no abstraction.
Making a calculator class is overkill IMHO, and this is precisely why the next task is to "create an interest rate calculation method which offers different interest rates based on credit rating". The whole idea is that here any developer worth their salt will be separated from the chaff. Of course, you will need to help them a bit to steer in the right direction, but the test isn't about knowledge; it's about their perspective.
Oh, I understand. What I skipped explicitly writing there is: if he writes an abstraction and defends it well, give him 10 points; if he can't defend it, give him 5 points. If he doesn't write an abstraction and defends that well, give him 10 points; if not, then 0... you get the gist.
The fundamental point is: who can best figure out the requirements of the customer and write code which makes the most sense with the least amount of "future problems".
> I also ask candidates to code on a realistic problem. It doesn't involve any "fancy" algorithms or "tricks". What I want to see are the coding style, attention to details, and of course, if the candidate is comfortable at coding.
Do you do that on the spot or is it a "take home" assignment? I've known many solid programmers (myself included) that could code anything you want, but if you put them in a room, sit there and watch them do it, they'd freeze up. On the other hand, if you give them a project, give them a timeframe to develop it, then have them walk you through what/why they did, they would absolutely excel. It's sitting there in front of someone(s) and the expectation, that would just cause them to freeze up.
On the spot. I fully understand that many good engineers can freeze up, myself included. That is the reason the coding question is about a realistic problem. No dynamic programming, graph theory, suffix trees, etc. Also, before the coding, I talk to the candidates about technologies listed in their resumes. It is mostly like a talk between two engineers at a meetup or a conference, not a technical test. My intention is to make the environment feel as much like a real work environment as possible. It also helps bring up the candidate's spirits. We are talking about something we like!
As long as the candidate gets the logic flow of the solution, can write the code down in a well-formatted manner, and pays attention to corner cases, it doesn't matter if they actually complete the coding, provided I think that, given enough time in a real work environment, they would have no problem doing it.
There are some red flags I pay attention to. For example, readable code is very important to me, since we work in a team and spend more time reading code than actually writing it. We once had a candidate who got the coding right, but his code was very hard to read. It reminded me of some obscure C code from the past where one statement tried to do many things and became very convoluted. For me, that is not "smart". It is bad engineering.
Yeah, I do wonder what you gain by not doing this. If they were to somehow cheat (get someone else to do it), this would be picked up very quickly on the job and waste everyone's time, including their own.
> I also ask candidates to code on a realistic problem.
I just ask them to bring code, and tell them we're going to talk about it.
It doesn't have to be their own (it normally is though). It doesn't have to be good code (nobody has yet brought bad code, but I wouldn't really care).
I would rather talk to somebody in their own code and language. If they can't explain it to me, then that's a data point, too.
Attention to detail will usually preclude someone from gluing things together like you describe though. Those are important details to account for and the type of person you're describing is more task focussed (aka - I got it done) than detail focussed (aka - I got it done right).
I would say this dichotomy doesn't quite capture what's being discussed. To my mind, detail focused can still result in an ugly hack of a solution, merely with all corner cases covered.
I think one possible phrase closer to the heart of things would be "aesthetically focused" as well as detail focused -- they care not only about the details of the functionality, but about the "technical beauty" of their solution.
>There are plenty of engineers who are attracted to the challenge and excitement of building new things, but have no appreciation for The Right Way to build things.
It still takes a village to get to this point, which means even in the best case, we're all going to contribute some shitty code to the world because that's just the learning curve.
You can, of course, go a long time without peer review or having ever been shown a better way to do the stuff you just did.
At the same time you can overdesign something based on how it 'should' be engineered to the point where you haven't shipped anything and your company is bankrupt. You need someone that can pick a decent enough pattern which can get the job done in the time allotted.
I would not hire someone that has a list of a hundred 'shoulds' which need to be met for any project to be shipped. That's a recipe for never shipping.
> I've met plenty of experienced engineers who still write bad code and design systems poorly. It has no correlation to how much of a tinkerer and curious person they are, in my experience.
Can they really be considered experienced then? Or are they basically just people who are first year engineers repeated for 5, 10, 15, etc years.
> I expect a programmer to be able to pick up a new language or database within a couple of weeks (tops) in most cases.
They may be able to hack around, write a for loop, track down a bug... but you're not going to get the same caliber of work from someone who first saw Python two weeks ago as from someone who's been using the language for 5 years on real projects.
So how exactly is anyone supposed to gain new experience?
I don't mean to be combative, but so many job reqs and interviews expect X years of Y. How do you get that time in? You learn it, right? I didn't use to know Python, but it was obviously useful for my job. So I installed it and learned it. I was producing useful results for my company very rapidly. Sure, a few years later my skills are more well-rounded and "pythonic", but really, so what? This is how every. single. one. of. us. learns.
My job is only interesting to the extent I am learning new things. You are only going to hire drudges if you require the applicant already know everything needed for the job. How boring. Obviously I'm not arguing for hiring a janitor to rewrite Google's deep learning from scratch; some ability is required. But picking up basic skills in a new language? Easy peasy. Nothing to it. That's our job.
> So how exactly is anyone supposed to gain new experience?
By having commits on Github numbering in the middle five digits, of course! The only downside is eventually employers will move on to some other arbitrary bullshit metric, and you'll have to start over.
If you're good at picking things up in a hurry, the best way to gain experience in new skills is to get hired for your existing skill set, either at a very small company or (better yet) a large one with incompetent management.
In a small company, everyone wears a bunch of different hats and you will often be asked to do new things outside your official role.
In a large company with bad management, staff turnover will ensure that they regularly find themselves short on skills, at which point you can stick your hand up to pick up the slack.
Either way, if you consistently deliver the goods then they'll keep coming to you with new challenges.
> but you're not going to get the same caliber of work from someone who first saw python two weeks ago compared to someone whose been using the language for 5 years on real projects.
Careful. You probably need to define your terms more clearly.
A CS student who knows Java (4+ years of experience) is going to turn out very different programs from a 20 year veteran of Erlang who is just learning Java.
Even with bad Java idioms, the veteran is very likely to be turning out much better code because he is thinking about the underlying architectural issues (failure modes, recovery, concurrency) with far more experience.
Yeah, I watched this play out in real time when I paired them. It was actually really enlightening and entertaining.
I also found out I didn't know as much about either Java or Erlang as I thought I did.
"Even with bad Java idioms, the veteran is very likely to be turning out much better code because he is thinking about the underlying architectural issues (failure modes, recovery, concurrency) with far more experience."
Careful with that assumption. While there may be instances where this is true, I've met many veterans that couldn't think outside the small specialty they had become locked into.
Idioms are powerful in that they shape how you think about a solution within a fixed language. They shape your thinking, and the shape of your thinking changes what solutions you can conceive of. A veteran who shows interest in a broad range of topics will have more failure experience and will be able to offer better results.
I too have found this assumption to not always be the case. At my company there are certainly some people who qualify as vets yet insist on writing purely procedural C-style code, and have yet to adopt modern paradigms like OO design simply because they are so far removed from their education.
To be fair, they might be on to something. I've programmed my whole life in OO and recently started looking into FP. I can see the veterans thinking that OO is a fad after seeing things like BeanFactoryFactoryFactory (I personally do not program in Java and have never encountered the use case for a "Factory", and as such do not know what the use case is). After that, what value do you get from adding the functions to structs? I personally see the value, but I also see the value in not doing that. I would recommend teaching them OO while you yourself study FP and the criticisms of OO. After that, it may be easier to evaluate which is better for your use case. It sounds like the vets have already cornered you out of OO, but you also definitely don't want to force the wrong tool onto a project just because it's the tool you understand best.
This makes me think of that study[1] about doctors being away on conferences. I think it likely comes down to recognizing that experience rarely, if ever, trumps solid analytical thinking.
> A CS student who knows Java (4+ years of experience) is going to turn out very different programs from a 20 year veteran of Erlang who is just learning Java.
Perhaps. But I would take issue with a CS student claiming 4+ years of "experience" with Java unless they were referring to something outside of class work.
When filling a position though, you're usually not comparing "was handed my degree yesterday" candidates with 20 year veterans. . . they're demanding much different compensation and degrees of responsibility.
A more apt comparison to my point would be: a 20 year veteran of Erlang who says "oh, yeah, I can pick Java up. Never seen it, but it'll just take me a couple of weeks." is not, in any way, going to produce the same quality code as a 20 year veteran of Java of relatively comparable experience and general skill.
I have seen a company pay someone $100/hour to learn a new language as part of a 6 month contract. They were producing better code than the company's staff within 3 weeks and finished the project ahead of time.
Granted, he was an EE not a software developer. But, deep knowledge of a platform is often more dangerous than helpful. The surface layer tends to be the least buggy parts of most systems.
PS: I wish more people heard "Let's use reflection!" as "Let's use regular expressions!" for similar reasons. Yes, it can help, but you can also dig a ditch using grenades...
I hear "Let's use reflection!" as "It's time to find something else to work on!" Outside of writing developer tools or unit test frameworks, I can't think of a worse code smell. It's a klaxons-blaring, we're-doomed kind of thing.
Every time I've seen it used, it turned into a quagmire of subtle regressions, strange effects at a distance, and extremely brittle code.
I think the best rules there are:
1. Thou shalt never write code that uses reflection on its own code base.
2. Thou shalt never use reflection unless all other options have been proven worse.
Which leaves mostly unit test frameworks for some reason...
I hear "Let's use reflection!" as "My language's tools for code reuse are hard to use and ineffective in practice, so I'm going to put annotations everywhere and use reflection instead! It'll be ten times harder to understand the code than it would have been with a little metaprogramming!"
But then, my opinion may be unduly influenced by Java vs Ruby.
Unfortunately you practically have to use reflection for certain things if you work in Go, although the language authors have tried to discourage it by making the APIs absolutely awful.
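The brittleness being described shows up in any language with runtime reflection. A minimal Python sketch (using getattr, with hypothetical names; not the Java or Go cases discussed above) illustrates why: the link between caller and target is just a string, invisible to compilers and refactoring tools.

```python
class Handler:
    def handle_ping(self):
        return "pong"


def dispatch(obj, message):
    # Reflection: look the method up by name at runtime.
    method = getattr(obj, "handle_" + message, None)
    if method is None:
        # Renaming handle_ping() would only fail here, at runtime --
        # no compiler or IDE rename tool sees the string-based link.
        raise ValueError("no handler for " + repr(message))
    return method()


print(dispatch(Handler(), "ping"))  # pong
```

This is also why such dispatch tends to survive mainly in test frameworks: there, discovering methods by name ("test_*") is the whole point, and the convention is documented rather than hidden.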
I don't disagree with the idea that becoming truly skilled in a language is more than a couple of weeks of work even for a great programmer, but more often than not you don't quite need that caliber of work from them right away.
Most job fills aren't going to be putting the new employee in charge of "greenfielding" the architecture of a brand new app, they'll be doing maintenance or build-out of an existing codebase, giving them plenty of time to ramp up on the fine details of a language while still being productive working with the existing code.
There's more to being a productive programmer than knowing the semantics of a language.
Someone who has been programming for a while but is learning a new language will have a ramp-up period when their code is ugly or unidiomatic, and they will take longer to write it, but it will capture the correct algorithms and abstractions. Compare that to someone who is inexperienced and writes a pile of exponential-time spaghetti that is slow and unmaintainable in any language.
Yes, and code reviews and other mature practices will get the person over that hump very quickly. Anyone can point you to the 'PEP8' of your language of choice.
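As a concrete (invented) illustration of that ramp-up gap: both Python functions below are correct, but one reads like transplanted C or Java, and it is exactly the kind of thing a code review fixes in one pass.

```python
def squares_of_evens_unidiomatic(numbers):
    # Index-based while loop: a habit carried over from C/Java.
    result = []
    i = 0
    while i < len(numbers):
        if numbers[i] % 2 == 0:
            result.append(numbers[i] * numbers[i])
        i = i + 1
    return result


def squares_of_evens(numbers):
    # The Pythonic version: a single list comprehension.
    return [n * n for n in numbers if n % 2 == 0]


print(squares_of_evens([1, 2, 3, 4]))  # [4, 16]
```

Note that both versions capture the correct algorithm; the gap is purely idiomatic, which is why it closes quickly with review, unlike a gap in underlying design skill.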
> They may be able to hack around, write a for loop, track down a bug....but you're not going to get the same caliber of work from someone who first saw python two weeks ago compared to someone whose been using the language for 5 years on real projects.
From the opposite side of things, someone who's only been using the same tools for 5 years is probably going to struggle picking up new tools. For the projects I work on, it's rare to use the same languages and tools for every project. I'd rather have someone that understands how to write good code in general and knows multiple programming paradigms instead of someone claiming to be a guru in a single language.
If someone knows Java and JavaScript well for example, is it really going to take them more than a few weeks to be productive in Python (especially if you have someone on the team helping with picking the appropriate libraries to use)?
I think his point is more that someone who's been exposed to a language for years but isn't interested is going to pay off much less in the long run than someone who's using it for the first time but enjoys learning. The reason he mentioned a few weeks of training for the language meant that they would be able to start making changes within a few weeks and be making major contributions by a few months. The fact that their first month or so may be relatively unproductive is less of a concern if they're constantly getting better and better.
That said, a few weeks of training plus exposure to a large codebase in a language should be enough for even the short term unless they're trying to learn a whole new paradigm or some huge frameworks/libraries.
I would like to add that interviewing requires quite different skills from engineering. A good tech interviewer has to be a good engineer, but a good engineer might not be a good interviewer. I have seen good engineers with a poor ability to read people and a strong "curse of knowledge" bias. They are more likely to fall for salesperson-type candidates.
Therefore, a hiring manager should be able to tell which engineers are good interviewers and give more weight to their opinions. Pay extra attention to interviewers who are more big talkers than listeners, if you have to ask them to interview. If someone can't be a good listener with their own teammates, I can't imagine they would listen to and observe a stranger with any substantial effort. Unless they are a genius at reading people, their feedback can be mostly ignored.
It's only a discussion if the person conducting the interview has some concept of what they ask. I've been to several interviews over the years where the HR person asked questions without any understanding, questions they were obviously handed by bosses. They then hurry to write down your every word of the response, the plan being to google key terms later rather than admit ignorance now.
I had an HR interview team recently ask me about a particular rule governing the admission of evidence in administrative or arbiter proceedings. I started by asking whether they wanted me to explain admission generally or specifically how it differed between the two... total blank faces. I asked whether I should first explain relevance... they all nodded.
"Explain it as you would explain it to your students."
People who have natural abilities chronically underestimate their effects. Some people, regardless of interest, will always struggle more with certain physical or mental tasks.
Interest is important, but ignoring ability is not the right way to interview.
Most interviewers don't ask enough technical questions to have any idea what a candidate knows or doesn't know. If their one or two questions happen to be something the candidate knows well, they'll call them a genius. If they happen to not know, they'll label them an idiot.
You can learn a lot more from 20+ rapid-fire questions than from forcing a candidate to eke out an answer to something they're not familiar with. And once you establish the areas they're familiar with, you can ask them truly useful questions.
The key is to look for people who have strengths and not worry at all about gaps in their knowledge. Anyone who has earned genuine expertise in one area will be able to do so in other areas.
Along these lines, I start out with very broad questions. Something like, "tell me how you'd troubleshoot a web service that's suddenly not accepting connections / suddenly performing badly." Different candidates will focus on different aspects of that problem depending on their background: low-level networking, cloud environments, application-level problems, databases, etc. Based on their resume, I like to see if their expertise matches up with their experience.

I like to see that they don't consider ONLY things within their expertise. I like to see if they can make reasonable guesses outside of their expertise without BS'ing me. I like to see if they have a good approach to exploring areas they're unfamiliar with, and to general problem-solving. I like to see if they recognize how valuable it is to have diagnostic / monitoring / change control tools in place and to have done proactive testing.

This can be tough to do with coding problems, but I still try to give problems that should be familiar to experienced low-level C developers as well as high-level Python web devs, and I tailor my expectations to their experience. I'm more concerned with how well they've learned based on what they've done than with whether they've already learned what I'd like them to do.
At my favorite tech interview we debugged a real prod issue they had a few months before. The interviewer spent a few minutes sketching the basic architecture of the system on the whiteboard and then started with the customer complaint:
From there I explained my debugging steps, and he acted as an oracle whenever I took an action:
Me: "Have messages been lost or does a page refresh always fix it?"
"Messages haven't been lost"
Me: "I'd check our logs for any obvious errors"
"Nope, everything appears normal"
...
Me: "What sort of logging do we have with the websocket vendor?"
"They have a live console but don't provide any persistent logs"
Me: "Can we scrape that to get logs we can correlate to the errors?"
"We did that, didn't find any errors around the time a user had an issue"
....
Me: "Can we try X Y Z to reproduce?"
"When we did that we discovered that the disconnect only happens after a user has opened a navbar menu."
...
"As it turns out there was a click handler on all navbar buttons that disconnected from the websocket. The buttons used to directly link to different pages, now some of them had submenus and opening that submenu caused chat to hang."
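The root cause in that story can be sketched in a few lines. This is a hypothetical reconstruction (the actual code wasn't shown): a shared click handler that was safe when every navbar button navigated away, but silently kills the chat once some buttons become submenu toggles.

```typescript
// Hypothetical model of the bug described above. Originally every
// navbar button navigated to a new page, so disconnecting the chat
// websocket on click was harmless: the next page reconnected anyway.
type Chat = { connected: boolean };

function disconnectChat(chat: Chat): void {
  chat.connected = false;
}

function onNavButtonClick(chat: Chat, navigates: boolean): void {
  disconnectChat(chat); // bug: fires even when no navigation happens
  if (navigates) {
    // the browser would load a new page here, and chat would reconnect
    chat.connected = true;
  }
}

const chat: Chat = { connected: true };
onNavButtonClick(chat, false); // user merely opens a submenu...
console.log(chat.connected);   // ...and chat is now hung
```

The fix is equally small: only disconnect in the navigation branch, or attach the handler only to buttons that actually navigate.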
That's fair - I wouldn't be a good judge of this if I didn't have really good breadth myself. Often a candidate in his own area of expertise will go beyond what I can fairly judge, and I end up having to do a bit of reading afterwards to confirm and/or assume they were correct.
I have a similar question to this that I like to use - "Tell me how software gets delivered where you work now (or a place you've seen it done well). Idea to Customer."
Where the candidate starts, stops and goes into detail are usually quite interesting.
I have been to lots of interviews, on both sides of the table. I find most interviewers unprepared to evaluate the person for the role; instead they exercise their own biases, stroke their egos, etc. It's largely a voodoo practice that we'll look back on and laugh at as a civilization at some point.
I wonder how many employ Kahneman's recommendation based on his book, "Thinking, Fast and Slow":
> Suppose that you need to hire a sales representative for your firm. If you are serious about hiring the best possible person for the job, this is what you should do. First, select a few traits that are prerequisites for success in this position (technical proficiency, engaging personality, reliability, and so on). Don't overdo it — six dimensions is a good number. The traits you choose should be as independent as possible from each other, and you should feel that you can assess them reliably by asking a few factual questions. Next, make a list of those questions for each trait and think about how you will score it, say on a 1-5 scale. You should have an idea of what you will call "very weak" or "very strong."
> These preparations should take you half an hour or so, a small investment that can make a significant difference in the quality of the people you hire. To avoid halo effects, you must collect the information on one trait at a time, scoring each before you move on to the next one. Do not skip around. To evaluate each candidate add up the six scores ... Firmly resolve that you will hire the candidate whose final score is the highest, even if there is another one whom you like better — try to resist your wish to invent broken legs to change the ranking. A vast amount of research offers a promise: you are much more likely to find the best candidate if you use this procedure than if you do what people normally do in such situations, which is to go into the interview unprepared and to make choices by an overall intuitive judgment such as "I looked into his eyes and liked what I saw."
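Kahneman's procedure is mechanical enough to write down directly. A minimal sketch, with made-up trait names and scores (the book prescribes only the method: six independent traits, each scored 1-5, summed, highest total hired):

```typescript
// Structured scoring per Kahneman: score each trait independently,
// sum the six scores, and commit to hiring the highest total.
type Scores = Record<string, number>; // trait -> score on a 1-5 scale

// Hypothetical trait list; the book says to pick your own six.
const traits = ["technical", "personality", "reliability",
                "communication", "judgment", "initiative"];

function total(scores: Scores): number {
  return traits.reduce((sum, t) => sum + scores[t], 0);
}

function pickBest(candidates: Map<string, Scores>): string {
  let best = "";
  let bestTotal = -Infinity;
  for (const [name, scores] of candidates) {
    const t = total(scores);
    if (t > bestTotal) { bestTotal = t; best = name; }
  }
  return best; // resist the urge to override this ranking by gut feel
}

const candidates = new Map<string, Scores>([
  ["alice", { technical: 4, personality: 3, reliability: 5,
              communication: 4, judgment: 4, initiative: 3 }], // 23
  ["bob",   { technical: 5, personality: 5, reliability: 2,
              communication: 3, judgment: 3, initiative: 3 }], // 21
]);
console.log(pickBest(candidates)); // "alice"
```

The point of the mechanism is the commitment: the ranking is computed before the halo effect or an overall intuitive impression can reorder it.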
But just to be a little bit contrarian for the sake of conversation: has it been shown that ignoring your gut instinct, or even who you like most, is inherently wrong? That kind of testing definitely removes a lot of bias, but it just assumes that biases are inherently bad.
There certainly ARE bad biases (sexism, racism, etc). But you may also need to work with the individual you're recruiting, so how well you think you can get along with them or work with them is likely important too.
Plus if this method was common, do you think candidates would prep specifically for those scores? Seems like something very easily gamed. Not to mention that scoring itself can be biased.
As I said at the start, I actually like that, and think they're on the right track. But I'd likely want to combine it with something LESS subjective to get a fuller picture of a candidate.
Your question is one I've been exploring since the New Year.
One thing to consider is that the gut is a collection of relationships and groupings based on experiences and cognitive processes.
This is great for some circumstances. I don't need to take a tally of all the stats to ensure I'm making the most educated choice (based on reviews, calories, health safety, longitudinal studies, etc.) of which sandwich to buy at the deli.
On the flipside, our gut can betray us. One of the chapters of Malcolm Gladwell's Blink tells the story of New York cops whose instant reaction caused the wrongful death of an innocent man, because their heuristics, forming assumption after assumption, were just flat-out imprecise. Signs that should have flagged them to check their assumptions actually only reinforced their view. From their perspective they truly believed their conduct was in line with what they expected. In Thinking, Fast and Slow, Kahneman provides numerous gambles where our intuition guides us to make suboptimal or imprecise choices due to loss aversion, the endowment effect, et al.
I have to go, so I don't have a perfect answer for you. But I will provide two Kindle highlights from a book I'm reading called Rational Choice in an Uncertain World:
> Our decision-making capacities are not simply “wired in,” following some evolutionary design. Choosing wisely is a learned skill, which, like any other skill, can be improved with experience. An analogy can be drawn with swimming. When most of us enter the water for the first time, we do so with a set of muscular skills that we use to keep ourselves from drowning. We also have one important bias: We want to keep our heads above water. That bias leads us to assume a vertical position, which is one of the few possible ways to drown. Even if we know better, in moments of panic or confusion we attempt to keep our heads wholly free of the water, despite the obvious effort involved compared with that of lying flat in a “jellyfish float.” The first step in helping people learn to swim, therefore, is to make them feel comfortable with their head under water. Anybody who has managed to overcome the head-up bias can survive for hours by simply lying face forward on the water with arms and legs dangling—and lifting the head only when it is necessary to breathe (provided, of course, the waves are not too strong or the water too cold). Ordinary skills can thus be modified to cope effectively with the situation by removing a pernicious bias.
> The greatest obstacle to using external aids, such as the ones we will illustrate in this chapter, is the difficulty of convincing ourselves that we should take precautions against ourselves as Ulysses did. The idea that a self-imposed external constraint on action can actually enhance our freedom by releasing us from predictable and undesirable internal constraints is not an obvious one. It is hard to be Ulysses. The idea that such internal constraints can be cognitive, as well as emotional, is even less palatable. Thus, to allow our judgment to be constrained by the “mere numbers” or pictures or external aids offered by computer printouts is anathema to many people. In fact, there is even evidence that when such aids are offered, many experts attempt intuitively to improve upon these aids’ predictions—and then they do worse than they would have had they “mindlessly” adhered to them. Estimating likelihood does in fact involve mere numbers, but as Paul Meehl (1986) pointed out, “When you come out of a supermarket, you don’t eyeball a heap of purchases and say to the clerk, ‘Well, it looks to me as if it’s about $17.00 worth; what do you think?’ No, you add it up” (p. 372). Adding, keeping track, and writing down the rules of probabilistic inference explicitly are of great help in overcoming the systematic errors introduced by representative thinking, availability, anchor-and-adjust, and other biases. If we do so, we might even be able to learn a little bit from experience.
> These preparations should take you half an hour or so, a small investment that can make a significant difference in the quality of the people you hire.
Sounds more like it would take thousands of years if not eternity to prepare, as it involves a) accurately identifying personal characteristics related to professional success and b) predicting the future.
On a Manager Tools podcast episode they talk about hiring against a standard. First, have a standard (the six dimensions you mentioned fit the bill). Make every interviewer ask the same, or very similar, questions. Use the standard as your only source, and evaluate candidates against that. Hire the strongest one.
They ask interviewers and hiring managers to ignore "future potential". Humans are incredibly bad at predicting the future. Even if you think you can say how a person will perform in 5 years, the role might change, the market might change, customers preferences, personal developments, etc...
Same here, the problem is that it takes a lot of time and effort to set up and run a meaningful standardized hiring process (and is very hard to outsource) and also people don't even consider planning for it, thinking they can just ask some questions in a couple interviews.
For more junior candidates I'd go the "take home test" route, for senior candidates I don't see any other solution than sitting down and design the process properly.
I have to say, I hate take home tests. They move all the burden of time commitment onto the candidate. And even if the question says it should take 2-3 hours, there's a game theory situation where you have to assume others are spending more than the recommended time to make their answer more polished, forcing you to spend more than the recommended time.
Finally, companies never pay you back appropriately for your time investment. If you fail the question, they don't give you a detailed report of why you failed. I'll always politely turn down a take home test.
Take home tests FTW. The thinking goes as follows:
1. If the candidate can't be bothered to complete a 2-4 hour (depending on claimed seniority) code test in the language of their choice, we can't be bothered to talk to them.
2. If the candidate does reasonably well by completing the code test somewhat on time (with a fat margin allowed for them, well, having a life) and within parameters of the task, they're invited for a mostly non-technical onsite meet-and-greet.
3. During the meet-and-greet we make sure that the candidate isn't an axe murderer, is able to hold a quasi-technical conversation, and that both sides aren't immediately scared of each other.
4. The meet-and-greet can also include some low-key architecture discussion. Any nerds worth their salt will be able to conduct this line of questioning without making it obvious that an interview is taking place. Hopefully this isn't a critical step, as a good take-home code test will require the candidate to spend a little time designing or architecting their solution.
After the above has taken place, it should be pretty clear whether the candidate in question is a fit or not. Note that this process is by design missing the useless traditional CS questioning component, contrived problem solving exercises, or a whiteboard code beatdown.
>1. If the candidate can't be bothered to complete a 2-4 hour (depending on claimed seniority) code test in the language of their choice, we can't be bothered to talk to them.
A good reason not to work at your company. Why would I want to invest 4(!) unpaid hours into something where I am not even considered seriously yet?
I recently had a coding challenge that was not only vague, but also took up two hours of my time. The end result was... nothing. Not even a "thank you, but we chose someone else." Ever since then, I've chosen not to bother with long coding challenges anymore.
Would you have preferred to spend a whole day on an inconclusive onsite interview? Or perhaps a phone screen during which you're asked to implement a hashtable for the umpteenth time?
As I said in another comment somewhere in this thread, one day people will learn to do this right. Hopefully this will happen before programmers as a profession have decided to never take code tests again.
Presumably you (the employer) have less time to do on-site interviews than to assign take-home projects, so landing a real interview is a much better indicator that my (the job-seeker) time will be well-spent. (Hiring is a two way street.)
At least an onsite interview typically includes an expensed lunch, if not dinner, and maybe an opportunity to visit a city (or a part of a city) you're less familiar with, and possibly even sight-see a bit.
I don't think a phone screen is productive beyond a half hour to an hour, and anyone asking for technical implementations over the phone is likely running a worse interview than trying to whiteboard things (and that's still far removed from actual programming).
It's not just the investment. In a 4 hour take-home I learn close to nothing about the company I am applying for. In a 4 hour on-site I see the workspace, talk to employees, etc.
People usually have a coffee date first, before doing dinner. Here, you're doing dinner first, and then lunch. The first pass should be minimal in commitment, and give both sides a chance to get to know each other. They pass fizzbuzz or something doable in 15-30 minutes, you have a short convo about the company, and then you move on to the second pass.
You are right that a whole-day onsite interview is more time consuming. But personally, I think I can do better in a face-to-face evaluation than on a coding challenge where I have to guess the test cases from some vague description. In fact, face-to-face can show the interviewer how I think and what kinds of questions I ask to figure out how to solve an issue.
Maybe it is my lack of interview experience, since I've only had 3 or 4 interviews so far, but only 1 of them left me with a good experience after the coding challenge (even if I did not get an offer yet).
Do well in the take-home test? Come in, meet the team, do the culture-fit thing. Then you get an offer. And this doesn't need to be an all-day thing. Who wants to keep selling the company to a candidate all day long? :)
As someone currently preparing to interview with Google I'd take 4 hours implementing real code over a (recommended) 2 weeks of basic algorithms, data structures, math, etc. revision.
You strike me as the type who would also balk at fizzbuzz and other 'typical' interview questions. What do you propose? Say you have to interview a candidate for your own startup? How would you go about it?
I agree with the parent that take home projects (as the initial filter) are a non-starter. It's because they're flat out abusive. The employer uses their position to offload the cost of their hiring onto job seekers. The fact that our first interaction is exploitative couldn't be a bigger red flag.
However, I am completely fine with fizzbuzz tests, even take-home fizzbuzz. But this literally needs to be something that will take no more than 10 minutes, and for most people a couple of minutes tops. And the question should include a disclaimer: if this takes you more than 10 minutes, this won't be a good fit, so save yourself the trouble.
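For calibration, the canonical FizzBuzz screen being referred to is a couple of minutes of work, which is exactly the point of using it as a filter:

```typescript
// Classic FizzBuzz: multiples of 3 print "Fizz", multiples of 5
// print "Buzz", multiples of both print "FizzBuzz", else the number.
function fizzbuzz(n: number): string {
  if (n % 15 === 0) return "FizzBuzz";
  if (n % 3 === 0) return "Fizz";
  if (n % 5 === 0) return "Buzz";
  return String(n);
}

for (let i = 1; i <= 15; i++) {
  console.log(fizzbuzz(i));
}
```

Anything that takes a working programmer substantially longer than this stops being a filter and starts being unpaid work.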
I actually don't mind the idea of take home projects, but only after significant investment on the part of the company. They need to have something on the line as well if I'm to devote significant time to it. Paying a good hourly rate for the time to complete it is the most straightforward way.
I actually like fizzbuzz and stuff like that. My most recent and best coding challenge experience was one from Cloudera. It was 2 questions and I had 70 minutes time in total. The difficulty was not something crazy. To me it felt like something you would get as a homework in Uni, but slightly more complex. I was able to finish it in 40 or so minutes and even had fun doing it, because I did not get frustrated by lack of details in the problem description.
But when a company has me spend 2 hours on some weird platform that tells me 3 out of 10 test cases pass, but doesn't tell me why, or even what the input is, I just get frustrated. One of the worst coding challenges asked me to implement cd (change directory) functionality. But the details were so bad that I had no idea whether I had to invest time into edge cases, etc. I basically had to guess the test cases, and eventually failed since I ran out of time.
Startup A asked me to do a take-home programming challenge to prove my skills, as well as a general algorithm test to give them ideas how to run their core architecture. This took hours, I felt I was being asked to do their job for free (on the second test), and the results were pathetic.
Startup B gave me an assignment to add a feature to their API and have my work merged with their active code base. I was compensated $300 for my time. The feature worked, although I don't know if they still use my code from 3 years ago.
In the end, I worked for neither. But if I were forced to choose between them, I know my answer in a heartbeat.
Option B is explicitly prohibited in the employment agreements of most salaried workers. I would never consider working for a company that required me to violate a contract just for a chance to work there.
In my experience this hasn't been the case. The closest I've come was an employment contract that required me to notify my employer of any work on the side. This has happened only once over more than a decade of work in this industry.
GitLab has an employment offer example up on their website¹.
Here's the part that's relevant to the discussion:
> While you render services to the Company, you will not engage in any other gainful employment, business or activity without the written consent of the Company. While you render services to the Company, you also will not assist any person or organization in competing with the Company, in preparing to compete with the Company or in hiring any employees of the Company.
I'm sure we can have both charitable and strict interpretations of "while you render services to the Company" but only a lawyer can probably tell what that means. Did you not have such a clause?
I do not think we would make it a problem if people did a paid technical interview somewhere else; prohibiting that doesn't make sense to me. On the other hand, it is hard to write an exception for this, but merge requests are welcome.
Our technical interviews are work on real-world problems; the work is not paid and the code is open source. You can opt for an alternative if you want, see https://about.gitlab.com/jobs/
"During your Employment, you shall devote your full business efforts and time to the Company. During your Employment, without written permission from the Chief Executive Officer, or one of his direct reports, you shall not render services in any capacity for any other person or entity and shall not act as sole proprietor or partner of any other person or entity..."
As I understand it, this is pretty much the nature of being salaried (as opposed to hourly).
"Most"? Anecdotally and all that, but I never signed a contract that stated that, across ~5 jobs. One tried and I got it redlined out before I started.
An anecdote to add: across 20 years of experience (and many jobs), I've never not had a moonlighting clause in my employment agreement.
As someone who hires, I'd be worried about the IP implications of even suggesting I pay them for outside work prior to them leaving their prior commitments.
Huh, interesting. I think I'd just walk if presented with a firm one. I constantly have other things going on, even when FT employed, and honestly no employer is going to pay me enough to constrain my options.
I work in higher ed now, and I can do whatever I want outside of normal hours, as long as it doesn't interfere with the work I was hired to do.
One benefit of having firmly rejected Silicon Valley is that I'm no longer at the whim of jealous employers who know I could find another job across the bay at any time and keep my home and social circles.
This is true of work you do on company equipment during work hours. If things were as you describe, no one would ever be able to do open source, let alone sign standard FOSS CLAs or the like.
Don't over-generalize: both of the big tech companies I've worked at in the last 10 years have this clause and require _explicit_, specific approval for any outside work, open source or not. Otherwise, yes, according to the contract it is their property. The enforceability of this is debatable, but it is most definitely something that happens.
Saying they are not enforceable & saying that they are completely risk free for a potential hirer who wants to pay a nominal payment for coding exercises are 2 different things, especially considering that conflict of interest laws are enforceable.
As for the ethics of it, I find it odd that you're fine with working for someone who uses something you are vehemently against, but hey, to each their own.
> 1. If the candidate can't be bothered to complete a 2-4 hour (depending on claimed seniority) code test in the language of their choice, we can't be bothered to talk to them.
The approach you describe strikes me as shortsighted as well as to no small degree selfish. I don't do take-home tests, as a rule, for one overriding reason: my time is too valuable, and I have so much less of it to expend in discretionary fashion than any potential client or employer, to spend it on a one-way interaction such as a coding test. I have a Github profile with more than enough stuff on it to demonstrate proficiency in some areas and mastery in others, and I learn nothing about your company by completing your test that will help me better understand if I want to work for you. If I really want the gig, maybe then I'll do a little additional work to demonstrate, in good faith and after you have likewise established that you are acting in good faith and aren't just cattle-calling me, that yes, I'm interested. But before? No way. Your company is just not special.
Conversing with a potential client or employer, on the other hand, is a two-way street. I learn things useful to me both in terms of deciding on a gig and things that are more generally useful, like the war stories that inevitably get swapped during those conversations. It helps me decide whether your company is worth it to me and, right now and in this market (for clued-in people), that's the biggest question: not "do we want this person to work for me?", but "do I want to work for this company?". Not operating with that in mind seems like a very, very poor idea to me.
> I have a Github profile with more than enough stuff on it
Based on my experience only, you're in the minority here. Most candidates I've interviewed have barely anything on GitHub. Some UI/UX-ish people will have a portfolio, which involves a lot of "view source" and isn't too rewarding. Lots of perfectly good working programmers have no active open source participation, not even a "dump of code" type of GH profile, making it difficult to tell wheat from chaff. Having them do code tests is pretty much the only way that doesn't involve talking to every single one.
Look at the other side, too: I've gone through hiring processes in the past as a candidate that started with a 30-45 minute conversation with the hiring manager, and proceeded to the onsite interview stage. This inevitably ended up being the classic 6-hour beatdown, with both sides muddling through without truly understanding what either is doing. These are usually so inconclusive (especially at bigger companies) that no one can make a decision, so they end up saying "no". Now both sides are 6+ hours in the hole. This could have been a simple 2-4 hour asynchronous code test, which would have provided hard information to the hiring team instead of the usual inconclusive bullshit.
> Having them do code tests is pretty much the only way that doesn't involve talking to every single one.
I get where you're coming from. But to me that's still a very, very selfish way of approaching this. You are asking for a blind burn of four hours of their time just to consider talking to them. Four hours is a lot of time. It's half a workday. You're asking for a hell of a lot just to not-a-culture-fit them out afterwards. And from a practical perspective, somebody who's good and employed is unlikely to expend the effort unless you're a very special company, too, which complicates the search.
> I've gone through hiring processes in the past as a candidate that started with a 30-45 minute conversation with the hiring manager, and proceeded to the onsite interview stage. This inevitably ended up being the classic 6-hour beatdown, with both sides muddling through without truly understanding what either is doing.
See...this I don't get. I know within an hour or so of talking to somebody if I want to work with them, and past that literally everything else, aside from compensation, is belaboring the point. And I mean "an hour"; if I'm not enjoying the process at that point, I'll usually suggest we end the interview. But on the flip side, because I am a good community member and I network heavily, I might in those situations go "hey, I think this isn't a good fit, but I want to introduce you to my friend who I think might be a better fit." I've done this before, and I can't do that (and wouldn't be inclined to do it) after you've burned half a workday on a monkey-work project.
I think there are reasonable ways to get a demonstration of work, but they involve investment and give-a-shit that I think most companies are way too happy to be bad cultural citizens to consider. Like, here's a coding test I'd be happy to do: "add this feature to this open-source project you probably use every day", well, that'd be different. I'd be getting something for my time, too, both in terms of feature and in terms of public recognition for open-source contribution. But one-way, dance-monkey-dance, throw-it-away-after stuff? That's very selfish, and I think we should be better than that.
I wish there was a way to scale the "add feature to FOSS project we use" thing. Scale is a concern here because I see this exercise being doled out more than a few times, which means that every instance must involve someone deciding what features should be added. There's also a limit to how much can be added to the same handful of projects over time.
Can't ask all candidates to add the same feature over and over, or else this becomes throwaway code as well, which doesn't address any of the valid negatives you've raised.
> I know within an hour or so of talking to somebody if I want to work with them
Ha ha! I've had an HR department at a BigCorp tell me that there was a minimum duration to each interview (30 minutes for phone screen, at least 45 but preferably an hour for in-person), regardless of how they were doing. Their thinking was that more time spent by us would cause the candidate to consider us a more desirable employer. You can imagine my anguish at having to speak endlessly with people who would clearly never cut it on my team.
> Can't ask all candidates to add the same feature over and over
Ehh. Make a list of, say, ten features across the stuff you use. You really think that's not going to be enough for fifty candidates, if the state of the industry is as apocalyptic as everyone says it is?
I don't know, I personally consider an all-day onsite interview to be a bigger waste of my time. With two or four hours spent on a take-home test, those can be any time during the day, probably time that I would have been wasting anyway. An interview has to happen during business hours, which means I have to take one of my scarce and therefore precious vacation days, and then actually go somewhere which is likely to take two hours round trip anyway, and that commuting time is entirely wasted. At least a work sample test can be an opportunity to practice my skills and an interesting challenge in itself. Then again, I actually used to do programming contests back in school so maybe that's just my personal preference.
Well, there's a pretty big difference between 2 hrs and 4 hrs. Most people are applying to multiple places when they are looking for a job. So let's say 3 places a week. Now this is the difference between 6 hrs (already a lot) vs. 12 hrs (practically insane) per week on this stuff.
And at the end of all that work, the candidate receives a binary yes/no. 1 bit of information. On the other hand, the 6 hour 'beatdown', as you term it, gives the candidate an incredible wealth of information. They see what their prospective coworkers are like, what the office is like, how people interact with each other, maybe what people talk about over lunch. So 3 interviews with 1 bit of information (12 hrs) each or 2 on-site interviews (12 hrs) with a hell of a lot? What's the rational choice?
And I'm not against programming assignments, but 4 hrs each?
Nearly all of the work I do for my employer is on public github. I always wonder though because if you just went to my personal github page it's not necessarily obvious that my Day Job is just a click or two away as these are contributions to another public repo, not one of my own. My own personal repo collection has close to nothing related to any job I'd ever want.
How do you feel about the all day interview process some large companies are currently using? Keep in mind you probably had 1-3 phone + computer interviews before the in person interview.
A take home test seems like a less time intensive process as you at least don't need to take time off from your normal job to do it.
> How do you feel about the all day interview process some large companies are currently using?
I don't go for the all-day ones (I just decline; after my last Google interview I'm done putting myself through the wringer for a whateverjob), but I really do enjoy in-person interviews. Like, last time I was looking for a FT gig I did at least a phone interview at 35 places and an in-person at 20. I enjoy interacting with people. I get a great lay of the land of what the market looks like, what's not just "cutting edge" but what is, in practice, the leading edge of what people are actually trying to use in anger. This is, in addition to just learning about a given company, why I view in-person interviews as a two-way street. After my last FT company shuttered, I moved into consulting largely because I have a better view of the world than most people working for individual companies. I've learned, from talking to a ton of people, many things that might even seem like offhand facts from idle conversation that have helped me. Like, I happened to remember that D3 instances exist in AWS from a chat with somebody on one of those interviews and in the process I saved a client $20K/month in EBS costs.
Maybe it's a Shockingly Minority Viewpoint, but I think that, in a healthy business relationship (and it's important to realize that employer/employee is a business relationship), both sides get something, even if it's the short relationship of an interview.
I love interviewing. Like, I'd do it as a hobby if I could. That said, my own preferences are not likely to be salient to the greater community of potential hires. So, I have to put them aside when designing hiring pipelines.
Having given this problem a lot of thought and experimentation, I've come to realize that designing good work sample based hiring pipelines is very difficult. Designing hiring pipelines based on something else is impossible.
At my company, we use take-home problems to answer: "is this person worth interviewing?" Can you take a simple problem and code us an equally simple solution? It's basically FizzBuzz.
I rarely have people balk at doing this, but when they do, I'm happy to look at other code samples. I just need to know if they can write code.
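For scale, a FizzBuzz-level screen amounts to roughly this much code; a minimal Ruby sketch (the function name and exact prompt here are made up, not our actual test):

```ruby
# Classic FizzBuzz: for 1..n, multiples of 3 become "Fizz",
# multiples of 5 become "Buzz", multiples of both become "FizzBuzz",
# and everything else stays as its number.
def fizzbuzz(n)
  (1..n).map do |i|
    if i % 15 == 0 then "FizzBuzz"
    elsif i % 3 == 0 then "Fizz"
    elsif i % 5 == 0 then "Buzz"
    else i.to_s
    end
  end
end
```

Anyone who can write code at all produces something in this ballpark quickly, which is exactly why it works as a first filter.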
Totally. And I mean, I don't think I'm unreasonable. If it's the kind of thing I can reel off the top of my head or open `pry` and bang out in a couple minutes, I'll usually indulge. It's the multi-hour odysseys that are a big nope.
I honestly feel like take home tests are "busy work" and just as volatile as any other sort of technical interview for showing actual skills versus a real world environment.
2-4 hours is a lot of time to ask of somebody, regardless of their circumstances. Would you pay a contractor's salary rate for that work? Are these take home tests results worth $100/hour? $40? Every take home test I've encountered I've wished someone would pay me for that time, based on the opportunity costs of what I could have been doing, but maybe I'm cynical and pessimistic.
As a huge proponent of work sample hiring, I'd pay someone a couple hundred dollars on completion of a take-home test if it were at all feasible in the jurisdictions I hire in.
But the vast majority of the developers I want to hire are encumbered by moonlighting clauses that would likely cause them major personal issues if we violated them. Seems a bad way to get on the appropriate footing with a potential new hire.
During my last job hunt, I had an 8 hour take-home assignment that could be finished over the course of a week. They paid me $400 to do it and all I did was send a PR with some refactoring on an overgrown controller. I wound up not taking the job, but I remember being really impressed with this process and wish more companies would value a potential new employee's time like that.
I was bummed out recently when I did a take-home test for a local start-up, which asked me to build a full web app. I spent 4-6 hours on it making sure it was perfect, just to turn it in and get told I did a great job, but that they'd decided they didn't really want to hire any non-senior engineers and were just testing the waters.
This is happening a lot in Seattle recently: the job market here is terrible for anyone except senior devs (for whom it's great!).
Poor execution doesn't invalidate the idea. Happened to me too, although on a smaller scale. 4-6 hours is a bit much for a take-home evaluation project. If it can't be reasonably done by a distracted person in 4 hours, it's too hard/too long for a take-home test.
Hopefully people will learn to do this right over time. Or, someone might solidify a common platform for doing this, a-la HackerRank or any of the other similar services.
I would be surprised if a well constructed 2 hour test wasn't able to give you enough information on their structure and approach.
4-6 hours probably results in the candidate filling in a lot of basic details to get something to actually work that will tell the interviewer little more than the 1-2 hours spent implementing the core parts of the task.
I have been interviewing quite a bit lately, and I have found that my favorite interviews are take home tests that are representative of the kind of work I would be doing at said company. I am more comfortable with that style and I perform better.
More than that though: it gives me an idea about what it will be like to work at the company. If my only experience is studying algorithms and coding on a white board, I have a hard time knowing whether or not I will enjoy working at this company for years of my life.
A promise that you can work on "interesting problems" is hardly enough. I'm curious why more companies don't take the opportunity to sell themselves.
Anyhow, I've hung a whiteboard in my room and try to practice for interviews as much as possible. It's incredibly draining and produces a feeling completely opposite to how much fun I have building things. This feels like getting started on the wrong foot.
Despite what others are saying here I think take-home tests are a great idea. What is a terrible idea however is requiring that a candidate complete your take-home test, that you wrote and will administer. It is simply not reasonable to expect a candidate to complete such an exam for every company he applies to, oftentimes before he even talks to a human being and usually before he has a face-to-face or a phone interview with an engineer. There simply aren't enough hours in the day, even if you're unemployed.
And there aren't, to my knowledge, any really good and widely-used sites that a prospective job-seeker could use to establish his skills, which any random company likely will look at and accept as canon. And so we're all in a bit of a bind.
I assume the take home test is after you have talked to them? I wouldn't spend multiple hours unless it was already clear there was some desire to move further
This kind of depends on where the candidate is coming from. Those who're applying directly are obviously already expressing some desire to work with us. Recruiters should be instructed to give a consistent narrative, too. Good recruiters will also pitch the code test as a sign that we're a serious employer and take people seriously.
I don't like phone conversations that serve as the first line of candidate triage. Many big-company HR departments will have engineers phone screen anyone who sounds marginally suitable. This is soul-crushing, pointless busy work. My personal rejection rate of such candidates is somewhere in the 90-95% range. These are cold leads, not candidates. They should be taking the code test before talking to anyone in engineering.
A 4-hour test when you're still rejecting 90+% of people is ridiculous. You're basically turning away anyone that is talented ahead of time.
I personally know two programmers who didn't create a resume for 20 years and still got job offers. Remember, anything more complex than a phone screen is going to start turning people away.
These can be great. Provided that they're not used as a first-pass filter and provided they give some insight into the work the person will actually be doing. Because otherwise you're going to lose a lot of talented people who can't be bothered, especially in this market.
In a recent job hunt, I found cases were used as an excuse for the company to invest 0 time in the process.
It was so bad that with one company (a unicorn, if it matters) I replied with a bunch of questions I had to fill out the gaping holes in the case, and they instead opted for an in-person interview to avoid investing the time in the replies! In two other cases, they requested a 4-6 hour time commitment. I told them I was incredibly busy and was more than happy to put 1-2 hours into the case and return it as a work in progress. Both companies said to put as much time as I wanted into it, but then decided that the case wasn't complete enough to continue.
This last round taught me that I won't do cases as a first pass filter as a matter of principle. If you want me to do a case, invest time in me first so that I know we both have skin in the game.
These cases are really just optimizing for people with more time than others.
That's why you get them to talk you through their process during the face-to-face interview. It'll be immediately apparent if they didn't do the work themselves.
One of the best interviews I've had as a candidate (didn't get that job - it's the one that got away) involved me explaining design choices I made while implementing my solution to the take-home test. That was a great discussion, and didn't feel like the traditional interview-style questioning.
We ask candidates to submit code samples of their own choosing, or if they have nothing that they can supply for whatever reason we give them a short challenge which is only used for generating code to review.
We bring them onsite, review the code, and have them walk through what's going on. If they blatantly copied stuff or handed in someone else's code, they won't be able to stand up to scrutiny. And if they really do know it inside and out, they probably could have written it anyway. We've had a small number of candidates who tried to pull a fast one, but very, very few.
That has always been a concern, but in practice I've rarely seen it happen. Even for common-ish problems (Dijkstra's algorithm, GCD, chess) I've seen many, many diverse solutions that were obviously hand-coded by the candidate. I recall one case where we suspected plagiarism based on a Google search. If my memory serves me right, we rejected the candidate and moved on.
Uh... If the solution is readily available on Google, and they can solve the problem effectively with that solution, then they didn't plagiarize. They operated exactly the way they should work in a real world scenario. The worst kind of employees are the ones who insist on re-inventing the wheel every chance they get, instead of using industry standard libraries.
It was worse than what you describe. It was literally a wholesale copy/paste of a related but not-quite-right solution. The code test involved designing a data model for chess, but the candidate took an implementation from elsewhere that didn't give us the opportunity to evaluate their modeling skills (i.e. "I will create these classes with these fields and connect them like so").
That sounds like an excessive test. What kind of code are you guys writing? Why is it not okay for him to use implementations from elsewhere? Should he really reinvent the wheel to try and impress you?
Reminds me of the time I got asked to write a merge sort algorithm in Ruby. I wrote array.sort! on the board, the interviewer told me I have to rewrite it from scratch, and I was done with the interview. Because we create abstractions and reusable implementations for a reason.
Understanding how mergesort is implemented can be useful. I've had interviewers ask me to sort a data set that is too large for a single hard disk to contain. Mergesort is like map/reduce; it can be distributed.
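For reference, the from-scratch version really is only a handful of lines; a Ruby sketch of one common recursive formulation (not necessarily the answer that particular interviewer wanted):

```ruby
# Recursive merge sort: split the array in half, sort each half,
# then merge the two sorted halves in order.
def merge_sort(arr)
  return arr if arr.length <= 1
  mid = arr.length / 2
  merge(merge_sort(arr[0...mid]), merge_sort(arr[mid..-1]))
end

# Merge two already-sorted arrays by repeatedly taking the
# smaller head element; append whatever remains at the end.
def merge(left, right)
  result = []
  until left.empty? || right.empty?
    result << (left.first <= right.first ? left.shift : right.shift)
  end
  result + left + right
end
```

The merge step is the part that distributes: each node sorts its own shard, and the sorted runs are then merged, which is why the technique extends to data sets larger than one disk.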
But have you ever had to reimplement merge sort from scratch in any real-world scenario? Of course it's good to understand, but actually doing extra work again seems pointless to me.
I do agree that "homework" type tests are the way to go.
The only caveat is - someone needs to be REALLY interested in working for you to do this as a first step.
A 15 - 20 min conversation with a hiring manager or an HR person if you have HR :) to cover things like the specifics of the position, candidate's 5 min summary of their career, approximate compensation, etc as a first step. Then you decide whether you like what you've heard and the only thing standing between the applicant and the offer is the completed assignment.
4 hours is a lot of time out of someone's day, and a near-dealbreaker for anyone gainfully employed and having some semblance of a life. Do you not realize this?
Meanwhile, it really doesn't take long to find out if someone is "worth talking to." Ask them to provide a code sample, and talk to them about it for 10 minutes or so. Or if you must use a coding exercise, really, just keep it short and sweet. A good coder can show you a lot about their sensibilities in their solution to even a slightly non-standard sorting problem.
Technical interviews are a form of hazing. Engineers often suffer from imposter syndrome, especially during an interview. Those who have already been hazed and accepted to the club will turn around and put potential candidates through the same humiliating process. And what's worse is that demonstrating you have superior capabilities in one area or another can be seen as a threat to the interviewer, and they may give you a thumbs down based purely on their own insecurities. What ends up happening, just like in college fraternities, is that everyone ends up similar, both culturally and in terms of abilities.
If I had it my way I would do away with the interview process altogether and do something more akin to an internship. Potential employees could start their engagement with a company by working (for pay, mind you) on a very limited basis to solve actual problems that need solving (i.e. "write an algorithm that's 10% more efficient", "create a tooltip that's aware of the viewport in React", etc). Based on their output their engagement could be ramped up until they are brought on as a full time employee. That way it ends up being completely merit based. You can either solve these problems or you can't. And whether or not they ultimately end up becoming an employee doesn't end up mattering because both parties are compensated along the way.
This would obviously put the burden on the company to boil its problems down into smaller, isolated efforts but that's something all companies should be trying to do anyways. In the end, they just want some code written that will end up solving some problem for their customer.
Overly complex algorithm questions (that the interviewer expects an immediate, near-optimal solution to, even though they were open in the literature for several years before these solutions were found); or reasonable enough questions that just aren't articulated properly (this happens amazingly often); gratuitous brain teasers, or "Carnac the Magnificent"[1] questions of any sort; or just the sheer duration of some of these one-way, "prove to me you aren't a liar or an idiot, while I play with my cellphone" sessions (6-8 hours over multiple visits); and then, quite often, dealing with the candidate in a desultory fashion afterwards -- all qualify as borderline hazing, in my book.
Constantly confused by this. So much arguing back and forth, and yet the single easiest way to deal with it is a stripped-down real-world problem given to a whole bunch of different candidates. Some people adopt this, some don't, and some argue that it's meaningless and go back to the standard silly interview patterns of algorithm questions and meaningless, complex FizzBuzz alternatives.
My personal conclusion is that most companies don't want this for two reasons:
1. Culture fit is more important for people in a rigid hierarchical structure, partly because an out-of-the-box thinker could be dangerous for that structure: too much questioning authority, too much pointing out flaws. It's much easier to have a good worker bee than to wonder why you need 40 employees to build an automated gif platform.
2. In most companies, everyone is very reluctant to make decisions. For example, management struggles with clear direction because it opens them up to the question of liability; if they make a decision and it's wrong, they might get fired. HR works the same way: if HR passes a resume along, they want it to hit a list of keywords so they can cover their asses if the candidate turns out to be a bad hire.
Basically everyone is so scared to make a mistake that they make a lot more mistakes trying to avoid them.
In the opening of Cracking the Coding Interview, the author talks about how they don't really care about false positives and negatives; they just want those to stay below a certain threshold. But consider the hiring scale of Google compared to a small company, and suddenly those things matter.
One bad hire can be toxic. And basing your hiring strategy on something a huge behemoth with infinite money does is kind of silly imho
Isn't this exactly what you'd expect? E.g., if I get a random cohort of developers, all of whom their peers would say are amazing, and subject them to a series of tests, they will do differently well at the different tests.
For example, if I set a test where you have to write some Java, and half the candidates haven't written any Java, they'd surely do worse on the test than the other half.
Or is there a belief in the industry that there's some scale on which we can absolutely rank all developers: front end, back end, full stack, mobile, desktop, embedded? That sounds like a surprising belief, one which would require extraordinary evidence.
The conclusion is misleading due to 2 wrong assumptions:
1. The population is heterogeneous: interviews test different skills. Not all interviews test the same set of skills, yet that would be mandatory for comparing interview scores, because scores are aggregates of these skill tests. Different job opportunities mean different skills to test, so it seems reasonable to assume that evaluations of people vary across job opportunities, and thus that their scores vary across interviews.
2. The observations are not statistically independent: past interviews may influence future interviews. People may get better at passing interviews or conducting interviews over time. This would impact their score. It would be good to study the evolution of individual scores over time.
While (1) should strongly limit the conclusions of the study, the complete analysis may simply be irrelevant because of (2) if the statistical independence of observations is not demonstrated. Sorry guys, but this is Statistics 101.
(1) We listened to most interviews on the platform to establish homogeneity. Interviews were across the board, language agnostic, and primarily algorithmic in nature.
(2) We actually looked into this and noticed that time didn't really affect performance. Usually, people did their interviews over a pretty short time span and then found a job. Or, people were already experienced interviewers and had kind of hit a plateau. You can see the raw data and how it oscillates wrt time in the footnotes.
Not too surprising when you consider there isn't really a standardized guideline and every interviewer asks questions of varying difficulty. Sometimes interviewers don't even ask candidates the same question, tailoring them based on the candidate's resume and experience. I've had interviews as simple as writing a function that tells you whether two strings are anagrams, others that tested dynamic programming knowledge, and still others that tested my knowledge of concurrency. At the end of the day, it's luck of the draw which interviewer you get and what questions they decide to ask you.
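To illustrate the spread in difficulty: the anagram question is at the very easy end; a Ruby sketch of the usual sort-and-compare approach:

```ruby
# Two strings are anagrams if sorting their characters
# yields the same sequence (case-insensitive here).
def anagrams?(a, b)
  a.downcase.chars.sort == b.downcase.chars.sort
end
```

Compare that to a dynamic programming or concurrency question, and two candidates can face wildly different bars for the same role.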
Technical interviews are silly exercises IMO. You have a multi-month ramp-up period and all kinds of environment-specific things to deal with at just about every employer. Nobody in real life gets asked to program on stage or forced to answer esoteric questions as if you're in some sort of math competition.
If you don't have confidence in someone's ability based on their experience and their interview but you did like them, give them a task to accomplish offline. See if their results are anything like your results would be, and bring them in to see how they respond to feedback both negative and positive.
I've seen too many interviews that go along the lines of "How would you rate your Java on a scale of 1-5?" "5" "So how would you fix the problem if your cache hit rate on SomeObscureCommercialProduct went from 94% to 82%?" "Forget that guy. Huge ego. Doesn't know anything."
I did run into one company that had an interesting process for technical validation. They actually hire people for two weeks as contractors and have them work with the team. Then they hold a vote and decide whether to extend an offer.
1) They usually just measure the amount of effort a person has put into studying interview questions. Whether or not the ability to do this translates to being a better engineer is debatable.
2) An interviewer almost always exercises some form of personal bias, whether educational, personal, etc. This doesn't always show up in written feedback, but the interviewers with stronger personalities usually dominate interview debriefs and often influence others' hire/no-hire decisions. This is especially prevalent in smaller startups where the process is more informal, things move quickly, and decisions are based more on gut feelings.
I had an 800+ page study guide while living in SF for the ridiculous technical interviews they'd put you through. Then you'd see their code base and realize they'd never actually done anything that would remotely resemble best practices.
The best part though was realizing if you didn't answer that one question exactly how the "brilliant" person interviewing you wanted it answered, you were done. So at that point I'd quiz them on all their shortcomings that were evident based on what they'd told me.
In my experience hiring it's not that difficult. Good attitude, good aptitude, genuine interest in the field (software engineering), ideally interest in the applied product (if not a software product), decent communicator and good hygiene. If they've done well in those areas, I've never had to let someone go.
Lol, no. It's mostly just an amalgamation of all the technologies and stupid fucking interview questions that you get.
I have 10-12 different documents. CS/OO (basics for reminders with some simple algorithms), .Net, SQL (I always forget this shit, it's why I have data layer and use ORMs + LINQ), LINQ, jQuery, Vanilla JS, HTML5+CSS, Architecture/Patterns, ASP.Net vs MVC, Tuning, etc.
Haven't updated them in a few years because I left the bubble. My interviews in the midwest are typically an hour or two at most with very little quizzing. At worst they ask you to make them a small simple app (which I find irritating but better than SF interviews).
technical interviews are a joke. the majority of the time they exist so the interviewer can try to feel smart and subject the interviewee to whatever whimsical problem they found on the internet.
how often do you do group coding in a whiteboard in your actual job? at one interview I was criticized for sitting and thinking about a problem for a minute without just blindly jumping into attempting to solve it. also tons of people are great at solving toy interview problems but can't debug their way out of a paper bag.
I once was called into an emergency meeting by the CEO of the company I was working for the time. When I entered the room, all of the top brass were seated around the table, some visibly agitated. The CEO proceeded to hand me a single black whiteboard marker as he stated "the fate of the company depends on you". The problem was outlined by a fellow engineer and it was explained that I had 10 minutes to solve it, or we might risk going out of business.
With that, I took a deep breath and went to work furiously scribbling away on the board while everyone watched with anticipation. As I closed the final bracket, and stepped back to examine my work, audible gasps could be heard from around the room. The grizzled old CTO whom everyone was a little scared of broke the silence at last: "God dammit, Mark!" I caught a few nervous glances from some of the less technical folks. You could hear a pin drop when he continued:
"you just saved the god damn company!"
A slow clap started, and soon everyone joined in with a round of applause and cheering. Over the outburst of joy and adoration the CEO shouted to one of the developers who had gathered outside of the conference room, "get this code into production!" Soon the whiteboard was being whisked away by a group of smiling engineers as they navigated with great purpose through a sea of high fives and back slaps.
And then, with a twinkle in his eye, the CEO shook my hand, and pulled a crisp $100 bill out of his wallet. As he handed it to me, he said "I knew from the moment I heard about your interview that you would be a great asset to this company, Mark. Go home and relax for the rest of the day and take your wife out for dinner tonight. You've earned it."
So you see, being able to solve complicated problems on a whiteboard in a crunch time is a valuable skill.
The headline is "interview performance is kind of arbitrary," but the data solution proposed in the article is "interviewers rate interviewees in a few different dimensions," which is not any less arbitrary.
I appreciate that there is an appendix addressing this issue, but it does not resolve the issues with the analysis, especially since the appendix uses a "Versus Rating" to justify the statistical accuracy of the system, which is also calculated somewhat arbitrarily (since the Versus Rating is derived from the calculated interview score, wouldn't it be expected that the two have a relationship?)
The fact that the results of the non-arbitrary score are centralized around 3 out of a 4 max (instead of the midpoint of 2) implies a potential flaw or bias in the scale criteria. (The post notes that people who get a 3 typically move forward; maybe selection bias is in play since companies would not interview unskilled people in the first place)
That's not to say that the statistical techniques in the analysis themselves are unimpressive though. I particularly like the use of FontAwesome icons with Plot.ly.
Thanks, as always, for the thoughtful notes. We certainly don't mean to imply that this is gospel, and there are limitations to the work, and fwiw, we are thinking about potentially moving to a 5 star scale.
That said, if you click on the link in the footnotes to see the original data, you can get an idea of what we're working with.
And lastly, the fontawesome thingy isn't plotly. It was built using http://blockbuilder.org/ by @enjalot
> the data solution proposed in the article is "interviewers rate interviewees in a few different dimensions," which is not any less arbitrary.
This article's headline reveals the spin the HN community is trying to put on it. Many people in SF like to attack the very idea of meritocracy, because if you admit that meritocracy is a good idea, you're implicitly endorsing the idea that people have different levels of merit, and thus that between-group differences in representation might be due to something other than discrimination.
The real issue I have is that only a few of the other comments in this HN thread are looking at the data analysis presented in the article and are just taking the headline as gospel.
I had a really bizarre interview recently where, after the initial recruiter phone screen, I was rejected based on an in-person half-hour very simplistic paired coding exercise, only met with one person, and wasn't asked about my (imo very strong) resume once. I must have said something foolish at some point, which is on me, but the point is: interviews can be hit or miss. Fortunately you only need one hit.
It does sound like a bad interview experience, but I will say that I have very little interest in what a candidate writes on their resume. There are candidates with amazing resumes that can't do much and people without much that they can talk about publicly that are amazing. If the interview is low signal, the resume is even worse.
> There are candidates with amazing resumes that can't do much...
That's certainly true. I understand not taking resumes at face value; I've interviewed developers (and DBAs) and I'd always pore over their resumes to look for both padding and meat to ask about. It can be insightful to ask about a big accomplishment someone claims as their own, only to find out that they actually affected only a small part of it. However, in this case I'm honestly not sure whether my interviewer even read my resume.
Interesting. I wonder if we interviewed at the same place! I had a very similar experience over a year ago. At least I was told they only select one person from the group to move forward from the pair-programming task, which I thought was weird, but hey, it's their company.
Fundamentals: CS basics. I don't nitpick on details. It's more around if you've heard about it or not and if you could figure out how and when to use it.
Structure: I want to see a structured approach to problem solving. Doesn't matter if your code is perfect. Doesn't matter the programming language you want to use.
Curiosity: You need to be curious about things, asking the "why".
It may not be arbitrary, but if so it's buried deep in the data.
I've turned down candidates who had impressive technical resumes, who had worked in startups that sold, who had been hired on as consultants at various places, etc, because they were unable to solve simple algorithms in a simple manner, and their code was atrocious. Does this mean they're "bad developers"? No. If we were a consulting firm or a startup they might well be worth it; where the important thing is getting code out the door quickly, and to have something that works, even if it's not easily maintainable. But I was hiring for a position that required someone who would keep solutions simple and maintainable ('craftsmanship' rather than productivity, if you will. Note that the former does not necessarily preclude the latter, but it's the trait that was necessary, and was lacking).
Google optimizes for people with strong algorithmic knowledge. It's debatable whether they need everyone to have that, but certainly, many shops don't. Again, I've hired people with no formal CS background, because most of my job's problems don't require you to have deep algorithmic knowledge (the ones that do we can have others address, or work together on).
We know that people can fail one technical interview, while being radiant in another, and the reality is that what we're looking for, and what others are looking for, are often different. That creates a lot of variance in the data when we compare them.
I'm curious about the interviewer community. Specifically, things like how they are vetted and how often they conduct interviews. It would be cool if there were a community of interviewers for the betterment of the process, but I could see many conducting only 1 or 2 interviews before dropping out. I see in the appendix that there are those who do more, but no indication of what percent leave quickly.
A better drinking game might be when a candidate offers a data structure they know nothing about. Would a red-black tree work here? No.. I guess not.
I dunno - if you look at the data, there are fairly clear clusters of people who are 'probably good' and 'probably not so good'.
'Programming' is necessary but not sufficient for product engineering and that is what most of these interviews are trying to tease out. Good companies will balance out 'programming' with other rounds like 'technical design' or 'pair programming' or even non-technical rounds with business analysts or product to gauge general ability.
I have learned that an (in)ability to program "in the small" correlates very well with an (in)ability to program in the large, and now ask mostly simple questions whose answers are things like one-line Boolean predicates to test for well-defined conditions. It is paradoxically easier for an inept candidate to fake his way through an algorithm design question than it is to fake the coding of a simple test for "determine whether two closed intervals [a,b] and [x,y] overlap each other".
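For the curious, that overlap test really does come down to a one-line predicate; a Ruby sketch:

```ruby
# Closed intervals [a, b] and [x, y] overlap iff each one
# starts at or before the point where the other ends.
def intervals_overlap?(a, b, x, y)
  a <= y && x <= b
end
```

The common failure mode is enumerating cases (overlap on the left, overlap on the right, containment) instead of spotting the single condition.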
I've moved to doing something extremely simple: just give me pseudo-code for indexOf given a string and a character. If you can't write the 4-5 liner for linearly searching a string for a character, then there's not much point in moving further. It is shocking how many people get tripped up by it.
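In Ruby rather than pseudo-code, the expected shape of the answer is something like:

```ruby
# Linear search: return the index of the first occurrence
# of ch in str, or -1 if it is not present.
def index_of(str, ch)
  i = 0
  while i < str.length
    return i if str[i] == ch
    i += 1
  end
  -1
end
```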
Is it a sign of insecurity that I just popped open vi to make sure I could solve this? :)
Where I work, we've been trying to fill a vacancy for a senior dev for months. Our one whiteboard question is similarly easy: find the largest integer in an array (The interviews have become easier since I was hired on). And yet... I've been watching an endless parade of people with decades of experience who apparently don't know how to write a for loop :(
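For reference, a plain-loop version of the question described above in Python (assuming, as such questions usually do, a non-empty input list):

```python
def largest(nums):
    """Return the largest integer in a non-empty list, using a plain loop."""
    best = nums[0]
    for n in nums[1:]:
        if n > best:
            best = n
    return best
```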
Exactly this. I've been arguing for years that those simple string/array/link list/tree manipulation questions are simply a microcosm of the qualities that make for good developers in general. If those trip you up then you need to consider that there are serious holes in your abilities, not that the test is bad.
As a potential candidate, all of the standard complaints ring true - but once you're on the other side of the equation, and need to hire people... your ability to create new interviews is not nearly as wide or as clear as it would seem from the outside.
1) Take home test: OK for performance metrics, bad for "getting to know" the candidate, and terrible for selling the candidate on your company
2) Daylong interview: Expensive, requires interrupting our team, needs a fully planned and well executed itinerary - but is perfect for getting to know someone, getting the feel for their personality and interests, and is the best way to sell someone on the opportunity.
3) Work sample: we usually do this for interns[1] and pair it with a ~1 hour conversation (either before or after, doesn't really matter to us) on what the company is like and what they would be working on. Obviously, work samples suffer from the same deficiencies as a take home test for cultural fit and the like, but it's the best we can do for interns!
I've said elsewhere on this thread, but my absolute favorite technical interview method is to set up a contract to pay the candidate to create a feature or fix a bug on your codebase, and have it merge and work as expected.
It lets the candidate feel (literally) valued by your company, and you get to learn how they do the job you are hiring them to do.
I was a college student at the time. I was long on time and short on cash. I was also far less experienced than I am now, so there were a lot of new things for me then, like those fascinating "callback" thingies I had to figure out.
Nowadays, if I were given roughly the same assignment, it would take me about 3-5 hours at most, which, given my graduated-and-employed hourly rate, is actually quite in line with what I was paid. So it paid as much as I would have charged for a Saturday hack.
In the HN echo chamber, there isn't a day without some blog post/article describing how our interview process is BS, interview is broken, etc.
I don't necessarily dispute this state of affairs, but does anyone know how it compares to other fields/professions? How about interviewing a lawyer? Or a doctor? Or an account manager? Or a product marketer? Are developers the only ones with a "broken" interview process?
We are among the only industries that have multi-stage interview processes as the norm, including a gamut of usually quite gimmicky "technical" stages. It can now easily take more than a month to get hired as a software engineer, which is rather long compared to most other industries.
I think this boils down to the fact that our field is young: there isn't professional accreditation in place, or anything else that gives people a real idea of what you know, so they attempt to assess it themselves (and fail miserably, as assessing other people generally isn't their core competency).
That's interesting. So would you say that this process was broken? Was it good? I'm genuinely trying to understand how other fields handle recruitment to compare it with the "broken" IT equivalent.
Lawyers were mentioned directly in your grandparent comment. Sales and athletics also easily show at least that much variation from person to person. Business owners show much, much more variation than that, even if you restrict the field to very similar businesses. Researchers show variation similar to business owners (i.e. mind-bogglingly high).
You can easily construct differences of larger magnitude for what you might think of as "simple" jobs like telephone receptionist; a fluent English speaker is going to be more than 10x as productive in that job, assuming they're supposed to do business in English, than someone who can barely get by, and probably billions of times more productive than someone who can't speak English at all. There are a lot of Chinese people out there whose highest ambition is to "work in an office". I've met some of them! Most will never achieve that goal, because they don't have the requisite skills.
Most people who don't speak English won't apply for a job answering a telephone in English (but a lot of them will! This is exactly the kind of customer service representative everyone has grown to hate). I'd like to see some software job openings that specified a skill set and were willing to hire people with that skill set. That doesn't seem to be the direction people are headed in, though...
In most other jobs, it's harder to earn a PhD in the field while not being able to perform basic skills competently. But not computer science. It's damned depressing how few new grads can write code to, say, sort a linked list. So the factor is way larger than 10x. We're not differentiating the great programmers from the merely competent; we're struggling to distinguish the capable from the inept.
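For the curious, one common way to sort a singly linked list is merge sort; a sketch in Python (the Node class and function names are mine, for illustration):

```python
class Node:
    """A node in a singly linked list."""
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def merge(a, b):
    """Merge two sorted linked lists into one sorted linked list."""
    dummy = tail = Node(None)
    while a and b:
        if a.value <= b.value:
            tail.next, a = a, a.next
        else:
            tail.next, b = b, b.next
        tail = tail.next
    tail.next = a or b  # append whichever list still has nodes
    return dummy.next

def sort_list(head):
    """Sort a singly linked list with merge sort; returns the new head."""
    if head is None or head.next is None:
        return head
    # Split the list in half with slow/fast pointers.
    slow, fast = head, head.next
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    mid = slow.next
    slow.next = None
    return merge(sort_list(head), sort_list(mid))
```

Merge sort suits linked lists well because the merge step needs only pointer rewiring, no random access.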
> In most other jobs, it's harder to earn a PhD in the field while not being able to perform basic skills competently. But not computer science.
I think that you are confusing "computer science" with "software development". It may be easy to get a Ph.D. in computer science and be poorly suited for work in the software development industry, but that's no more surprising than the fact that people can get Ph.D.'s in economics and be poorly suited for work in the finance industry.
Computer science is, obviously, related to software development, and in many cases it's possible to take a CS degree that is focused on software development, but CS in general is not software development, and CS degrees in general are not vocational degrees in software development.
The best software developers may need to have extensive knowledge of CS, but merely having extensive knowledge of CS doesn't make you even a competent software developer.
All that I'm saying is that it's possible to get a degree in CS from some schools without being able to demonstrate a mastery of concepts like pointers or recursion.
You could argue that building software has a higher potential for complexity than work in any of those fields. So how do you measure someone's ability to be good at that within a reasonable time frame using limited resources (interviewer time)? You're always going to have a "broken" process with those limitations, because you can't test for every possible scenario they may encounter. This leads to a small subset of questions that you use to generalize ability, which leads to errors.
There are two things that guarantee at least a nominal level of competency in those fields:
1) For junior positions, both have professional accreditation, with standardized testing that ensures a basic level of competency. Not to mention the mandatory schooling (in most cases).
2) If you screw up as a doctor or lawyer, you can end up with official sanctions on the public record. If you screw up too bad, you lose your ability to continue to do your job.
I've worked with plenty of dead weight developers who wouldn't pass even these nominal competency tests.
Further, it just seems to me it's easier to benchmark your successes as a doctor or lawyer. At least in a way that has meaning to your potential employer.
I think part of that has to do with the fact we pretty much all work on teams. The actual projects themselves might be somewhat verifiable, but figuring out that developer's actual contribution is a pure crapshoot.
A list of successes for a doctor or lawyer is going to be both more individual and more verifiable (i.e., less likely to be bullshit).
I'm not a doctor, but from the outside, it seems that practicing medicine is mostly about memorizing a whole bunch of things and then doing them correctly over and over again.
Law actually seems quite similar to programming. You can think of the jury as your users and facts/precedent/laws as the statements available in your programming language. Then your job is to assemble the statements into a program that compiles and that your users want to buy.
I like the law/programming analogy, but there's probably more of a margin for error when your job is to convince some people, depending on whether you're prosecuting or defending. If you make even a small mistake when programming, it could break your entire program.
You misinterpreted what I was trying to say. I was simply trying to highlight that there's a difference between dealing with people and dealing with a perfectly logical machine, to support the point that interviewing programmers is harder than interviewing lawyers. And I'm not a programmer.
> There are easily many more programmers who work on potentially life-or-death systems than there are lawyers who litigate death-penalty cases in court.
But potentially life-or-death cases are not limited to death penalty cases, they are just the most obvious class of cases that are life-or-death.
There are far greater numbers (and proportion) of lawyers who work on potentially life-and-death work -- not limited to death penalty cases -- than programmers who do so.
Well, the OP said "sentenced to death" specifically, which is why I mentioned the death penalty. But ok. Either way I still don't think I agree, though frankly I'm just going off gut feeling and don't have any numbers. But think about how much medical device software is out there, plus industrial control equipment, airplanes and other vehicles, military stuff, etc. I know on HN the image of "programmer" is "person taking VC money to write a photo sharing app" but there's a lot of other work out there.
Though for what it's worth this is less a "programmers are awesome and important!" argument than it is a "don't get your view of lawyers from television shows" argument -- if the OP had gone with doctors or something I wouldn't have said anything.
Although not a completely different field/profession:
Power Engineering interviews usually have technical questions, or ask how you would approach a problem, even if you have a license. But typically, if you have worked in the field, the answers are straightforward.
The problem is, whether or not interview performance is consistent, we still don't know how or when it's correlated with performance if hired, and that's the sort of thing you would need to actually help people making better hiring decisions.
Does interviewing.io have any plans to collect employee performance metrics from companies that hire via their platform? Is that something companies would be willing to cooperate with?
We know big companies go through tons of people. Isn't stack ranking supposed to eliminate 10-20% a year?
All this noise about avoiding bad hires ignores the elephant in the room: the companies making these claims bounce as many or more people as the companies that don't use these foolish interviewing practices.
Technical interviews are part of why I've moved away from engineering positions. I've been looking at product management jobs, thinking that I could use what I learned over the past 6 years working on a SaaS alone. However, I found the exact same shit: even more technical interview questions that require whiteboard code writing.
There might be some merit to why they are doing this but it's impossible for me to engage companies that discount real world product experience in favor of rote memorization.
So far it's a pretty tough nut to crack; a lot of product manager interviewers don't seem to know what they are doing, instead relying on the law of large numbers and how great their fucking product is blah blah blah (it isn't).
It's a bit worrying, since some companies seem to be hiring product managers for some subjective end goal of an improved product and improved sales... they want one person to take the credit from, and the same person to pin all the blame on. Another huge red flag is when managers outright tell you they have no idea what to do, so they just get someone else to outsource their thinking to.
Just finished my 2nd phone interview (it was somewhat technical) with a company yesterday, and an online timed tech test this afternoon. The first interview was with an internal recruiter and was non-technical.
I've been impressed. They've been very straightforward regarding the tech eval, with no trick questions, and respectful of my time. Their interview process is selling me on the company.
This is why we at Lytmus believe the most effective way to assess a candidate is to see how they perform on a real work sample. We've built up a virtual machine based platform that allows candidates to showcase their skills in a real development environment with working code cases (web, mobile, data, systems, etc.). Most interviewing methods like algorithmic challenges often only provide signal into a discrete skill that can be acquired through practice, whereas what matters is whether or not you can actually work on real world projects, understand an existing code base and perform on the job as opposed to on an interview coding challenge. Google's SVP of People Ops Laszlo Bock also writes about the ineffectiveness of indirect tests and their weak correlation with on the job performance.
Minor question: is it just me, or is the "Results of Interview Simulations by Mean Score" chart a bit difficult to parse? I understand that observing the behavior of any single cohort involves looking at the endpoints of the cohort's curve at the vertical line x = n, where n is the number of simulations you wish to observe (the right point of the curve at x = n is the P(fail) of the worst performer in the cohort at n simulations; the left point is the P(fail) of the best performer), which is why the gap between endpoints within a single cohort decreases as n increases. But it seems counterintuitive to observe any other kind of trend -- shouldn't the information be graphed with P(fail) as a function of the number of simulations, rather than the other way around, seeing as the latter is the independent variable?
The data seems to show the opposite to me - despite scores being all over the place, the mean is very reliable. When a 2 or lower is considered a fail, those who consistently rate ~2.5 fail about half of their interviews, while those who consistently rate ~3.0 fail only 10%. Of course, the probability that a candidate has failed an interview approaches 1 as they are subjected to more and more interviews. That the test has both false negatives and false positives does not invalidate the test. In fact, that the test is accurate despite the false positives and the false negatives ought to do the opposite. If a single bad interview invalidates a candidate for company A, that doesn't mean that the candidate won't go on to pass all of their interviews with company B.
I haven't interviewed in some time but one thing I absolutely hate about technical interviews is "white board" coding.
For some reason whiteboards intimidate me. I have terrible penmanship and completely fail to plan how much space I need for writing things. Then there is the fact that the markers seem to have a high failure rate whenever I use them.
Perhaps I'm the only one who feels that way. I have even begged some interviewers to just let me use a laptop while they watch, but the offer is typically refused. Maybe things have changed now?
Great post as always Aline. I'd be most curious as to how well anonymity was kept. Did interviewees identify their employers, schools, or any other information that might create bias while in the interview itself?
I've been recruiting for a long time, and I'm rarely shocked about the result of an interview - maybe a few times a year. There are tons of possible explanations for that, and lots of possible explanations for your results as well.
The problem with hiring is that hiring decisions are centralized, causing huge workloads for the decision maker. To reduce the load, arbitrary processes and voodoo tests are used, always with the same poor results.
Instead, the team hiring should themselves interview candidates and make decisions on who to hire, because it requires personal knowledge that you can't get from tests.
Super true, and probably easy to bypass if you're a data viz wiz. It's more burdensome for web devs who don't have the math/algorithms chops, but I know I have to learn it.