Redesigning the Technical Hiring Process (jeanhsu.com)
81 points by jeanhsu on Aug 31, 2011 | 56 comments



I think this is a fantastic way to do interviews. I could count on one hand the number of interviews I've been involved in where I actually felt like a true measure was taken of my skills and abilities as they relate to the job I would be performing. Instead it just feels like an excuse to trot out a bunch of computer science trivia.


Having been through the puzzle dog and pony shows at Facebook and Google, I still have to say that the best interview I've seen was a trial carried out by my last employer.

They hired the candidate as a contractor on a one-week term. He paired with the senior members of the team and worked on the code base from day one. He started as a full-time employee two weeks later.


This process would be difficult if you already had a full-time job. And if there is a relatively high number of potentially qualified candidates, how do you choose who should come work for a week? It could get expensive quickly, certainly for a small startup.


All fair points. Given the circumstances it was great, but it doesn't scale. The candidate worked as a contractor and wanted to transition to a full time position.


Right, the best way to answer the question of whether a candidate can perform a job is to make the candidate perform the job.

Another thing is that companies like Google or Facebook seemingly can't afford this; they prefer false negatives, which is bad for them.


Google seems to be doing pretty well anyway though. I've tried several of their products and they seem pretty decent. I have a hard time imagining that products like Google Search and Google Maps were built by developers without any skills. Therefore it seems reasonable to conclude that they must be able to hire good people with their current methods, despite the skepticism.


All it tells us is that at least some competent people make it through their interview process, but it's a rare company that has an interview process that effectively filters out all competent people and remains a going concern for any length of time.

The fact that Google produces "pretty decent" products doesn't actually tell us whether their famously stringent comp sci trivia quiz system produces any better results than, say, throwing a packet of resumes down the stairs and hiring the ones that land face up.


When I say that I've heard of this company named Google and they seem to produce some decent products, I am understating the fact that they are one of the top software-oriented companies in the world today. Since this is widely known, I saw no reason to either call attention to that or belabor it; that would just be silly and a waste of time.

You suspect that one could get the same results by throwing resumes down the stairs and hiring all the ones that land face up. That is a testable hypothesis. I recommend you test it and post back here with your results. I look forward to reading your report.


I don't understand how "here's a hard-seeming problem, solve it please" is a "comp sci trivia quiz". What specifically are you referring to? Have you ever interviewed at Google?


It's very refreshing to finally hear of a company that doesn't think that leading a candidate through tree-walking algorithms on a whiteboard is an appropriate test of real-world programming ability. Hopefully more companies follow suit and we can finally put to rest what Malcolm Gladwell would call our "mismatch problem".


I don't understand this. Why would you want to work with someone that doesn't have a basic understanding of computer science?

I work with a bunch of people that wouldn't know a tree if it fell on them, and it's amazingly painful. They write amazingly slow-running code simply because they don't even know that there's such a thing as an O(n^2) algorithm.
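
To make that concrete, a minimal Python sketch (hypothetical, just for illustration, not from the thread): the accidentally quadratic membership check described above, next to its linear fix.

    # Hypothetical illustration of the accidental O(n^2) pattern:
    # list membership checked inside a loop, versus using a set.

    def common_items_quadratic(a, b):
        # "x in b" scans the whole list on every iteration: O(len(a) * len(b)).
        return [x for x in a if x in b]

    def common_items_linear(a, b):
        # Building a set first makes each membership check O(1) on average.
        b_set = set(b)
        return [x for x in a if x in b_set]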

I've interviewed at Google and the problems they ask are great. They don't care about trivial minutiae like what programming languages you know. They care that you can approach a hard problem, apply your knowledge of computer science, and generate a solution that's simple and efficient. That's the kind of codebase I want to work on, so that's the kind of interview questions I would want to ask.


I don't think the above commenter suggests you want people who DON'T have a basic understanding of computer science at all. The problem is that the traditional 45-minutes-per-interviewer style of interviewing really isn't sufficient to know whether someone is a viable candidate or not. You don't know until they actually work on something.

That's what is refreshing about Pulse's process: you actually work on something and present it to them. This is a very practical approach, and it requires the use of the candidate's faculties, which include, but are by no means limited to, computer science understanding.

As an aside, I've also worked with people who have good technical skills but had serious attitude problems. Those people tend to have a very negative effect on company culture.


What I've found far more lacking in my colleagues over the years is an ability to create clean abstractions and to think architecturally. In most code this is far more important than the ability to roll basic algorithms from scratch.

The Google approach is like unit testing only one specific feature of a candidate. Pulse's approach is more like an integration test.


I've had the exact opposite experience. Everyone I worked with was pretty good at all the software engineering stuff, but most have been average to bad with algorithms, math, thinking outside the box, optimization, coming up with clever solutions, etc. Maybe it depends on where you're coming from.


When I interviewed at Google, one hour was about abstractions and architecture. You can ask a bunch of CS questions AND architecture questions when interviews are a day long.


I think that approach used to be more relevant before cheap 100+MHz CPUs, interactive IDEs, and the web. Nowadays most developers need to focus on APIs far more than the language and operate in a more abstract environment. People spend far more time writing boilerplate code to leverage powerful systems and care far less about how those systems work. Most of the day-to-day problems relate more to these systems than to the actual problem, so people spend far more time reading how other developers solved the same problem than writing code from scratch.

Granted some people are still working on embedded systems and low level networking code etc, but that's hardly the default.

PS: Today I needed to break out a decompiler because a client lost the source code on a small website. When exactly does that show up in an interview?


The position you describe is "support" or, more cynically, "code monkey" rather than "software engineer". Software engineers are the people that are building the APIs for you to use.

No doubt there's plenty of money to be made in putting building blocks together and knowing how to play with some tools, but that's not the kind of employee that Google, etc., are looking for. It's assumed that if you know A* and compilers, you can also download tools from the internet, run them, and see if they solved your problem.


I think you're falling prey to the no true Scotsman fallacy. http://en.wikipedia.org/wiki/No_true_Scotsman

Nowadays 99+% of all developers incorporate some API as a core part of their software. Take the game industry: the number of people writing DirectX is tiny compared to the number who use it. John Carmack (http://en.wikipedia.org/wiki/John_D._Carmack) is clearly a "software engineer", yet he spends most of his time working on top of DirectX or some other API. And when you dig into DirectX, it's built on yet more APIs, while at the same time anything written on DirectX has a ton of boilerplate code you need to get right or it fails badly.

PS: Ruby on Rails is a perfect example of what I am talking about. By your definition anyone using it is "support" or a "code monkey" even if they start from scratch and build a billion dollar company simply because they spend more time thinking about the API than pointers.


Making billions and being good at programming have nothing to do with each other. Being rich is mostly about getting the idea right and paying others to solve the real problems. Facebook is a good example; Zuckerberg made a nice prototype, but to scale, he had to bring in the kind of people that write their own compilers. He gets all the money because he came up with the idea. But the people that made it work were the programmers.

Also, Carmack is famous for the algorithms he's designed, not for using DirectX.


I'm going to be the contrarian voice and disagree with all of this. This type of interview process is seriously inefficient. All of the "soft technical" skills you're measuring with this process can either be judged by a single conversation or can easily be learned. The only thing about software development that can't easily be learned or enforced is problem-solving ability. This is where puzzler-type interview questions come into play.

Comp sci "puzzles" are a microcosm of the types of mental processes needed in day-to-day development. Having the developer code on the whiteboard tests communication skills, ability to analyze and handle critiques, etc in one shot, while also testing the most critical skill that can't be learned or enforced later on.

All these "hacking interview" ideas are missing the problem. For top companies, the only problem with the interview process are finding enough good people and the potential false negatives. The process described here just turns the problem of false negatives into the problem of false positives for these companies. This really is no solution.


Comp sci "puzzles" are one thing but questions like "how many gas stations are there in Oakland" makes the interviewer look dumb.


I love the idea of actual project-based coding interviews where you get to see all aspects of the candidate's abilities (communication, technical chops, design skills/trade offs, etc).

But I'm a little skeptical about the statement, "And this is so much more scalable..."

Now you have the overhead of possibly paying each person for some work, the paperwork that goes along with it, and the management aspect of it if it goes wrong (and it will go wrong; the fact that it didn't in 8 interviews is just the reality of a small data set), not to mention the review time for the team on each project.

I don't doubt that you get better hires out of this process, but calling it scalable is a stretch for companies larger than 20 people.


For the one-day projects, we typically do not give projects we might actually use, for that reason. We give a simple, sort-of-related app. We are currently < 20 so we'll see how this goes once we expand beyond that =)


I love the idea of actual project-based coding interviews

The best interview I ever had was a project-based interview with a YC company. We spent 9 hours implementing production code using a language (Python) and framework (Django) I'd never worked with before. Fortunately, Python is close enough to Ruby that we were able to successfully complete the project.

The position was a first-hire engineering lead so the length of the interview made sense. Obviously, this isn't reasonable for a normal position, but a scaled down version is a good approach. I would have loved to accept the offer and work with them but had to go with another company in the end. I was moving to SF with my family and needed a higher salary versus mid-salary + equity.

Now you have the overhead of possibly paying each person for some work, the paperwork that goes along with it...

I wasn't paid for this work, and I wouldn't expect to pay a dev for a 2- or 3-hour interview, which is what I would expect for a normal position.


She did mention that Pulse only pays for code or projects that would be used in production. Agreed, paying people for all interviews would not scale. :)


Ouch, that may be a serious problem. Are all of the projects intended "for production"? That is, are there some projects that Pulse won't pay for even if they're completed successfully (internal stuff)?

Now you're excluding all candidates who aren't willing to spend some time (a week?) working for what might be free. And some of these candidates will be the highly-skilled programmers you'd love to have, but who won't be willing to jump through hoops when they can easily get a job elsewhere with fewer hoops.


This is exactly the approach that I would take if I needed to hire someone.

We need to realize that the 20-year-old hiring process designed by MSFT has been gamed and is no longer useful. Interviewing these days in the Valley seems more like a cat-and-mouse game where interviewees memorize answers to as many algo questions as they can, and interviewers try to one-up candidates by asking them increasingly harder and more ridiculous questions.

Interviewers claim that they care more about how a candidate thinks through a problem, but I know this is bs, since I've talked to people in my company who have interviewed. If they ask a question, and the person doesn't know the answer right away but 80% of the others do, the person already looks deficient in their eyes.

The process that the article talks about is probably the best way to identify good candidates, at least until this is gamed. Hopefully if designed properly though, it will be much harder to game, since you can always change the nature of the project.


> interviewers try to one-up candidates by asking them increasingly harder and more ridiculous questions.

This seems like a net win. Interviewees and interviewers in a positive feedback loop of knowledge. Even in a worst case where people are memorizing answers to brain teasers, at least they have to think about it and therefore they will gain some level of understanding that they wouldn't have otherwise.


> has been gamed and is no longer useful

To be fair, if you can game it, the interviewer's doing it wrong. The idea is to ask a question that the interviewee can complete and then work together with them on ideas for other ways of implementing it, trade-offs, demonstrating that they understand the memory and execution model of the machine, talking about how to measure performance issues or representation choices, etc. You're trying to discover, by working with them on a problem in the small, how they will react when faced with a similar problem in the large. And while I don't claim that A -> B, I do claim not A -> not B. If they can't even talk about the memory touched during binary search, good luck putting them on partitioning an algorithm to run across multiple processors.
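
As a rough illustration of that last point (a minimal sketch, purely hypothetical, not anyone's actual interview question): binary search annotated with the indices it reads, as a jumping-off point for talking about memory access.

    # Binary search that records which indices it touches, so the
    # conversation can move to memory access: O(log n) scattered reads.

    def binary_search(xs, target):
        lo, hi = 0, len(xs) - 1
        probes = []                  # indices actually read, for discussion
        while lo <= hi:
            mid = (lo + hi) // 2
            probes.append(mid)
            if xs[mid] == target:
                return mid, probes
            elif xs[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, probes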

If you're just giving an algorithms quiz, you'd be better off just outbidding IBM on their "we will hire everyone from the top 25 ICPC teams" strategy.


I'm not sure if you've been through interviews recently, but from my experience and that of my friends who have been interviewing, this is what it seems to be. Based on my experience 1.5 years ago, I quit interviewing until I had enough time to sit down and memorize algos.

A popular question to ask as a stumper is "Given a tree, how do you determine if it is a binary search tree". Either that or "convert a sorted integer array into a BST".
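
For reference, here are minimal Python sketches of one reasonable approach to each of those questions (the Node class is assumed for illustration; it isn't part of the questions as asked).

    # Assumed minimal tree node for both sketches.
    class Node:
        def __init__(self, val, left=None, right=None):
            self.val, self.left, self.right = val, left, right

    def is_bst(node, lo=float("-inf"), hi=float("inf")):
        # A tree is a BST if every node's value lies within the (lo, hi)
        # bounds inherited from its ancestors; recurse with tightened bounds.
        if node is None:
            return True
        if not (lo < node.val < hi):
            return False
        return is_bst(node.left, lo, node.val) and is_bst(node.right, node.val, hi)

    def sorted_array_to_bst(xs):
        # Take the middle element as the root and recurse on each half,
        # which yields a height-balanced BST from a sorted array.
        if not xs:
            return None
        mid = len(xs) // 2
        return Node(xs[mid],
                    sorted_array_to_bst(xs[:mid]),
                    sorted_array_to_bst(xs[mid + 1:]))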


Why would you have to memorize algorithms for those? Without thinking about it, I don't know how to do it. But I know what a BST is and I could figure this out in a few seconds/minutes, I'm pretty sure.


Exactly. If you have to memorize answers to these, you need to revisit CLRS and brush up on your problem-solving skills. But this is also an example of how these interviews are gamed. You don't want someone who just memorized the answers and couldn't come up with them on their own, but it's hard to distinguish the two during an interview. And in many cases the guy who answers faster from having memorized it can seem like the better candidate.


I haven't been through them, but I used to interview for MSFT (I did ~300 in-person interviews between 2000 and 2007). I'll admit I'm totally biased, having worked for them, and that all my experience is there and not elsewhere.


>We need to realize that the 20-year old process designed by MSFT of hiring has been gamed and is no longer useful.

Another way I use the interviews is to learn more from the candidates about things I supposedly know less about than they do. Usually it would be very specific details of some technology or product they have supposedly worked a lot on or with, and that I'd have less detailed knowledge about (along the lines of, for example: how it works, externally and internally; why this approach was chosen/implemented; what the alternatives were back then and are now; etc.).


I have been doing some interviewing lately, and only just received my first "prototyping" test, based off a short user scenario. It was almost scarily broad, given the 3 days I had to complete it, but I think that's the point -- show initiative, build something cool, be prepared to defend your decisions.

I was personally happy to spend my time doing it, because it was a direct reflection of my abilities, and I controlled the result.

I also recently spent a short two-day stint working with another startup team, which was very instructive. There's no substitute for seeing a team in action: how they approach decisions, collaborate, etc.

I contrast this with the full-day onsite interview I did at one of the old guard dotcoms, which was disorganized, confused, and muddled -- basically, a huge waste of everyone's time. And to cap it off, after I'd had numerous phone calls, in-person meetings, the onsite interview, and bought an SVP lunch (forgot his new bankcard PIN), I got a quick impersonal phone call from HR saying they weren't going to move forward. Needless to say, I agreed.


I am in the process of applying for a FT job at a startup, and I think this is the approach every startup/company should copy when interviewing potential candidates. It makes no sense for me as a candidate to solve TopCoder/ACM puzzles on a whiteboard, as that doesn't correlate with the day-to-day job I would be performing. I actually disagree with sites like Interview Street, which ask candidates tough and challenging problems (Project Euler kinds of questions). You can't simply judge someone from that kind of stuff. You hire someone to get the job done at a company, not to be a possible participant in a TopCoder/ACM challenge. Kudos to you, Jean, for starting the initiative! Now I understand the philosophy of the Pulse team better and I admire it even more. To other startups: please consider doing the same thing!


Why is "resume comes in" always accepted as a given first step?

Redesigning the tech hiring process starts with discovery. You can't keep posting your hiring needs on HN, GitHub, and Stack Overflow and just expect a great person to find you, read more about you, and pull a resume together to make your life easier. We expect you to make our lives easier, because we know we can work just about anywhere.

As pg might say "Go to your users!" to founders looking for help making a decision, you need to go to us if you want to grow your team! We're real freaking people who hate creating a BS piece of paper for you that means absolutely nothing. Find us in person and spend time with us so we can skip this resume crap and focus on building things that we both actually enjoy.


Good point--I should have clarified, but this is more often a referral. I should have called it "first point of contact" or something more appropriate. Often we don't get the actual resume until much later in the process...


We're all familiar with the "phone / phone / half day puzzle fest" interview technique but has anyone ever done any testing on it? At any large company like Google or Microsoft it seems like it would be easy to run some ringers through the process to double check it. Take someone who's a brilliant engineer and asset to any engineering organization and run them through the hiring process in a different group. Give them a cell phone for the initial phone interview and see how they do. I'd be extra curious to see how this would work on employees who hadn't already gone through the traditional hiring process (i.e. people who came in as part of an acquisition).


A company as large as Google might be able to use an algorithm to derive a correlation from Google employees' annual performance reviews (positive and negative) back to their original resumes and interview feedback. There might be some interesting correlations, such as Google employees that graduated from some university X and worked at company Y are in the 90th percentile of performance reviews. Or every Google employee who ever worked at company Z was fired in less than 12 months. :)


The results would be meaningful, but not as interesting as those from someone who is under more pressure to find a job because they're currently jobless.

Speaking for myself, interviews are always much easier when you already have a job and always more difficult when you really want to work where you're interviewing - just an aspect of self-inflicted pressure.


While coding a small project is a good idea, time constraints can often severely limit what you can do. The reason companies ask a series of micro-questions (like FizzBuzz or "implement a linked list") is that it lets them test a wide range of areas that they're interested in. If you're swapping that for a bigger project, there's the risk that the candidate spends a lot of time doing one thing without being tested in other areas.

The particular way they're doing it is also a legal nightmare. If they use an applicant's work, there's a good chance that applicant's current employer might have a claim over that code. Even if they don't use the work but later build something similar themselves, what's stopping the applicant from suing them over it? It's not going to happen today, but if your startup is successful one day, you can bet you'll face lawsuits. Essentially, you should avoid having candidates work on something directly related to your product.

Also, if you're going to take this approach, it's a good idea to have a standardized project that every applicant does, both because it ensures fairness when you're comparing candidates and because it's much easier to defend if you face a discrimination lawsuit.


If you can make this hiring process work, that's probably a good sign about your code base. It can be difficult to find a meaningful and reasonably self-contained project in a large code base, but if your code is very well organized, it's possible.

The thing I really don't want to experience, ever again, is to be dropped into a horrendous project that nobody else wants to deal with, usually with an assignment along the lines of "nobody ever wrote tests for this - writing tests is a great way to learn a code base, so write some unit tests for this monstrosity", usually followed by a chirpy "Ping me on IM if you get stuck!"

If I were able to contribute meaningfully to a code base during my interview, that would give me a lot of confidence that I'd be able to succeed on the job and enjoy my work day.

The downside to this is that even a very well-organized code base often runs into a glitch that takes a couple of hours to debug, and if this happens, you have a new candidate sitting next to a programmer muttering to himself as he messes around with config files trying to figure out why the paths are all broken.


I've spent regular time preparing for programming-puzzle-style interviews by working through several books. For whiteboard-style interviews I've received puzzles that I've studied before, or puzzles that are similar to something I've seen before. I feel that how well I do in this style of interview is proportional to how many practice puzzles I've done.

But often the work I'd be doing has very little to do with the preparation I did for the interview. For this type of work I've found that a programming project is a much more honest assessment of my skills. And this is good for both the employer and me.


The idea is nice and can work for startups and small companies. However, for companies that interview several hundred people every month (think Google, Microsoft, etc.), I am not so sure this would work.

One way this would work in large companies is if they gave hiring power to individual teams/groups, and those teams conducted interviews like this on a need-to-hire basis. But this would require a major shift in how hiring is done in large companies these days.


Actually, it probably scales better, if structured a bit differently.

1) Phone screen (1 hr)
2) Give them a project to work on over a weekend (0 hrs)
3) Have someone go through the code (1 hr)
4) Bring them in for lunch, and then a demo session with Q&A (1 hr for lunch and 1 hr for demo/Q&A x 2 engineers)

This seems like it would scale much better. The person who reads the code can say yea or nay to the demo session, and then whoever is involved in the demo session (probably 2 engineers) can ask questions throughout the demo. Total time would be 5 hrs, and I think you would get a much better feel for the candidate afterwards.


I have a different process that I think is more efficient:

1. Hire remotely, in any country and region of the world
2. Don't even look at the resume
3. Give the candidates a short but difficult unpaid development test
4. Everyone who passes this initial test gets a job working on a real project
5. Everyone we like based on their work on the project continues on to work part- or full-time.

The most important way to evaluate programmers is through getting to see them programming something.


    > 3. Give the candidates a short but difficult unpaid development test
    > 4. Everyone who passes this initial test gets a job to work on a real project
What happens when only the desperate bother with your difficult, unpaid dev test and none of them pass?


It will be very difficult for this to scale.

Yes, if you are a small startup, it is possible. But imagine companies like Google, Microsoft, Facebook, Adobe, Cisco, etc. with over 1,000 positions per month. Add cost overruns. Add team members taking time out of their regular work to manage the temporary projects. Plus, this model will not work if the candidate is currently working elsewhere.

Great concept, but not practical if you are a large company.


Brilliant.

I've been working on a side project for a few months now, and it got me thinking about how I'd like to hire programmers if the need ever arises. I had sketched out something (in my head) that was eerily similar ...

I think this is the way things should be done: I want to see how the potential hire actually does the job, integrates with the team, and deals with other things like version control or getting set up with a database type they may not be familiar with.

It surprises me when companies hire devs (and pay them lots of money) without ever seeing them code.


This sounds great. I've been to entirely too many interviews where their questions are completely unrelated to the act of software development. I'm sure they were all put off by my low performance on contrived questions, as well.

My favorite is that "I would probably Google it" usually means instant disqualification when discussing any problem solving. This, of course, is ridiculous, since the very first thing I'll do is heavily research a subject before implementing it. I want to stand on the shoulders of giants, not sit around building sand castles.


Hrm, if someone answers "I'd Google it," I consider that to be a completely valid response. Of course it automatically invites the follow-up: what if a Google search fails?


If Google fails, it is an unsolved problem [1], and therefore requires study and research. A variety of communication mediums (forums, IRC, etc.) exist to communicate with others who are working on solving the same problem.

[1] At least to the extent that the public is concerned. It is possible that some skunkworks operation has solved it in secrecy.


If Google fails, it is an unsolved problem

Alternative: it's a completely contrived problem. Interviewers really don't like it when you point that out, though. :-)


I find I mostly use Google to see how others have implemented something rather than to find the exact solution. But either way you have a problem to solve, and Google takes you from knowing nothing to at least a starting point.


Having been on both sides of the table, I think this process makes more sense than traditional whiteboard coding problems. In regular interviews one always asks a set of known problems, so there is a good chance the candidate has already prepared for them. At a company I worked for, we had a 2-3 day boot camp for hiring; usually these were new grads, so most of the time they were OK with the lengthy process. We used to give them a set of projects and allow them to interact with each other, use Google to research, etc. Although this process was good, we couldn't sustain it as the company grew.



