Hacker News new | past | comments | ask | show | jobs | submit login

Programmer interviews are broken. Whiteboard coding and algorithm questions aren't good predictors of how effective someone will be at writing real code. Technical hiring processes harm both excellent candidates who don't interview well and companies trying to hire good programmers.

Agreed. Great start.

[make an] investment by going through our interview process, afterwards we can fast track you through to final interviews...O(n) process into an O(1) process

Makes sense. The use of complexity notation is transparently gratuitous, but I liked it because it shows an intention to understand and connect. When traveling the world, most people appreciate an honest attempt to speak their language even if you're not fluent.

You'll have to take a coding quiz with questions in several different languages, as well as some design questions. The coding questions involve reading blocks of code, and answering questions about them.

Intuitively I hate the idea because I'm skeptical of how effective it is. Do you have non-anecdotal data to show how well your quiz correlates to specific goals?

After the quiz, you may be asked to complete a coding challenge...

What triggers that requirement? Also, same question as above about correlation/validity.

Then, you will move on to our technical interview...

What do the tech interviews consist of? They can vary wildly across companies, interviewers, and formats.

...work history [is] meaningful but relying solely on [it] results in missing good programmers

This is a show stopper for me. If a company doesn't acknowledge that what people have accomplished with previous contributions, projects, and development work is not just meaningful but extremely important, and a top criterion for finding great people, then I don't believe they have the correct perspective. But hey, I can be wrong about things; I'll try to be open minded. Point me to solid data showing tests and quizzes are a better predictor than accomplishments and I'll have a look.




Accomplishments and work history are meaningful. The problem is that they're also very noisy. There are false negatives from rejecting great engineers who either can't communicate their experience well, or don't have much experience yet. There are false positives from wasting time interviewing people who are "too good" at puffing up their resume relative to their underlying skills.

Reducing the rate of both false positives and false negatives reduces the time and cost of hiring an engineer for companies.

The data we look at for this is our "onsite to offer rate", which is how often Triplebyte candidates get a job offer after each final interview we send them to. Ours is roughly double what the companies see on their own for people applying directly or referred by a friend. This means we're doing a good job at pre-screening for technical talent and pre-matching for what each company is looking for. If this "onsite to offer rate" wasn't substantially better than normal, companies wouldn't work with us -- and especially wouldn't let our candidates skip their resume and recruiter screens.


I agree using this approach on someone just out of school wouldn't work very well, but I don't think that's the bulk of the problem space.

If someone has 3 years experience, there should be plenty of results to allow an in-depth look at what they've been working on. It wouldn't matter if someone 10 years senior had "more stuff", because it's not about quantity. I think it's important to take in the context of where that person is at in their career and what's realistic.

>>There are false negatives from rejecting great engineers who...can't communicate their experience well

Well, it's true this is a problem, isn't it? Whatever you call it, it's part of a lot of us, part of who we are, and it does sometimes result in unfair judgement. The problem is, I'm not sure Triplebyte's process gets around this either. Part of their process is a dynamic interview that surely requires some communication skills. Pass that, and no doubt even more communication skills will be needed at the employer interviews.


>Point me to solid data showing tests and quizzes are a better predictor than accomplishments and I'll have a look.

Not specifically tailored to programmers, but I find this paper relevant with respect to hiring practices:

The Validity and Utility of Selection Methods in Personnel Psychology: Practical and Theoretical Implications of 100 Years of Research Findings: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2853669

An excerpt:

>Overall, the two combinations with the highest multivariate validity and utility for predicting job performance were GMA plus an integrity test (mean validity of .78) and GMA plus a structured interview (mean validity of .76). Similar results were obtained for these two combinations in the prediction of performance in job training programs. A further advantage of these two combinations is that they can be used for both entry level hiring and selection of experienced job applicants.

"Accomplishments" could probably be called a combination of "Biographical Data" and "Reference Checks" as described in that paper. Biodata correlates highly with GMA tests, so it's a lot easier to just give a GMA test than to pore over descriptions of past experiences. Reference checks are generally not useful today because of the fear of lawsuits associated with giving a reference that costs someone a job.

It seems that a general programming test combined with a structured interview has data showing it's a good predictor for on-the-job performance.


Always appreciate seeing a highly cited and influential paper, thank you for sharing that. It seems to make a good case that IQ (or GMA or similar) combined with one or two simple tests can have some success as a tool for choosing candidates.

The issue I have is that it doesn't seem to preclude the possibility that placing more emphasis on specific accomplishments and other software development contributions is, as I suspect, an even better approach.

You mention biographical data and reference checks. I dug just a little further into related papers to get a better idea of what the authors meant by this. From what I can tell those concepts are different things from what we might guess, and wouldn't serve as a proxy to evaluate my conjecture (for example, on the meaning of biographical data: https://www.biz.uiowa.edu/faculty/fschmidt/meta-analysis/rot...).

Speaking of conjecture, I realize that's what I'm offering here, whereas they have strong data to say: here are a couple of approaches that can be effective.

Anecdotally, I still believe better candidates could be chosen by closely evaluating the specific work a developer has done, what role they played on their teams, and the impact and value of the work directly connected to their contributions. Of course this can't work just by listing Git repos; the interview process has to explore and validate the history. Combining this with the interview process can allow great insights into a candidate's thinking, because it allows discussions about real-world, non-trivial work and problem solving.


Regarding the work history thing, I think the point is about missing good programmers who have a patchy work history. I, for example, am 42, but I've only been a programmer for 3 years. Someone looking purely at my work history probably wouldn't have bothered to interview me for the first couple of years. Fortunately for me, that wasn't the case.


Glad you mention your situation, because it's an important distinction and definitely not what I meant at all.

I can see how it could cause an issue with some employers, but I would consider a 42-year-old person with only three years of experience not to be an issue whatsoever; in fact, I've hired people with that profile.

One example was a guy with a degree in Biology who had ended up in technical sales for DNA-related products. However, this was when mobile apps were getting hot, and the guy had shown an uncanny ability to learn iOS and build a few small but respectable apps using Objective-C during whatever nights and weekends he could scrape together. Considering many full-time devs at the time struggled with Objective-C, it was an impressive accomplishment. It was impossible to fake because we could scroll to a random piece of source code and ask him to start explaining it, which, besides verifying he did the work, gave valuable insights into his development thought process.

These concrete results, combined with probing discussions of how they came about, proved he was smart, had the ability to ramp up on new tech quickly, and could make some immediate contributions, and as a bonus, that he was full of motivation. He worked out great, and to this day we maintain occasional friendly contact.

Point being "work history" can be looked at in many ways. I'm advocating an emphasis on specific results, contributions, and weighing the value of those, not really generic work history patterns.




