There's a difference between syntactically-correct code (meaning every single paren/brace is balanced, no semicolons are dropped, it would compile exactly as written, etc.) and what I call "valid" code. Expecting a candidate to write the former without an editor and compiler is silly.
A good interviewer (and good hiring committee) wants to see that you can write "correct-ish" code: it looks like it would compile modulo a typo or niggling detail, but the algorithm, data structures, and control flow are clear and valid. A good interviewer will also tell you that up front, e.g. "I'm interested in your code, not your syntax". If they don't, ASK!
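To make "correct-ish" concrete (my own illustration, not something from this thread): if a candidate sketches something like the binary search below on a whiteboard, a good interviewer is reading the invariant and the control flow, not checking whether every colon and bracket would satisfy a compiler.

    def binary_search(items, target):
        """Return the index of target in sorted items, or -1 if it's absent."""
        lo, hi = 0, len(items) - 1
        while lo <= hi:
            mid = (lo + hi) // 2      # keep lo <= mid <= hi
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1          # target can only be in the right half
            else:
                hi = mid - 1          # target can only be in the left half
        return -1

A dropped colon or a misremembered builtin wouldn't change whether that's hireable code; a broken loop invariant would.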
The problem is: not everyone is a good interviewer, and it's surprisingly hard to teach someone to BE a good interviewer.
I think I agree with all of that. Especially the "interviewing is hard" part. But what's wrong with having someone sit in front of a computer to write some code?
At the company I'm leaving, we ask candidates to do a small project as a first pass. If the code isn't awful, we call them in to pair program with us (we're a pairing shop) on the code they wrote to improve it.
The company I'm joining just has you come in and pair (they're also a pairing shop) on projects their team is actually working on for most of a day.
I find both to be pretty good approaches. Better than having someone write code in an unfamiliar setting (the whiteboard) to solve problems that are often, but not always, contrived (write quicksort, etc.), at least.
We've had pretty good success identifying good candidates since we switched to this model. The place where I'm starting is a consultancy that's pretty well regarded in the startup community, including on HN, so they've probably had reasonably good success identifying talent as well.
How big are these companies, how many people do they hire per week, and how many do they interview? Any number of perfectly reasonable interviewing strategies that work at smaller companies fall apart at big-company scale (1000+ resume submissions per DAY).
Fair point. The companies are small and medium-sized. I'm not convinced that those models can't scale, though. One large-ish company that I know of that still does something similar is ThoughtWorks. They have on the order of thousands of employees, and their interview process consists of, among other things, a take-home code submission and on-site pair programming.
If you're getting 1000 resumes per day, I think you would be able to weed out a significant number of them just by giving them a coding assignment to work on. Churning out a resume is easy, but sitting down for a few hours and writing well-designed code takes effort.
They probably can scale up to a point. ThoughtWorks has 2,100 employees. Google has 45,000.
How would you grade/score ~1000 code submissions per day, though? You could conceivably do Coursera/Topcoder-style automated grading, but that only gets you so far: it can't distinguish good code from bad code, or from "copied from Glassdoor" code. It might still be useful as an initial filter. You'd have to keep implementing new questions with associated grading scripts, though, since the problems would inevitably leak.
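As a rough sketch of what that initial filter might look like (the solve() entry point and the test cases below are made up for illustration, not anyone's real harness): load each submission, run it against canned inputs, and count passes.

    import importlib.util

    # Hypothetical test cases for one question; they'd have to rotate as problems leak.
    TEST_CASES = [
        (([3, 1, 2],), [1, 2, 3]),
        (([],), []),
        (([5],), [5]),
    ]

    def grade(submission_path):
        """Count how many test cases the candidate's solve() passes."""
        spec = importlib.util.spec_from_file_location("submission", submission_path)
        module = importlib.util.module_from_spec(spec)
        spec.loader.exec_module(module)   # assumes the file defines solve()
        passed = 0
        for args, expected in TEST_CASES:
            try:
                if module.solve(*args) == expected:
                    passed += 1
            except Exception:
                pass                      # a crash just counts as a failed case
        return passed, len(TEST_CASES)

Something like this only measures output correctness, which is exactly why it can't separate well-designed code from bad or copied code.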