
Building a full production-grade website seems extreme, but giving a prebuilt website to a front-end candidate and asking them to add a few simple pages or features is basically fizzbuzz-type vetting. I also like it when coding "challenges" are timeboxed to say 1-2 hours. This keeps the time expectations reasonable for both the interviewer, who has to read a bunch of candidates' challenges, and the interviewee, who has to do them for (potentially) a bunch of gigs.

The type of coding challenge a company asks (interview questions also) can be one of the most important data points to use to judge a company's culture...




Yes exactly, this is overboard. We have a "take-home test" (I hate that term but haven't come up with anything better) for both our front-end and back-end candidates. They're

(a) Short - 45 minutes to an hour

(b) Only as complex as necessary (for the front end we provide an endpoint serving JSON, for example, so they don't have to worry about that)

(c) Real problems we've run into, distilled down into a form appropriate for an interview

(d) Made up of problems that we were having candidates do on the whiteboard before we developed the at-home test, meaning they're not super complex

I think that at-home coding is the perfect second step to an interview (after a phone screen). It's more realistic than whiteboard coding. You have the internet at your fingertips. You don't have the pressure of an interviewer sitting there. You have syntax checking. From my side, I don't have to waste 30 - 60 minutes on a bad candidate. And, assuming the code is good, reviewing it provides an awesome first hour of an in person interview. And the problems I give you to take home would have been given to you in person anyway, so you're not spending any more time than you would have if you'd just come in for an interview.

The issue here is that the tests he's being given are bad examples of a take-home test, not that take-home tests are inherently bad.


Have you (or people on your team) done the task yourselves? It's easy to underestimate the effort required.


Yes absolutely. They're all problems that we've solved in the process of building out our product. In addition to that, as I mentioned, we originally gave them as whiteboard questions, so we're well aware of the issues that come up when solving the problems and the effort required to resolve the issues.

Our front-end test is a bit longer than the back-end test with a tradeoff of a shorter in person interview, but the front-end test shouldn't take more than 90 minutes.

One of our problems is a small matching algorithm. Another is a thread safety problem. The front-end test consists of pulling JSON data from a server, formatting it, and posting a single action back to the server. It sounds like we're in a minority though. Reading these other posts it seems like a lot of the tests are very onerous (build a small trading system? that is a ton of work).
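
To give a rough sense of the shape of that front-end exercise, here's a minimal sketch in TypeScript (the endpoint paths, field names, and the "select" action are made-up placeholders for illustration, not the actual test):

    // Sketch of the described exercise: pull JSON, format it, post one action back.
    // All names here (/items, /actions, "select") are hypothetical placeholders.

    interface Item {
      id: number;
      name: string;
      createdAt: string; // ISO 8601 timestamp
    }

    async function loadAndFormat(baseUrl: string): Promise<Item[]> {
      // Pull the JSON data from the provided endpoint.
      const res = await fetch(`${baseUrl}/items`);
      if (!res.ok) throw new Error(`GET /items failed: ${res.status}`);
      const items: Item[] = await res.json();

      // "Formatting" step: sort by date and build display strings.
      items.sort((a, b) => a.createdAt.localeCompare(b.createdAt));
      for (const item of items) {
        console.log(`${new Date(item.createdAt).toLocaleDateString()} - ${item.name}`);
      }
      return items;
    }

    async function postAction(baseUrl: string, itemId: number): Promise<void> {
      // Post a single action back to the server.
      const res = await fetch(`${baseUrl}/actions`, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ type: "select", itemId }),
      });
      if (!res.ok) throw new Error(`POST /actions failed: ${res.status}`);
    }
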


> They're all problems that we've solved in the process of building out our product

Have you gotten outside feedback on the questions? It's easy to assume a question is fair, but when you're close to the problem it's easy to misjudge it, since you may be deeply familiar with a specific kind of problem that others aren't nearly as familiar with.

The difference to me I think is that yours are very specific, and "build a small trading system" can be very general. If it's timeboxed, there might be only so much that you can do--i.e. you might not be expected to complete the task. You want to test someone's knowledge, without being too 'in depth'. A candidate that can solve a problem on his own in 30 minutes isn't necessarily better than a candidate who can solve the problem in 10 minutes with the assistance of Google.

0.02


Thanks for the feedback. I'll definitely look into getting an outside opinion on our take-home test.



