See my previous footnote about them receiving 1 million resumes to fill 1,000 slots. If that's true, it is more selective than any Ivy League university.
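For concreteness (these are rough public figures, and the arithmetic is only as good as they are): 1,000 slots out of 1,000,000 resumes is a 0.1% acceptance rate, against roughly 5-6% at Harvard, so by that raw ratio Google is more than an order of magnitude more selective.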
No. That's not how a market works. Google can in fact maintain high and consistent standards for incoming developers. But because their processes are so dysfunctional, they overpay to do it.
And engineering headcount is a huge cost component of their business, so much so that they've been accused of breaking the law by colluding with other companies to avoid competing for hires!
Google is succeeding in spite of the waste and unreliability of their hiring processes, not because of them.
>Google can in fact maintain high and consistent standards for incoming developers. But because their processes are so dysfunctional, they overpay to do it.
But there's no evidence that switching to self-paced work samples would cost them less. Given Google's popularity, they'd get more false positives from candidates who copied code from widely disseminated previous projects. False positives cost money.
Your medium-sized firm, with its smaller volume of candidates, won't have that problem of mounting false positives.
Sure, with whiteboard interviews the rejected candidates (and even ex-Googlers) can write blow-by-blow "brain dump" blog posts about the algorithm questions, but history seems to show these don't work very well as cheating mechanisms.
What kind of work-sample project could Google realistically design for 10,000 programmers to complete? (It can't be as hard as "solve this Clay Millennium problem" or as easy as "reverse this string", and anything between those two extremes is trivial to copy to GitHub.) How often would they need to redesign the work sample? What about the objective "comparisons" that were touted as a feature of that method? What about the programmers who don't want to do a work sample? (They do exist!) Is there also a cost to filtering them out?
It's great that you're enthusiastic about work samples and want more companies to adopt them, but I see no slam-dunk evidence that they are the universal best method for every company.
Google is manifestly too capriciously selective in their interviews. They are infamous for turning down people who go on to do excellent stuff elsewhere. Watching acquaintances navigate their process makes me believe the infamy is well-earned.
That's an argument separable from the question of how best to design Google's interview process. I think Google's interview process should be work-sample based, like I think every tech company's should be. But you could argue that either way; we can both agree that Google's challenges are sui generis, and reasonable people might disagree on the rest of it.
Where I don't think reasonable people can disagree is on the question of whether Google overpays for engineers. They do. They pay a tax for the luxury of being capricious about who to hire. They can afford to do that, but most companies can't.
Finally, some validation that my interpretation of their process is not insane.
One thing that really rustles my jimmies is the constant assertion that "false negatives are (effectively) free". I think Google and the companies that hire like them seriously underestimate how much this costs: both the direct cost of spending so much to ultimately reject people, and the indirect cost of work that isn't getting done or is foisted on another overloaded engineer.
I worked in a small microcosm of Google's market (Google wants the "best of the best" software developers, and we wanted qualified app pentesters; both are a small subset of the overall market for software talent).
My experience is that running a hiring process that lights up only on the kind of talent that qualifies itself with a standard tech interview is ludicrously expensive. People that do well in standard tech interviews can work anywhere they want. If you can only hire those people, you are competing for talent with the wealthiest (or most overfunded) tech companies in the market.
Fortunately: performance in tech interviews is in fact not a good proxy for programming ability (in fact, there are ways in which being good at interviews can obscure deficits in candidates), and you can get ridiculous discounts to the price Google pays for talent if you don't try to hire people the dumb way Google does.
The root issue is that the costs of bad hiring are even higher than that. For a rapidly growing company, bad hires can snowball into a bad organization.
Apparently this is where I will disagree with Tom, because I think the costs of a bad hire are often overstated. Part of it is loss aversion, and part of it is an inability to quantify risk. In fact, the industry seems to be at a point right now where it would rather repeat this mantra than put effort into understanding the scope of the problem.
The costs of a single bad hire are overstated. The cost of systemic bad hiring, though, may grow geometrically. If that estimate is correct (bad hires beget more bad hires), then no price is too high to avoid a bad hire.
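To make the geometric claim concrete, here's a toy model (purely illustrative, with a made-up parameter r): suppose bad hires take part in later hiring, and each one effectively recruits r further bad hires per hiring round. Then:

    bad hires in round n    ~  r^n
    total after n rounds    ~  (r^(n+1) - 1) / (r - 1),  for r != 1

If r > 1 the total explodes geometrically; if bad hires are caught early enough that r < 1, it stays bounded. The argument really hinges on which side of 1 your r sits.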
And consider the US Air Force. They have an effectively unlimited supply of young people willing to learn to fly jets, qualified candidates even. So they can filter any way they like and still not run out of applicants. Their filters seem capricious because they sometimes are, and it doesn't matter.
Is Silicon Valley in that position? Depends upon who you are. I don't think Google for instance is running out of candidates.
I suspect that the idea that the costs of bad hires are huge comes from not identifying bad hires early. If you hire a jerk who makes poor coding decisions and fail to oust him for a year or more, the effect could be devastating. If said jerk is spotted in a month or so, it may not be so costly.
And really there are levels of bad hire. There's the bad hire who really doesn't know how to do the job and will never get good enough to do it. That person should be very easy to spot with not too much effort in the interview process. The dangerous hire is someone who on the surface can do the work but has a toxic attitude and/or doesn't learn/grow. I'm not sure putting someone through a torturous interview process is going to root that person out.
I think where a lot of elitist hiring is at now is that many companies aren't so much trying to filter out people who can't do the job as holding out for who they think are rockstars. So they put candidates through the wringer on the theory that whoever is left is rockstar material.
But many highly qualified individuals won't jump through hoops for any but the top-tier companies, and sometimes not even then. Furthermore, by definition, rockstars make up a very small percentage of the devs out there. What are the chances that every company holding out for elite coders even has any applying? Especially considering that elite people probably don't job-hop often.
> filter down from a large number ... to a small number ... doesn't say anything about how selective they are.
But that's what the definition of "selectivity" is for database retrieval. Selectivity == n_rows_selected/n_row_count. The "larger number" was the denominator and the "small number" was the numerator.
Your example SQL is not consistent with your previous sentence:
    SELECT TOP 1000 * FROM resumes ORDER BY received_date
Notice that nowhere in your isolated example is the total row count of resumes known? So yeah, we don't have the denominator needed to determine selectivity.
For the Harvard and Google examples, we do know the denominators (the total applications and total resumes received). Therefore we know the selectivity.
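To spell it out, here's a minimal sketch of both queries (assuming a hypothetical resumes table; the names are made up for illustration):

    -- Numerator only: picks 1000 rows, says nothing about the total
    SELECT TOP 1000 * FROM resumes ORDER BY received_date;

    -- Denominator: the total row count
    SELECT COUNT(*) FROM resumes;

    -- mathematical selectivity = 1000 / COUNT(*)

The first query on its own tells you nothing about selectivity; only with the second do you get the 1,000 / 1,000,000 = 0.1% figure.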
I suspect you're mixing up "mathematical selectivity" with "decision-process selectivity", because Google's internal decision tree for hiring might look like a black box, or nonsensical, to outsiders.
The denominator has no relevance to how selective your hiring process is. Your process does not become more exacting and precise simply because you received more applicants.
Thank you for confirming that you were using the colloquial sense of "selectivity", which doesn't require knowing the denominator, instead of the mathematical sense, which does.
That wiki page orders its ranking by mathematical selectivity, not colloquial selectivity. My previous comment about Google Inc. being more "selective" than Harvard is to be interpreted as mathematical selectivity. Sorry for not stating that more explicitly to prevent confusion.
So you don't care about on what basis selection is performed?
If your assertion is just that Google rejects a higher proportion of applicants than Harvard, that's... not at all interesting. The lottery rejects an even larger proportion of applicants for its "grant" program, but I'm not going to try to learn anything from how it goes about picking winners.
Here's another one showing 75,000 resumes received in one week: http://www.sfgate.com/business/article/Google-gets-record-75...
When you have the luxury to pick the best, from the best of the best, you don't have to follow the advice given by HN armchair quarterbacks like us.