Google's “Director of Engineering” Hiring Test (gwan.com)
1764 points by fatihky on Oct 13, 2016 | 923 comments



FWIW: As a director of engineering for Google, who interviews other directors of engineering for Google, none of these are on or related to the "director of engineering" interview guidelines or sheets.

These are bog standard SWE-SRE questions (particularly, SRE) at some companies, so my guess is he was really being evaluated for a normal SWE-SRE position.

I.e., maybe he applied to a position labeled director of engineering, but they decided to interview him for a different level/job instead.

But it's super-strange even then (I've literally reviewed thousands of hiring packets, phone screens, etc., and this is ... out there. I'm not as familiar with SRE hiring practices, admittedly, though I've reviewed enough SRE candidates to know what kind of questions they ask).

As for the answers themselves, I always take "transcripts" of interviews (or anything else) with a grain of salt, as there are always two sides to every story.

Particularly, when one side presents something that makes the other side look like a blithering idiot, the likelihood it's 100% accurate is, historically, "not great".


This looks like a typical pre-interview recruiter phone screen… they're looking for shibboleths that identify the candidate as a genuine computer person who took CS 101, and exclude candidates who spam every job with bogus CVs. I'd start every candidate with this screen, unless I personally knew them & was familiar with their technical ability.

  > none of these are on or related to the "director of engineering" interview guidelines or sheets
They'd be internal to recruiting, so you wouldn't see them unless you were heavily involved (doing interviews and recruiting trips isn't being heavily involved). They're for any recruiter to use to quickly eliminate bogus applicants.

  > Particularly, when one side presents something that makes the other side look like a blithering idiot, the
  > likelihood it's 100% accurate is, historically, "not great".
You can just outright call him a liar… I'd expect this to be a fairly accurate report. It looks like the recruiter misused the screen; instead of eliminating obviously bogus candidates, they used it to eliminate a candidate who may or may not get an offer (and thus a commission). They should have proceeded to the technical phone screen stage. If the guideline on the recruiter screening is: drop anyone with <100% correct, then I think that's silly.


I'd hope it's not too typical, since four out of the ten official answers are wrong, and even one of the questions manages to be wrong. (Specifically, the "why is quicksort the best?" is just completely ridiculous.)

It's one thing to blindly apply a simple questionnaire without thinking about the answers that come back, and yet another thing to do it with a questionnaire that doesn't even get stuff right.


I wouldn't be surprised if the recruiter just googled to find a list of questions and answers. This candidate probably isn't even on any official radar. The recruiter probably just uses this as a means to evaluate candidates before they officially call dibs on them. Google could very well be different since they do many things differently, but recruiting has always been a sales position, with everything that comes with that, namely leads, quotas, conversions, etc.


I don't think that's what happened. The questions look too familiar to me, and I've been through the SRE-SWE interview process which is what the top-level comment talks about.


Maybe it's the whole "better to have false negatives than false positives" philosophy Google espouses?


The problem is, once you have a crud ton of false negatives, people stop wanting to apply to work for you, especially when you get excluded via junk like this. And every false negative that posts about it online... well, this post is at +1363 right now?


That's part of it, but the other part is what you mentioned earlier--leads, quotas, conversions, and don't forget diversity initiatives, inexperienced recruiters, and the fact that the first part of the funnel has to be dirt cheap to work at scale.

For what it's worth, lots of other companies seem to use almost the exact same process.


At one point at least, physical security and recruitment were about the only contractors Google used.


They do seem typical; the recruiter asked many of those same questions when I interviewed for an SRE position back in about 2004. I particularly enjoyed the bit-count question; I went back later and confirmed that the bit-twiddling approach was faster on the machines I had handy. Large lookup tables have poor cache behavior.
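For the curious, the two approaches being compared look roughly like this (a sketch in C; not the exact question, and the timings depend heavily on the machine):

    #include <stdint.h>

    /* Kernighan-style bit twiddling: one loop iteration per set bit. */
    static int popcount_twiddle(uint32_t x) {
        int n = 0;
        while (x) {
            x &= x - 1;   /* clears the lowest set bit */
            n++;
        }
        return n;
    }

    /* 8-bit lookup table: 256 entries, four lookups per 32-bit word. */
    static uint8_t tbl[256];

    static void init_tbl(void) {
        for (int i = 0; i < 256; i++)
            tbl[i] = (uint8_t)popcount_twiddle((uint32_t)i);
    }

    static int popcount_table(uint32_t x) {
        return tbl[x & 0xff] + tbl[(x >> 8) & 0xff] +
               tbl[(x >> 16) & 0xff] + tbl[(x >> 24) & 0xff];
    }

Which one wins depends on the data and the cache; larger 16-bit tables are where the cache effects mentioned above really start to matter. (These days compilers also offer __builtin_popcount and CPUs a POPCNT instruction, which moots the whole exercise.)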

On the other hand, the recruiter did not tell me whether I got the right answer (or whether I missed any). It was pretty clearly a scripted initial phone screen with someone who wasn't a programmer.

Oh, and they didn't ask anything truly ridiculous like the QuickSort question.


I was asked the "what syscall returns an inode?" question (and agree with DannyBee that this is extremely similar to my successful SRE screen) and I answered stat() without the clarification because I understood what the screener was doing and the parameters in which she was operating. That context on how phone screens work is missing from this transcript, but it's also unfair to expect that sort of context from a candidate because not every interviewee is familiar with the standard Google style tech interview.

The screener is the wrong person to walk down Pedantry Lane or Hexadecimal Packets Street and that's the sort of thing you save for the actual interview. But yes, I agree that it's shitty that the incentive is to answer for the test instead of the exact truth. (I wasn't extremely supportive of the interviewee once he turned slightly sarcastic and rattled off hexadecimal bytes instead of just saying "SYN" and "ACK," though.)

As an unrelated addendum, I'm intrigued by the following four things:

    1) The author wrote a Web server and framework, G-WAN
    2) I've seen G-WAN advertise itself questionably in the past w/r/t perf
    3) gwan.com is powered by G-WAN
    4) Under Hacker News load, the entire gwan.com domain is hard down
I'm not drawing a conclusion, but it is tempting.


No software can perform better than the hardware allows it to. End of argument. Even if you write 100% optimized ASM tailored to the hardware and workload, you can still kill it hard with enough requests. For all we know he hosts the website on a free tier of whatever to show how well it performs. It being down doesn't tell us anything useful other than that the current workload exceeds its capabilities.


> I wasn't extremely supportive of the interviewee once he turned slightly sarcastic and rattled off hexadecimal bytes instead of just saying "SYN" and "ACK," though.

His response:

> in hexadecimal: 0x02, 0x12, 0x10 – literally "synchronize" and "acknowledge".

What do you think SYN and ACK stand for? Could it be "synchronize" and "acknowledge"? Moreover his point that knowing the bytes is more useful when you're looking at packet dumps is valid.


The messages contain a lot more than the flags, though, so those bytes aren't enough, and he didn't mention SYN-ACK.


They're bits. SYN can be represented as 0x02, ACK can be represented as 0x10. 0x02 BITWISE-OR 0x10, ie SYN BITWISE-OR ACK or 'SYN-ACK' colloquially, is 0x12.

"in hexadecimal: 0x02, 0x12, 0x10".


Those are just flags; the message contains much more than the flags. Therefore it is wrong to say that they just send the flags and that the flags are equivalent to SYN and ACK.


Although I can see why it's tempting to poke fun at this situation, his site could be down for any number of reasons not directly related to the quality of the code he wrote for G-WAN.


Even if I give you that, it's still 30% of the official answers that are wrong, and 10% of the questions.


> I'm not drawing a conclusion, but it is tempting.

Except that you did draw a conclusion.


You are drawing a conclusion and you're making it public to discredit this person.


It is true that gwan.com is (still) down.


"They'd be internal to recruiting, so you wouldn't see them unless you were heavily involved (doing interviews and recruiting trips isn't being heavily involved)."

Actually, this is a super-bad assumption. Pre-screening questions, etc., are all public to Google internally. There are no magic internal-to-recruiting parts to the questions, and they are in fact listed as SRE pre-screening questions, so ...


But they appear to be changed in subtle ways from what's listed on other sites. For example, googling for: Google SRE interview questions inode

returns a few hits, including:

https://www.glassdoor.com/Interview/Linux-system-call-for-in...

which lists the question as "system call for inode data" - which is importantly different from a system call to return an actual inode.

This post says something similar: http://gregbo.livejournal.com/182506.html

"There were some questions I just didn't remember the answers to, such as "What system call gives all the information about an inode?" and "What are the fields in an inode?""

(Argh, the blog post is down, so I can't compare some of the others, but several of them seemed to have been changed in ways that made the question itself seem wrong.)

((Thanks to leetrout below for bopping me on the head with the google cache. Next bit added thanks to said bop.))

Another one: The blog post lists "what is the name of the KILL signal?", but googling for: google sre interview questions kill signal

turns up this post on glassdoor: https://www.glassdoor.com/Interview/site-reliability-enginee...

Which lists the question as "What signal does the "kill" command send by default?"

That matches much more the answer SIGTERM that the interviewer was described as insisting on.

That suggests a few likely possibilities: (a) The interviewer misread the questions; (b) There was a horrible communication failure between the interviewer and the interviewee; (c) The interviewee failed to actually listen to the questions before answering.

I have no information with which to assign weights to those possibilities, but all of them seem more likely than "the interview questions themselves are actually this horrible" (they're not as broken as the blog post made them out to be. After writing this, I looked.)


I got asked some of these questions literally yesterday by Google and you're right. The wording was how you've presented it, not how he did.



Anyone asking a question should be qualified to interpret the answers in context. If they aren't, use a multiple-choice quiz or something instead. These questions/answers are just ludicrously out of sync.


> They'd be internal to recruiting

I managed to find them, and I don't work in recruiting; they are for SRE pre-screens. The guy misunderstood most of the questions, which is why he failed, and then worded them incorrectly on his blog. It wasn't the fault of the questions or the interviewer.


The recruiter misunderstood the answers and/or is not qualified to ask those questions. Usually, when you ask someone something that is not literal, as in "what is 1+1?", you can't expect them to be literal. Take a question like "Why is quicksort the best sorting method?": the guy gave a very good answer showing that he has perspective and is able to make a valid argument. The recruiter, on the other hand, just read the paper and completely disregarded the fact that the person he was trying to recruit is a valid candidate.

Also, at the end, the recruiter's attitude was awful. Like, what is that? He was reading answers from a paper, couldn't make valid arguments back to the candidate, and at the end turns around and says to the candidate, "sorry, my paper says you don't know this and that, goodbye"?


No, I meant that the interviewee misunderstood the questions/answers given by the recruiter and thus misrepresented them when he wrote the blog post. Since he couldn't even get the questions right, I highly doubt that he gave a correct representation of the recruiter's attitude either.


Which of the following seems more likely?

A recruiter who was already giving the guy the wrong interview, and whose job revolves essentially around HR and sales, made mistakes in asking a series of technical questions.

An expert with decades of relevant technical experience misunderstands and confuses basic networking and system topics.


Which of the following seems more likely?

A person fails to read a question verbatim.

A person who has been the "smartest person in the room" for decades has an inflated view of his fluency on a topic and makes mistakes in his favor when he tries to reconstruct the questions from memory.


That's a big claim. Can you share some of the actual wording of the questions?


If you work at Google you can find them by searching for it.


Maybe not such a good idea to post an internal link? :)


Do you have proof of this?


So you're saying Google's recruiters don't say what position they are interviewing for, and that they found an engineering manager with 20+ years of experience, holding patents on computer networking, under-qualified for an ordinary site maintenance position. Well, that sounds like a dumb recruitment process.


> they found an engineering manager with 20+ years of experience, holding patents on computer networking, under-qualified for an ordinary site maintenance position.

To be fair, at previous companies I've interviewed people who had patents and 15 years at IBM on their CV and who completely failed even the most basic system / coding questions (FizzBuzz style).

There are a lot of people that read great on the CV but then it turns out that they mostly kept a chair warm and organized meetings over the last decade without actually retaining any technical knowledge.

Not saying that was the case here, but it happens and it's probably worth checking people on their stated qualifications.


Perhaps that suggests you're giving them the wrong interview.


Well, general interviewing (unrelated to tech) contains various amounts of "are you lying on your resume" type questions. If someone walks in with a breakdown of 10 years dev, 5 years management, they should be able to at least comfortably answer system/coding type questions. As in, if you do something every day for 10 years, you don't forget all of it in 5.

I had a candidate in a few months ago that was interviewing for Software Development Manager, so he got an initial phone screen and then a face-to-face with myself and another dev on the team he'd be managing. I was impressed with how little he knew about programming.

"Name some data structures." "What does MVC stand for?" "Name some design patterns" etc. All of which were unanswerable. Generally when it becomes clear someone was dishonest about their skillset, the ability to get hired for any position becomes impossible.


How can you not know what MVC stands for? It's pretty much a buzzword!


I mean, yeah, 99% of candidates should know what that means because it is an extremely common initialism. Although I could see how some engineer who worked on networking drivers for 10 years might not be up to date on the design patterns of frontend engineering.


That's exactly what happened to me. I was stuck in the embedded-systems world right out of college, and then one day I interviewed with Google; they were asking me questions clearly looking to hear "MVC" in my answer, but I just didn't know it back then...


Not all programming/engineering circles use the same buzzwords. For five years my mobile development groups used the concept without the acronym.


Agreed. I interviewed a few QA candidates at a previous company that used a term completely differently than we did. When I rephrased the question from defining the term to "what kind of test would you run in this situation" I got the kind of answers I would expect. It's far more important that a candidate understands the concepts needed to solve a problem, than that they have memorized a term.

Hell, someone could be able to define MVC and explain how you would use it, but have no idea how to actually implement something using it for a given programming language.


Even then it's worth remembering not every MVC is the same. Fat/slim models. Intelligent/simple views. There's lots of approaches to even a well known paradigm.


Knowing proper terminology is necessary in order to stay up to date with developments in your professional subject area. If concept X is an established concept in your professional area of expertise, and you don't even know that its name is concept X, then you likely have not read much about concept X, and consequently, are likely not up to date with current developments related to concept X. This isn't just semantics, it's about professional literacy.


I'm sure a lot of people know what MVC stands for. I don't think there's anyone on the planet who can be sure they know what it really means.


Right. Everyone and their mum know what M and V stands for, right? Now .. C? C is tricky. Please don't ask any further questions about C, will you?


I'm not sure if this is a joke. Model and View are really clear. Controller I usually find munged in with the View and it's not always profitable or clarifying to separate it.


My kick-out questions:

"Could you write out what an HTTP request and response looks like on the board?"

I'm really surprised at how many people can't do this. If you've spent five years developing for the web, surely you've had to look at raw requests, either debugging with netcat or Wireshark, or just looking at the information in the Chrome/Firefox debugger?

"What's the difference between a GET and a POST request?"

"What is the difference between a statically typed and a dynamically typed language?"

I had one candidate try to tell me Java was dynamically typed and Scala was statically typed. It was for a Scala position. They also said "statistically typed" instead of statically, even after I corrected them.

-_-


I believe 90% of my coworkers and former coworkers would be unable to answer the HTTP response question.

And 95% haven't used netcat or wireshark. I wouldn't have either, if it wasn't for some particular work related to messaging.

They're able to develop reasonable line of business websites in spite of that.

I would be extremely worried if they were unable to answer about the difference between GET or POST, or the difference between statically and dynamically typed languages, so I agree with those.


I basically lived in Wireshark for a couple of years working for a VoIP company and still use stuff like curl all the time, and I don't think I could walk through an HTTP request off the top of my head.


GET / HTTP/1.1\r\n and some kind of sensible response is not too much to expect someone to know. HTTP is super easy and I see the HTTP transaction test as "did you ever get curious as to how exactly a core part of the current Internet actually works". I'm sure that there are app developers out there who can spin crud stuff all day and have no idea about this, just as there are curious people who couldn't stand up todomvc to save their life, but in general, all of the most talented people I've worked with knew their stuff front to back, and had at least a few areas of expertise.

The workings of CGI are also cool to learn about, since it almost seems too simple.
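To that point, a CGI program really is about this simple (a minimal sketch; the server passes the request in environment variables like QUERY_STRING and reads the headers and body from the program's stdout):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const char *qs = getenv("QUERY_STRING");     /* set by the web server */
        printf("Content-Type: text/plain\r\n\r\n");  /* headers, then blank line */
        printf("Hello from CGI. QUERY_STRING=%s\n", qs ? qs : "(none)");
        return 0;
    }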


> I see the HTTP transaction test as "did you ever get curious as to how exactly a core part of the current Internet actually works".

Sure I did.

Then I forgot most of the details because they didn't matter, and I knew I could look them up quickly if I ever needed to write a HTTP client/server for some reason.


You would've gotten it wrong though!

You need two newlines to finish the request, plus the HTTP/1.1 standard requires clients to send a Host: header for all requests.

Not saying every interviewer would care about that in an early screening process.


If you answer "GET / HTTP/1.1\r\n" I'm going to ask you if you left anything out.

Because you did: after that you have to provide a Host: <hostname> header.


But could you describe the general structure?

Yeah, expecting many people to be able to write out a complete HTTP request from memory, without a reference to look at, is a stretch. But the general structure of an HTTP request is something so basic to web development that asking what the structure of an HTTP request looks like isn't an unreasonable expectation.

Request line (method, URI), header(s), empty line, body...
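For reference, that structure in a minimal HTTP/1.1 exchange looks something like this (a sketch; real header sets vary, every line ends in \r\n, and a blank line separates the headers from the body):

    GET /index.html HTTP/1.1
    Host: example.com
    Connection: close

    HTTP/1.1 200 OK
    Content-Type: text/plain
    Content-Length: 13

    Hello, world!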


There is some truth to the saying that some people don't have ten years of experience; they have two years of experience five times in a row.

Wireshark or tcpdump is a power tool, and learning to use one does show whether you got more experience understanding the lower levels or stayed at requirements-and-tests. (Not necessarily bad, but a good "fork" to jump off from.)


The number of web devs I've encountered who regard my ability to talk HTTP over telnet as black magic makes me sad.


In all fairness, if you can talk normal HTTP 1.1 over telnet with some service, someone configured TLS wrong ;) And if you can talk HTTPS over telnet unassisted... well, I am truly impressed.


There's always openssl s_client


Yep, and when devs watch me key in the s_client pipe to OpenSSL to dump the cert info it's like I've become Neo and entered bullet time. I guess they don't need to know this stuff, but trying to do things like editing a hosts file in OSX, flushing dns, opening an incognito tab, looking at the SSL cert through the GUI, manually comparing it to one in an editor....vs a pretty short one-liner with curl or OpenSSL and a diff....I guess I'm either biased or lazy. I also almost never get asked WTF I just did, just a "wow, thanks" at most.
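For the curious, the one-liner in question is usually something along these lines (example.com is a placeholder):

    echo | openssl s_client -connect example.com:443 -servername example.com 2>/dev/null \
      | openssl x509 -noout -subject -issuer -dates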

A previous employer had a sysadmin wiki. We call it Devops now, but I really liked working with the plain-text files of Dokuwiki there. Confluence is good for some things, but as a notebook of shell snippets and when to use them, it's not great.


> I also almost never get asked WTF I just did, just a "wow, thanks" at most.

Asking "wtf did you just do" is responsible for probably 1/5 to 1/4 of the professional knowledge I have today. It's sad that many people ignore that people will often teach you their little tricks if you ask.


What? HTTP 1.1 doesn't require TLS.


Scala is statically typed.


> "Could you write out what an HTTP request and response looks like on the board?"

Why should anyone remember what an http request or response should look like? Statically typed vs. dynamically typed language?

Fuck.

Are these entry-level positions or for someone with 10 years of work experience? A simple search on Google can tell anyone the answer to these questions, so why do you expect people to carry an imprint of it in their memory? If the problem they'll work on mandates knowing these things, it'd be pretty easy to solve with just one search. It is exactly questions like these that are worth kicking the host organization in the butt over.

Either your interviewing process is hilariously stupid or you're just spicing it up to boost your ego here.


Static versus dynamic typing is so fundamental that I don't see how a programmer could be remotely competent without having been exposed to those concepts enough to have internalized them. It would be like an accountant not knowing what the number 4 is. Yes, you can look it up, but if you need to then how did you ever get this far?


OK, ask me that question about defining the difference and I'll argue with the question, and back up my argument with examples of how type systems are far more of a spectrum of different cases than a stark static/dynamic binary.

And then your non-engineer phone screener who's expecting the answer to match the scripted sheet will conclude that I don't know this "fundamental" thing and thus am unqualified.


This isn't a question a non-engineer phone screener can ask. Coming up with first pass filters that don't require an engineer to interpret is harder.


This is proposed as a question that an engineer would ask, not some base-level screener.


Which would be true, or rather: over-qualified.


> It would be like an accountant not knowing what the number 4 is.

It's a hypothetical no-go! Every person, even a fourth grader, knows the number 4. So why ask a question that measures their ability to remember 4, say 4, or show that they know 4?

> I don't see how a programmer could be remotely competent without having been exposed…

Share this link with them:

http://stackoverflow.com/questions/1517582/what-is-the-diffe...

Invest in people and people will invest back in your business. The interview process that I follow at my workplace has just one goal: to assess whether or not it'd be great to work with this person and spend over ~50 hours per week with them.


Are you hiring fun people who know nothing about computers? Or are there actually more criteria than you let on here?


> hiring fun people…

Absolutely! This is super super important. Fun to work with, not annoying to waste time with.

> know nothing about computers

It's sad that you think this way of people who couldn't answer your questions at the expected level.

> Or are there actually more criteria than you let on here?

Yes! One way to know whether they're any good or not is by giving them a problem statement like so:

'Design X; feel free to choose a language that's suitable for this problem', and then maybe proceed to hint with: 'You might want to look at the advantages of static versus dynamic typing'… and then let them ask whatever questions they want to ask, or read up, or search, or start implementing whatever.

Observe what they do -- how fast they can get to a decision on which language and why, how they would make X (a breakdown of steps), or whether they can dive in and start making X right there. Note that if they have the theoretical knowledge you seek, it will naturally work to their advantage during the interview. Or sometimes not.

Of course, this process may not work for you as it does for us -- so seeking direct answers about static vs. dynamic typing may not be such a bad question after all (I get it), but expecting people to accurately remember what an HTTP request or its response looks like may not be fruitful at all. It can throw good people off guard and ruin the rest of the interview for them.


Well, following Netflix's mantra, it is a team and not a family that you are hiring for. Anyone can Google and find answers; that doesn't mean you would hire everyone, would it?

There are a number of basic items that a competent programmer needs to know off the top of his head. If they had to google for every single item, then their productivity goes down the drain and so does the entire team's productivity. You should fix your hiring.


It is also a ridiculously dogmatic question. Many people believe in the fallacy that static typing makes safer programs, for example, and expect that somewhere in the answer.


How is it dogmatic? Sure, there's a lot of dogma around which one is better, but simply explaining what each one is and what's objectively different about them isn't remotely dogmatic.


Could you explain how static typing makes less safe programs?


I have yet to see a large statically typed program that didn't -- somewhere -- run into the limits of static typing and contain a set of workarounds, using void* or its linguistic equivalent. That's code a dynamic language doesn't need.

The only code you can be sure isn't buggy is code that doesn't exist.


void* is usually not a symptom of the limits of static typing, but of the limits of the [type system] design or of the human brain. You can think of it as "OK, I give up. Anything can be passed here, proceed at your own risk, the compiler will not save you here, errors will show up at runtime." Even memory-safe Rust does not do without such unsafe blocks. In dynamically typed languages, though, that is everywhere. I have said this before: the safety benefits of static typing show up when you are working with data structures, not just simple variables. Imagine you have an external endpoint or library call that is specified to return a single object and does exactly that. At some time after release, you are the maintenance programmer responsible for implementing spec changes:

  * The object returned no longer has member/property x, it is obtained by other means;
  * The endpoint returns list of such objects.
How sure are you that tests in a dynamic language cover these cases? My experience shows that tests very rarely get designed to anticipate data changes, because the data is driving the test design. Which is more likely for a test: a) to test whether the returned object contains keys x, y and z; or b) to check whether the returned object is_list() (see appendix)? Static typing covers such cases. Static typing is not something that magically saves one from shooting oneself in the foot, but it is nevertheless a safety tool that CAN be used. It is of course a burden if one does not intend to use it, and that is the core of the debate.
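A minimal C flavor of the first spec change above (the struct and field names are made up): once member x is removed, every access fails at compile time instead of surfacing at runtime.

    struct endpoint_result {
        int id;
        int x;   /* the spec change removes this field */
    };

    static int handle(const struct endpoint_result *r) {
        /* After the field is deleted, this line becomes a compile error
         * ("no member named 'x'") at every call site -- the dynamic
         * equivalent (a missing dict key) only fails at runtime. */
        return r->x;
    }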

Fun thing: in the second case if your code manages to convert input list to a map and assign one returned object to a key that coincides with the removed property and map access looks syntactically the same as property access (a very specific set of assumptions, though), the bug can butterfly quite deep into the code before manifesting :)


Static typing is basically a bunch of free type-based unit tests. You can write safer programs in dynamic languages, but you need to write and maintain a lot more tests.


You can't compare static + N tests, vs not static with M > N tests.

Compare static with N tests, vs not static with N tests. In what case would the not static be safer?


If the type system is not expressive enough and you have to get around it?

The claim that dynamically typed languages allow code to more closely follow the business logic has merit. And you could follow from that to claim that the type system could be causing more bugs (i.e., less safe).


That's a preposterous attitude. Just imagine if we took a similar approach to hiring for other kinds of jobs:

"OK, so you'd like to work here as a mechanic. What's the difference between automatic and manual transmission?"

"It's not fair to expect me to know that off the top of my head. If I need to know, I'll just do a Google search."


"OK, so you'd like to work here as a mechanic. What's the difference between automatic and manual transmission?"

Nope, more like "can you write out on the board what types of connectors are used in the car cooling system and in what order".


That's a crap comparison.

You're gonna have a hard time drawing a comparison between a line of work where you build things and one where you fix things.


Ok, how about "What is the firing order on a Chevy LS1?" ?


I would not want to work for or with him. If he asked me that in an interview, I'd walk out... if he thinks that's what a good computer scientist should know...


I'd be able to write an HTTP request by hand; I've done that quite often. However, I would not expect that to be a common skill. Looking at something, even often, does not in any capacity mean that you would be able to reproduce it from memory.


Exactly. These are memory tests, not ability tests. Beyond a very basic level, memory tests are too random to be useful.

I once aced a geography exam because I happened to read up on the economics of Nigeria just before I took it. By sheer luck, there was a question about Nigeria in the paper.

If I'd read about Zimbabwe instead I'd have been screwed.

Neither possibility provided much insight into my competence as a geographer.

Even if a job spec needs specific knowledge of key facts, you can't generalise from pass/fail memory questions to broad spectrum competence, or lack of it.

If a candidate has no idea what an HTTP request is, that's one thing. If they know damn well what a request is but can't list all the elements in a stressful interview while you're staring at them - because in fact they spent the last year working on database code, and the API stuff was the year before that - that's something else entirely.


Why is it important to know the difference between statically and dynamically typed languages? If one writes in only one of those (or one set), it is not important to him/her and doesn't specifically make him/her a worse programmer in that particular language.


Knowing a cursory difference between a statically and a dynamically typed language in this day and age is not an unreasonable requirement for many developer positions, especially web development, where you're often using a mix of languages.

As always, this sort of question is a test of competence by proxy and there are usually outliers, but statistically speaking, I think you'll find a very high correlation between inept programmers and people who don't know the difference.


If they didn't know it, I'd want to dig down into whether they understand the specifics of their particular language at least.

I actually just sat down in a meeting with a dozen programmers, some of them with decades of experience, and half of them didn't know what functional programming was.


And the half that didn't know were much worse programmers than those who knew?

Your example shows that not every programmer has to know that.


General domain knowledge. We do Scala and a little bit of Python here and there. You should know the difference to show you're well rounded. Senior devs should have some experience in both types.

I have a follow up question, "What are the advantages of a dynamically typed language over a statically typed one?"

This one kinda exposes the "Java zealot" side of programming. If you love Scala and you're applying for a Scala position, you don't often think like this. Being able to think critically about the things that are harder in Scala, but would be easier in a language without strict type checking, is another good way to gauge whether people can think critically.


I agree. Why the hell would you ask someone at that level basic questions like FizzBuzz? It's absurd. I also tend to shy away from asking coding questions in interviews; they don't tell me much about aptitude for critical thinking and culture fit. Skills can be taught, but culture is much harder. ... But I'm not saying don't throw in some questions that prove they are actually competent; just be casual about it.


I think coding questions are really important. You see their logic flow. Now stupid coding questions (in a list, find all the number pairs that add up to another number in the list) are terrible. They're complex and even good programmers need time to think about them. Fibonacci is one that people expect, so they look up all the variations and you get people who are good test takers (would ace a GRE/MCAT) but not good designers.

You want a simple question that isn't common, but that shows how they break down a problem under stress. Example: you have an input with paragraphs at 80 characters. Write a function to return the same paragraphs wrapped to 40 characters. You cannot break a word and must maintain paragraphs.
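A rough sketch of a greedy answer to that one (in C; the 40-column width and the no-word-breaking rule are from the question, the hard-coded input in main() is just for illustration):

    #include <stdio.h>
    #include <string.h>

    #define WIDTH 40

    /* Print `text` wrapped to WIDTH columns, never breaking a word. */
    static void wrap_paragraph(const char *text) {
        char buf[4096];
        strncpy(buf, text, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';

        int col = 0;
        for (char *word = strtok(buf, " \t\n"); word; word = strtok(NULL, " \t\n")) {
            int len = (int)strlen(word);
            if (col > 0 && col + 1 + len > WIDTH) {  /* word won't fit: new line */
                putchar('\n');
                col = 0;
            }
            if (col > 0) { putchar(' '); col++; }
            fputs(word, stdout);
            col += len;
        }
        putchar('\n');
    }

    int main(void) {
        wrap_paragraph("The quick brown fox jumps over the lazy dog and then "
                       "keeps running well past the original eighty column mark.");
        return 0;
    }

The interesting follow-up discussion is usually about what to do with words longer than the width and whether blank lines between paragraphs are preserved.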

Great design questions: a word problem (You have an auto shop with staff and customers. Customers can own multiple cars. A staff member gets assigned to a car with a work order...)... draw an ER diagram. This is actually a pretty low stress question. It should be straightforward. If someone draws a terrible ER diagram with lists in tables and no normalization, or unnecessary relationships (or you have to keep asking them to label 1-to-n/n-to-1 relationships and they struggle), you know they're not going to be good at designing database schemas.

Another great general knowledge question: "A user types a web address into a web browser and hits enter. Describe what happens. Go into as much detail as you can." This gives people a chance to elaborate as much as they can. People can talk about DNS, HTTP, load balancers, HTTP request/response, cookies, web apps vs static content...

Questions need to be geared to the job. You don't ask someone to draw an ER diagram if they're being hired to rack servers and setup VMWare. Likewise you don't ask a web developer to write a function to do matrix multiplication.


I often ask the web browser one and find it quite illuminating. Best answer so far started with something like "Well, there's a microswitch in the keyboard if it's a decent one, and a circuit that debounces the input - err, is it a USB keyboard or a PS/2 one? Hmmm... How long do I have to answer this question?" THAT is the guy you want to hire...


Last time I got that question, I started with nerve impulses.


Still not enough: If you wish to explain a web-request, you must first invent the universe.


"Describe an HTTP request? Tricky.... What's the mass of the electron in this hypothetical?"


I did actually start my answer to that one with 'Look, I'm just going to skip over the microcontroller in the keyboard and the USB protocol --- is that OK?' and was met with a calm, 'That's fine.'


I feel like there is a happy medium there.


> In a list, find all the number pairs that add up to another number in the list.

You're saying this is a bad question because it's too complicated? Am I missing something? It really doesn't seem more complicated than the paragraph question to me, but maybe I'm having a brain lapse.


The entirety of HN seems to have something against Competitive coding.


My company has been giving FizzBuzz to students applying for internships, in any language they wish, with extra style points available.

The results speak for themselves. All the good applicants do it in no time, without hesitation, and give a perfect answer, usually with some style points on top. The ones who have second grade coding skills always have something wrong with it.

It's a good 5 minute test whether someone can code or not. It shouldn't be the only test, of course.


How do you know the people failing your interview process have "second grade coding skills"? The fundamental challenge with evaluating interviews is that companies don't hire people who flunk interviews - so there is no easy way to reliably measure the false negative rate. Does FizzBuzz ability correlate with coding ability? Maybe, but you'd have to hire people who fail FizzBuzz to definitively answer the question. I know that I use Google extensively at work - interviews don't allow you to use search.


I'm not sure about OP, but there is a tech company that has said it hires people who fail their interviews occasionally to see if their interview process is working. That company is the one that is the subject of this thread.


Can you cite your source? I haven't seen this anywhere.


I haven't read it myself, but I heard that's what Laszlo Bock said in Work Rules.


We actually do let people use Google during our code interviews. They'll use it at work, so why not.

We do watch them work, though, so if they just copy and paste from Stack Overflow and don't understand the problem, it's pretty obvious.


It depends on the questions.

If you require using real, compiler-correct language in a coding exercise, and the problem is not trivial, then allowing search is more than fair.

But the point of FizzBuzz is that it is such a trivial problem that it really should not require anything more than an understanding of basic programming logic and constructs.

In my (limited) experience, there were instances where the candidate could not even decide on a programming language to use, I told them to use pseudo-code and they still flunked horribly.

Aside from that, FizzBuzz is rarely a dealbreaking task in itself; it tends to correlate pretty well with overall performance. I would be surprised to see someone failing FizzBuzz and excelling in the rest of the interview (once again, in my limited experience).


These were done at recruitment events at universities, and the applicants were free to access Google if they wished. Some guys even went to the computer lab to do the assignments on a computer and then returned a printout of their code. And we were completely fine with that.

But really, if an applicant needs to google to solve FizzBuzz, they don't have a firm grasp of the fundamentals. You're required to write one loop, a few if/then/elses and understand how the modulo operator works. Our jobs are much more demanding than that.


> Why the hell would you ask someone at that level basic questions like fizz buzz?

Because there are people applying for software engineering jobs that still can't answer those questions.


If you have a CS degree and cannot answer this type of question in your language of choice, you simply aren't ready for even a junior position in my opinion.

This type of coding exercise can potentially answer more questions about the candidate in two minutes than 30 minutes of softball questions about the candidate's past experiences.

I think that people who disagree simply haven't done much interviewing or haven't worked on a team with someone who couldn't do much more than copy/paste code from SO.


> I think that people who disagree simply haven't done much interviewing

Absolutely. Last time I went through trying to hire people was about a year ago. Easily 90% of the applicants we saw were completely unqualified. You have to have a way to weed them out.


I don't ask multiple questions like FizzBuzz, but I do ask for FizzBuzz. (I will explain the modulo operator if necessary because it doesn't come up that often in web development and many people forget about it until prompted.) Everything else about FizzBuzz (loop over a range, use a conditional, define a function, compare, etc.) is so basic you would think you wouldn't need to test it - but then you run into a person with 10 years of experience who can't do it.
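For anyone who hasn't seen it, the entire exercise is just this (a minimal C version; the modulo checks are the only "trick"):

    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0)      puts("FizzBuzz");  /* divisible by 3 and 5 */
            else if (i % 3 == 0)  puts("Fizz");
            else if (i % 5 == 0)  puts("Buzz");
            else                  printf("%d\n", i);
        }
        return 0;
    }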

It's a (sadly) useful screen. Even more sad when you realize how popular and widespread that particular question is.


The question is, "why wouldn't you?" If the person is competent they will dismiss it in seconds and you can move onto something more interesting.


But this is Director level. You are wasting your time and their time. There are far more important things to be factoring in at director level.


Not being able to answer even a simple coding question with for-loops is a really bad sign, even if the question is "beneath" the candidate's level.

I'd expect any technical candidate to be able to do at least a fizzbuzz-type question.


Now take into account stress, lack of preparation, an environment the person is not used to, syntax patterns that are unusual to them, biases against them, their way of talking, their appearance, etc., and you get yourself people who are good at your kind of interview, in your biased view. You can only hope they are at least average at their job.


The most beautiful and elegant part of coding is the logic, not how to use a for loop. Any question that can be answered with Google should be forbidden from an interview test. Show him a method and ask him how he can improve its performance. Ask him an opinion-based question on OO design.

If you were hiring a house builder, you would not ask him what a brick looks like, right?


Yeah, but if you have to get through 100 house builder interviews and half of them don't know what a brick is, it saves a lot of time, no?


If you were hiring a house builder, you would not ask him what a brick looks like, right?

The problem with this is that a home builder/contractor will have a long list of references, and possibly examples of her work available for examination. Many engineers search for jobs while still employed, so they generally don't include as references co-workers and current managers. Further, if your employer doesn't allow you to open source your work, then you need to do open side projects to have any sort of real resume prospective employers can examine (and this is problematic since your day job may already take more than 40 hours of your time).

So, no, I don't need to ask a contractor if he knows what a brick looks like, but I do need to look at his references, look him up on Angie's List, and post to local message boards about his work. And, of course, I'm not an expert on home building, so it would be unreasonable to ask him questions about carpentry or framing.


I see where you're going, and I generally agree with you, but I think that every programmer should at least be able to FizzBuzz, just as every architect or master home builder should be able to answer the question "which one of these is a brick?"


I think there is no point in assessing anything that doesn't take years to learn. And it's fairly easy for any team to come up with some fundamentals that a candidate should know. There are more important qualities than knowledge, though, as Google-funded research suggests, like empathy.


If the interview distinguishes between people you want to hire and people you don't it's a pretty good interview, surely?


It's fair to discuss in the abstract, but seriously, in the OP's case the interview didn't fail him for an ordinary site maintenance position because he wasn't capable; it failed him because the recruiter was incompetent.

Of course, a sample of one (anecdata), which is most likely the minimum of the distribution, is always the worst way to judge a distribution, but this is still upsetting.


Agree with everything you said... but I find it not applicable to this particular candidate given his answers.

It's possible that he got frustrated, became condescending towards the recruiter, and the recruiter decided to screen him out.

There are plenty of companies who turn down candidates that are false negatives for various reasons. Author should probably not take that personally and just apply again.


> completely failed even the most basic system / coding questions.

But could they at least tell you why quick sort was the best sorting algorithm?


Apart from the fact that it isn't always the best sorting algorithm, and its pathological worst case is in fact as bad as bubble sort's worst case (O(n^2), with a higher base operational cost, as at least bubble sort's worst case is somewhat cache friendly). With a naive first- or last-element pivot, this happens in three cases: 1) all the elements are sorted in descending order, 2) all the elements are sorted in ascending order, or 3) a special case of 1) and 2) combined, all the elements are equal.
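To make that concrete, here is the textbook variant being described: a quicksort with a last-element (Lomuto) pivot, which is exactly what degrades to O(n^2) on sorted or all-equal input. A random or median-of-three pivot is the usual fix.

    #include <stdio.h>

    static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    /* Lomuto partition with the last element as pivot: on sorted or
     * all-equal input every partition is maximally lopsided -> O(n^2). */
    static void quicksort(int *v, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = v[hi], i = lo;
        for (int j = lo; j < hi; j++)
            if (v[j] < pivot) swap(&v[i++], &v[j]);
        swap(&v[i], &v[hi]);
        quicksort(v, lo, i - 1);
        quicksort(v, i + 1, hi);
    }

    int main(void) {
        int v[] = {3, 1, 4, 1, 5, 9, 2, 6};
        quicksort(v, 0, 7);
        for (int i = 0; i < 8; i++) printf("%d ", v[i]);
        putchar('\n');
        return 0;
    }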


Because its quick, right?


It also sorts.


Also: It has a cool name. It's much better than 'slowsort'


Agreed.


First, it is definitely standard process to tell him what position he's being interviewed for (if they didn't, that's a definite failure). Again, remember you only have one side of the story here.

I like to try to gather facts before assuming things. I.e., ready, aim, fire, not fire, ready, aim.

Admittedly more difficult in this case (and certainly, I have no access to it).

Second, I'm going to point out a few things:

Experience may translate into wisdom; it may not. Plenty of companies promote people just because they last long enough. So 20 years of management experience may translate into a high-level manager; it may not!

I hold a bunch of patents too, on compilers and other things; it's not indicative of much in terms of skill, because almost anything is patentable.

Lastly, SRE is not an ordinary site maintenance position by any means. I'm not even sure where to begin to correct that. I guess I'd start here: https://landing.google.com/sre/interview/ben-treynor.html

Does this mean this person is under/overqualified/exactly right? I literally have no idea. I just don't think it's as obvious one way or the other.

"Well, that sounds like a dumb recruitment process."

Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and even 3 sentences I wrote on Hacker News, seems ... silly.

If you want to do it, okay.

But everyone in this entire thread seems to be making snap judgements without a lot of critical thinking. That makes me believe a lot of people here have a ton of pre-existing biases they are projecting onto this in one direction or the other (and you are, of course, welcome to claim I fall into this category too!).

I almost didn't jump into this discussion because it seems so polarized and rash compared to a lot of others.

I think I'm just going to leave it alone because it's not clear to me the discussion is going to get any more reasonable.


> SRE is not an ordinary site maintenance position by any means

Then why ask about the nitty-gritty details required of maintenance personnel as part of the screening process - things I would rather have my high-level employees looking up rather than relying on a possibly faulty memory?

> Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and 3 sentences i wrote on hacker news, seems ... silly.

This kind of opinion is not formed in a vacuum. It's formed of the dozens of posts that appear every year about how someone who seems qualified is turned down for spurious reasons like "being unable to reverse a binary tree on a whiteboard". It's what makes this particular post so believable - it fits the stereotype. Even your own developers who post here say "yeah, that's more accurate than inaccurate." Perhaps it wouldn't hurt to "undercover boss" your way through the interview process...

Speaking for myself, and only myself... I turn down all Google recruiters because I know I would not pass Google's interview process. Not because I don't have the skills, but because I don't have a college degree. Because I don't see the return on investment for studying for the next 6 weeks just to pass the interview process, especially when I won't even know if I'm getting a job I'll enjoy.

> I think i'm just going to leave it alone because it's not clear to me the discussion is going to get any more reasonable.

How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?


"Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process - things I would rather have my high level employees looking up rather than relying on a possibly faulty memory. "

This is one reason why I find it super-strange. It's not a set of "high level employee" questions. It's a standard SRE pre-screening.

"How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?"

My view of unreasonable is not about whether there is a problem or not. It's not about the consensus. I don't actually have an opinion myself on the hiring process. If the people I work with on recruiting raise problems, I try to solve them. I have not had trouble trying to recruit in general. So I haven't formed a strong opinion, even after 11 years. If folks want to decide the process is horrible, okay. If folks want to decide it's great, that's also okay.

But it's unreasonable because it's both a super-quick reaction without time to settle and think, and not aimed at anything other than trying to reinforce one view or the other.

Nobody is actually listening to each other, they are just trying to force whatever their view is, good or bad, on others.

So to answer you directly, I don't think pointing out a problem is unreasonable, but that's not my complaint. My complaint is that the actual discussion is not a discussion, but mostly people just arguing on the internet. I.e., you shouldn't take me saying "unreasonable" as a proxy for "me saying I think their viewpoint is wrong". I just think the mechanism of discussion here is unlikely to yield fruitful results.


> It's a standard SRE pre-screening.

To clarify, I was speaking of your standard SRE hires, whose position you referred to as "not maintenance drones".


I suspect for someone who has failed - rightfully or not - a recruitment exam in this manner, it may in fact be the only cathartic mechanism.


> Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process

I'm not sure what "nitty gritty details" you're talking about here.

As much as some people here think it's impressive knowledge[1] to be able to give the size of an Ethernet MAC address without Googling it, that's something that anyone with experience in computer networking ought to know. Not at all because it's useful knowledge, but simply because if you actually spend time looking at network traffic dumps or ARP tables or DHCP configuration or SLAAC assignments you'll be seeing MAC addresses so often that it just becomes obvious. Just like knowing that an IPv4 address is 4 bytes and an IPv6 address 16 bytes. Or that a TCP connection starts with a 3-way SYN/SYN-ACK/ACK handshake.

And the same thing applies to the other questions that look like meaningless details: knowing what an inode is and what syscall returns inode data for a path is something that someone with system-level C programming experience should know. stat(2) is far from being something obscure. Knowing what signal is sent by the kill(1) command is maybe slightly more on the trivia side IMO, but it's still a very well known fact.
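For reference, the stat(2) answer in miniature (standard POSIX; the printed fields are just a sample of what the inode carries):

    #include <stdio.h>
    #include <sys/stat.h>

    int main(int argc, char **argv) {
        struct stat st;
        if (argc < 2 || stat(argv[1], &st) != 0) {  /* lstat()/fstat() are the siblings */
            perror("stat");
            return 1;
        }
        printf("inode %llu  mode %o  links %lu  size %lld\n",
               (unsigned long long)st.st_ino, (unsigned int)st.st_mode,
               (unsigned long)st.st_nlink, (long long)st.st_size);
        return 0;
    }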

A candidate is most likely not expected to know the answer to all of these questions. But failing in all of the categories is IMO a fairly strong red flag for someone interviewing for SRE, where in general people are usually expected to be comfortable with at least one of {networking, system administration, Linux internals}. In fact, this domain specific knowledge is the biggest differentiator between "standard" SWE and SRE-SWE, even though the lines get blurrier and blurrier.

This also indirectly answers this:

> things I would rather have my high level employees looking up rather than relying on a possibly faulty memory

You would have to be out of touch with the field for quite a while to forget such basic things. Which is likely something that you want to test for in such interviews. To go with a metaphor: if you claim to be a fluent English speaker on your resume, you can't be excused of "faulty memory" if you forget how to conjugate "to be" in the present tense. It's not something you forget easily, and if you did forget you most likely can't say you're fluent anymore.

Disclaimer: I was an SRE at Google for 2.5 years, but I'm not familiar with the early phases of the recruiting process.

[1] https://news.ycombinator.com/item?id=12701486


So, as someone who went through the process and got through it (so is less inclined to hold a grudge):

> Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process - things I would rather have my high level employees looking up rather than relying on a possibly faulty memory.

AIUI you can easily get 5 or more of the pre-screen questions wrong and still proceed to the next stage, depending on your experience and how wrong you are. The point here is not that you know each and every one of those things, but to show that you are, in general, knowledgeable enough to be worth spending engineer hours on.

And your judgement of these questions is seriously impaired by the fact that they are written down wrong. I assume that the author of this post has written down a rough transcript from memory, and as such it's colored by their own (mis)understanding of the questions and whatever leaked from memory in the meantime. The questions he wrote down are, at the very least, not verbatim the ones from the checklist given to recruiters (and there is a strong emphasis on reading them out verbatim there, so I consider it relatively unlikely that the recruiter didn't do that).

> It's formed of the dozens of posts that appear every year about how someone who seems qualified is turned down for spurious reasons like "being unable to reverse a binary tree on a whiteboard". It's what makes this particular post so believable - it fits the stereotype.

Exactly. You are reading "dozens of posts every year" from disgruntled interviewees who got rejected and are pissed. On the flip side, a quick internet search will tell you that Google gets on the order of millions of applications each year, meaning you don't hear from >99.99% of applicants.

There is also the widely advertised fact that the Google hiring process accepts a high false-negative rate if that also means a very low false-positive rate, so it is to be expected that a good percentage of qualified applicants still get rejected. It is thus also to be expected that you hear from some of them. Meanwhile, again, you are not hearing from the thousands of qualified applicants that do get accepted each year. Because an "I interviewed at Google. It was pleasant, everyone was really nice and they got me a good offer" blog post won't draw a crowd on Hacker News, even if it was written.

> How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?

Let's not ignore the responses from Employees that don't think there is a problem.

From reading this post, I'd say a likely reason for the rejection is that this person wasn't being particularly pleasant. Frankly, he comes off as kind of an arrogant prick. And, as a general rule, engineers at Google, just like everyone else, don't particularly like having unpleasant people on their team. And I also believe this post has gotten enough upvotes that someone will look into the situation to see what actually went wrong here.


> they are written down wrong.

Please, feel free to correct the record, then, with the correct screening questions. The proverbial cat is out of the bag, and has gone tearing down the street towards everyone trying to make a buck by "training" hopeful young graduates on how to make it through the Google interview process.

> Because an "I interviewed at Google. It was pleasant, everyone was really nice and they got me a good offer" blog post won't draw a crowd on hacker news

No, it won't. Because it's the tech equivalent of a lottery winner saying they think the lottery system is a fair and equitable way to distribute money.

> Let's not ignore the responses from Employees that don't think there is a problem

Same problem. If you're in, you passed the Google employment lottery, so it's much more interesting (and should be more meaningful to management) when insiders agree that the hiring process has problems.

Now then, of course, so long as directors find that they have plenty of applicants to backfill attrition and grow, they have no reason to think the hiring process is broken, and Google stays happy hiring not necessarily the best people for the job but the ones lucky enough to dodge more false-negative flags than everyone else. Better to be lucky than good.

All that said, yeah, Google's hiring process works for Google. Coming here, to a conversation started by a crappy screening experience, and expecting respect for a process with so many false negatives is a bit optimistic, though.


> Please, feel free to correct the record, then, with the correct screening questions.

No can do. I actually like my job. And I also like my coworkers and don't want to make their life any harder.

> No, it won't. Because it's the tech equivalent of a lottery winner saying they think the lottery system is a fair and equitable way to distribute money.

The same goes for an "I interviewed at Google. It was pleasant, everyone was really nice, but sadly I didn't get accepted" post.

The fact remains that you don't hear from >99.99% of people. My interview process was very pleasant. I had a bunch of nice conversations about programming and computers with friendly and humorous people.

> Same problem. If you're in, you passed the Google employment lottery, so it's much more interesting (and should be more meaningful to management) when insiders agree that the hiring process has problems.

There are a lot of insiders. With a lot of opinions.

> so long as Google is happy hiring not necessarily the best people for the job, but the ones lucky enough to dodge more false negative flags than everyone else.

Well, the thinking here isn't really "we want strictly the best". That would be a hopeless goal from the get-go. The thinking is "there is a hiring bar that we want people to pass, and we want to hire exclusively from above it; we don't care how we sample from that group, as long as we stay above the bar". What they end up with is a pretty broad sample of that population. Some (like me, probably, tbh) just barely pass the bar, some are the very top. Some other top people unfortunately got rejected, as did some other barely-passing people.

So yes. There is indeed no ambition to actually get just the top 100K engineers in the world.

> All that said, yeah, Google's hiring process works for Google. Coming here, to a conversation started by a crappy screening experience, and expecting respect for a process with so many false negatives is a bit optimistic, though.

Well, mostly DannyBee and I are just pointing out obvious flaws in the discussion here. Like the obvious self-selection bias and selective reporting. And the also obvious fact that this particular post was written in anger and only represents one side of the story, and not even that accurately.

Secondarily, in these long-winded comment threads on reddit/hackernews/twitter, people seem to usually not even be aware of the goals of the hiring process and think "look, here, three prominent false negatives" is an actual argument that the process is flawed.


>Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview,

It's not just this guy. There have been others: https://twitter.com/mxcl/status/608682016205344768

There's another metric I use to judge the quality of their hiring process. The output. Namely, the track record of products Google has developed in house in the last 10 years.

I've also heard a few stories about friends applying for a position and being shunted by the hiring process into the funnel for other (plainly unsuitable) positions. When I hear a very specific criticism from two separate places, it's hard to stay skeptical.


Yep, those engineers they took on in the last ten years must suck, they've only managed to develop technologies that grew Google's annual revenue from 10 billion dollars in 2006 to 75 billion in 2015. That's the kind of track record that has to make you question the hiring process, right?


You seem to be confusing "I have a smug twitter-sized sound-bite response" for "I have a worthwhile counter-argument".

It's a common failing these days, but you should probably look into getting it fixed.

That said, yes, Google's hiring process is questionable. The Web is full of horror stories from obviously-qualified people who Google passed on, often very early in the process when no engineer had talked to them, and this suggests Google's success is not sustainable so long as that continues. They'll be able to hire fresh CS grads out of Stanford forever with this process, but the experienced/unconventional people they flunk out on the early screens are not going to come to them, and when their current crop of experienced/unconventional engineers retire or take jobs elsewhere, Google's finally going to have to fix this problem and stop pretending that it's better to pass on a thousand highly-qualified candidates than to give one unqualified candidate an on-site. That, or tumble back down into mediocrity.

(which, to be fair, is already mostly the case; Google is largely a mediocre company, with only a couple of externally visible bright spots of talent or innovation clustered in a couple of particular teams, and otherwise Google runs on inertia and the hope that the 0.1% of interesting stuff they come up with will keep the 99.9% of mediocrity afloat)


I can expand beyond 140 characters if you like. The OP claimed that in the past ten years, as a result of their hiring practices, Google's product output quality has noticeably declined, presumably as compared to the search product on which their name was made, and gmail, which they launched in 2004. And it's easy and fashionable to knock Google because maps is not as good as you remember it used to be, or because they shut down reader, or because plus didn't manage to unseat facebook.

Well, in 2006 Google was a 10 billion dollar search and ad company with a fledgling email business without a revenue model, which had just bought YouTube. In 2008 they shipped a mobile phone operating system. That's now a thirty billion dollar business which has been built up through talent within Google. They undermined Microsoft's office monopoly with an online office suite (okay, some acquisitions underpinning that). They have a credible seat at the top table in the cloud market. And they continued to develop their core ad platform to drive more revenue growth.

I've got no particular reason to stand up for Google, they're quite big enough to look after themselves, but the idea that their product flops in the last decade outweigh those product successes, and can be held up as evidence that there is something deeply rotten in their hiring model, seems to be cherrypicking to me. 70% mobile OS share, 70% search share, and 50% of global online ad revenue... that's a pretty good kind of mediocrity.


It's still the case that other than search and ads, most of Google's biggest hits were acquired rather than the result of in-house initiatives (even Google Analytics, which is probably one of their more heavily-relied-on products, was acquired). Google doesn't hire people who will create stuff like Android; they hire people who can pass their interview process, and get new product and service lines mostly through acquiring teams of people who probably can't pass their interviews.

It's also the case that Google is acquiring a reputation for bad interview/hiring processes, and for hiring people who have a Ph.D. in CS and putting them to work on CRUD web apps that any random coding-bootcamp grad could build, since there's just not enough interesting in-house work to keep all those top talents occupied.


Google internally-initiated successful products that come to mind: Cloud (2nd or 3rd in market, lots of revenue and growth), Play Store (also lots of revenue and growth), TPU chip, SDN, Photos, Chrome, ChromeOS.

Google (vs Alphabet) often acquires companies that have a seed of a useful product. Android for example was apparently not in a usable state when it was acquired. 99% of the creative work is making the thing actually work, not in having the prototype.

To say Google's own engineers didn't create Android because they didn't commit the very first line of code is doing them a disservice.


>I can expand beyond 140 characters if you like. The OP claimed that in the past ten years, as a result of their hiring practices, Google's product output quality has noticeably declined, presumably as compared to the search product on which their name was made, and gmail, which they launched in 2004. And it's easy and fashionable to knock Google because maps is not as good as you remember it used to be, or because they shut down reader, or because plus didn't manage to unseat facebook.

I don't necessarily blame them for plus (facebook was clearly a marketing success, not a technology success), but Maps' decline isn't anybody else's fault. It has declined in quality, and that is plainly an engineering failure, not a product failure.

>Well, in 2006 Google was a 10 billion dollar search and ad company with a fledgeling email business without a revenue model, who had just bought youtube. In 2008 they shipped a mobile phone operating system. That's now a thirty billion dollar business which has been built up through talent within google. They undermined Microsoft's office monopoly with an online office suite (okay, some acquisitions underpinning that).

Well, yes. Acquisitions underpinned all of that success.

>I've got no particular reason to stand up for Google, they're quite big enough to look after themselves, but the idea that their product flops in the last decade outweigh those product successes, and can be held up as evidence that there is something deeply rotten in their hiring model, seems to be cherrypicking to me. 70% mobile OS share, 70% search share, and 50% of global online ad revenue... that's a pretty good kind of mediocrity.

All predicated upon outside purchases or the original self-reinforcing search monopoly developed before 2004.

What's worse is that they've often used their search monopoly to try to break into other markets (flights, shopping, etc. - plenty of stuff like this got preferential SERPs treatment) and failed because what they released was crap. That is, they failed even with a huge home ground advantage - the kind of monopoly advantage that let Microsoft make IE6 (IE6!) the industry standard for years and got them slapped by the DoJ couldn't even be put to good use by Google.

I'm not denying that they have some good engineers but the idea that they're the creme de la creme of the industry with the best hiring process is way way off base.


There are a lot of assumptions being made here. Sometimes companies grow despite poor hiring decisions. I think you need a finer-grained view than just revenue to really tell whether you're doing a good job or not. Lots of terrible decisions have been justified by this "the revenue went up so we must be doing a good job" line of reasoning.


And Comcast has some of the best customer service and engineering because they don't seem to be losing any customers.

Right?


> There's another measure I use to measure the quality of their hiring process. The output. Namely the track record of products Google has developed in house in the last 10 years.

That's a poor metric to evaluate the rampant complaints about a high false negative rate. I don't think that many people are disputing that the people who do get hired are qualified most of the time.


When the in-house engineers come out with products like Wave and Glass while things like Maps and Android are purchased, you have to wonder.


Psst: the Rasmussen brothers were behind both Maps and Wave.

https://en.wikipedia.org/wiki/Lars_Rasmussen_(software_devel...


I think you're neglecting the continuous improvement of successful projects, which take quite a bit of engineering effort.

Was it software quality that killed Wave and Glass, or was it more of the market not wanting either of those things? (To digress, it seems like both of those products came too early. Do you think that wearable computers will _never_ exist? And Slack seems to be the Wave-like thing that the market wanted.)


Funny you should mention that. I was just using maps and thinking "this is worse than it used to be".

From what I've heard from insiders, the AdWords code base is an enormous mess. Not surprising for a product that old perhaps, but this points to their engineering practices being about as mediocre as the industry average.

I don't honestly know why people want slack. It seems to just be in vogue - one of those weird network effect things. It doesn't seem to have anything to do with their feature-set or engineering quality because it's not noticeably better than, say, hipchat.

>To digress, it seems like both of those products came too early. Do you think that wearable computers will _never_ exist?

They already exist.


Slack is in no way like Wave. Now you're just overreaching with your comparisons. Wave's flaw was showing you what the other person was typing as they were typing it. You try to separate quality from functionality and pin the failure on the market not wanting Wave's functionality, but those aren't mutually exclusive. Wave's quality was egregious.


Not sure what you're saying here. Wave was great technically; the market fit just wasn't there.


Why is it a poor metric? Isn't the point of hiring employees to ideally build and launch successful products?

I think Google is pretty good at hiring "qualified" engineers who are very good at maintaining and scaling existing systems, but the process definitely selects against entrepreneurial product-focused engineers. Maybe Google thinks that's fine though: they can always pick them up through an acquisition later, albeit at 100x the price.


Google has never made it that clear what position I was interviewing for (and definitely not what team/role) when I interviewed with them. This was sort of pitched as a selling point, since after being hired you'd float around and find the niche eventually?


When was this? This was the case when i started (~2006), but it definitely changed and is not the case anymore.


Probably late 2000s when I was last on site. Google bugs me every year (most recently a week or two ago), but I don't usually push on the process.


Interesting. I could look up the date it changed, but it definitely changed because folks didn't like the old way :P.

Now, instead, they generally don't recruit (google is too large to not have exceptions) without some specific hiring managers and headcount in mind.

They will tell you what those groups are and what they do. So for example, the person i interviewed last week was targeted at two teams. I actually specifically asked if he knew what he was being interviewed for, because i like to get some idea what the candidate thinks whatever job they are interviewing for means, and he was able to tell me the two groups and knew what they did.


I interviewed at Google in March 2014 and was given an offer. I wasn't interviewing for a specific team. After the in-person interviews my recruiter set me up with 2 different team managers to talk to about potentially joining their team. I wasn't interested in either team, and my recruiter said "That's ok, we'll find a place for you," and a few days later found a new manager for me to chat with. I joined their team.

I did know I was interviewing for a general SWE role, but not anything more than that, and from all appearances the team was completely up in the air until after my interviews.

I don't know how much has changed since 2014. I also didn't get any of these pre-screen testing questions from a non-engineer. Is that normal practice for all interviews now?


FWIW, I got told what I was going to work on on my first day, by my new manager, when they picked me up for lunch. Before that, I didn't even know the PA. From what I can tell, that is standard practice for SREs, as SRE is very understaffed, so there is a lot of arguments and back-and-forth around where people are most needed.


>> as SRE is very understaffed

In all likelihood that is due to the flaws in the process. I know quite a few people, whom I highly respect and who IMHO are better than the people I know who work at Google, who flunked the process.


> Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and even 3 sentences i wrote on hacker news, seems ... silly.

How about the dozens of other seemingly qualified people who have complained about the google process?


"How about the dozens of other seemingly qualified people who have complained about the google process?"

And what's the other side of that? IE the literally tens of thousands to hundreds of thousands who haven't?

Again, i'm not saying there is no problem, i'm just saying this is probably not a great mechanism to evaluate whether there is a problem or not.

If you want actual usable data, this wouldn't be the way to get it, good or bad.


>First, it is definitely standard process to tell him (if they didn't, that's a definite failure). Again, remember you only have one side of the story here.

"Standard process" is what actually happens in the real world. Alas, standard process is to not tell him.

>But everyone in this entire thread seems to be making snap judgements without a lot of critical thinking. That makes me believe a lot of people here have a ton of pre-existing biases they are projecting onto this in one direction or the other (and you are, of course, welcome to claim i fall into this category too!)

Your story is also just one side of the story - actually, you weren't even involved, so it's neither side. Still, you spend all your effort on saying why, for example, this guy's patents mean nothing and he's likely incompetent. I'd call that snap judgement, lack of critical thinking, and biased conjecture.


> "Standard process" is what actually happens in the real world. Alas, standard process is to not tell him.

Inferring what's standard from a sample size of 1 (which is ~0.0001%) is very questionable.

> Still, you spend all your effort on saying why for example this guy's patents mean nothing and he's likely incompetent.

That is not at all what they were saying. They were saying that patents aren't conclusive evidence of competency.


No, the policy/process DannyBee references is fiction. What's standard is what happens in reality. I'm clearly not talking about statistics.

For your second point, DannyBee focuses his efforts on discrediting this seemingly exceptionally qualified candidate, never yielding an inch from his position that Google is exceptional and can make no mistakes.


> What's standard is what happens in reality.

Inferring what is "reality" from a sample size of ~0.0001% is clearly ridiculous. By that logic, it would be "standard" to be born a conjoined twin. Actually, it would be 10x as likely as what "standard" is.

> I'm clearly not talking about statistics.

You might benefit from doing so, though. It might help you realize what nonsense you are saying.

> DannyBee focuses his efforts on discrediting this seemingly exceptionally qualified candidate

No, this is factually incorrect. Repeating something factually incorrect doesn't make it more correct.

> never yielding an inch from his position that Google is exceptional and can make no mistakes.

You either can't or won't read. They very clearly acknowledged the possibility of a mistake several times in each post they made.


It seems to me the parent's answer only reflects the general attitude at Google: they don't question anything they do, they don't do "customer support", and they don't display humility.

Yes, I'm not expecting the conversation to have been exactly that, but it shows problems regardless.


Google questions a lot of what it does. It's made up of lots of engineers and others that are on this site and care deeply about the fields we are in. We are always questioning decisions made and try to use data as best as we can to back up those decisions.

As for customer support, it depends on the product you are talking about. Your free gmail account or $5 purchases through the play store: don't expect a lot of support here (but there is some). If you are using Google Cloud, Apps, AdWords, or other products where you pay, you can expect to get amazing support (this will change with your spend level). For example: on the cloud side, you can pay for support contracts that get you lots of 1-on-1 time with Google support staff to help you use the services[0]. Or with the new Pixel, there is on-phone support[1].

[0] https://cloud.google.com/support/

[1] https://madeby.google.com/phone/support/


I am aware of these support channels, but there are a lot of stories of paying customers getting stonewalled. Not to mention cases where non-paying customers or content producers get simply kicked out without recourse - though sometimes Google (and others) are right to act in a certain way


I used to have Project Fi; their customer support was quick and helpful, even for complicated things like when the porting out of my number got stuck (not their fault, it was the other carrier).


Listen, you have been hired by the greatest software company on the planet, you survived a ridiculous recruitment process with multiple pointless whiteboard interviews, CLEARLY you are special. How dare those unworthy peons slander the name of your company? They aren't qualified; you are. "Customer support"? You are not being paid >200K to sit 8 hours in a chat telling people to turn it off and on again.


This to me looks like an initial phone screening interview. It's not actually a "technical" interview (there is no code to write, and the person that interviews you is a technical recruiter and not an actual engineer). As far as I know (I might be wrong, so take this with a grain of salt) your first screening interview is usually used to decide in which direction you want to proceed (for example whether you want to be hired for a SWE-SRE or SE-SRE position). It's not far fetched to think that they were just applying some standard questions without having an actual clear position in mind yet.

I also agree with the grandparent, I'd be very sceptical about this transcript being 100% accurate.


This.

I passed several rounds of interviews at Google over a number of years (phone screening, phone interview, on-site). This is definitely a phone screening, where the recruiter expects "standard" answers to "standard" questions. Remember that interviews are somewhat of a game. Trying to be smart at this stage is the wrong move.


> This is definitely a phone screening, where the recruiter expects "standard" answers to "standard" questions.

I went through a Google phone screen once. (For full disclosure, I've interviewed on-site twice and failed that both times.)

One problem posed on the phone screen involved finding the last 1 in an infinite array consisting of a finite number of 1s followed by an infinite number of 0s. I described the search strategy "check index 0/1/2, then progressively square the index until a 0 is found, then use binary search to find the first 0". The screener objected to that strategy on the grounds that successive squaring "grew too fast" and successively doubling the index would be faster overall.

Once the call concluded, I looked into it and determined that those two strategies are almost exactly equivalent. This didn't leave me impressed with the phone screen process.
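
For the curious, a minimal sketch of the two strategies in C, with a hypothetical probe() standing in for the infinite array (the last_one variable is only there to fake it). Both sketches skip the candidate's "check index 0/1/2 first" step, so they assume the first couple of entries are 1s, and they ignore overflow. The point is just that squaring reaches a 0 in fewer probes, but leaves the binary search a much wider bracket, so the total probe counts come out roughly comparable (both O(log n)):

    #include <stdio.h>

    /* Hypothetical stand-in for the infinite array: 1s up to index
     * last_one, 0s forever after. In the real question you only get
     * the probe, not last_one. */
    static long long last_one = 123456;
    static int probe(long long i) { return i <= last_one; }

    /* Binary search for the last 1, given probe(lo)==1 and probe(hi)==0. */
    static long long bisect(long long lo, long long hi) {
        while (hi - lo > 1) {
            long long mid = lo + (hi - lo) / 2;
            if (probe(mid)) lo = mid; else hi = mid;
        }
        return lo;
    }

    /* Doubling: O(log n) probes to bracket the boundary, then the
     * bracket (hi/2, hi) costs another O(log n) binary-search probes. */
    static long long find_last_one_doubling(void) {
        long long hi = 1;                      /* assumes index 0 holds a 1 */
        while (probe(hi)) hi *= 2;
        return bisect(hi / 2, hi);
    }

    /* Squaring: only O(log log n) probes to bracket the boundary, but the
     * bracket (prev, prev*prev) is much wider, so the binary search is
     * still O(log n). Same order of total work either way. */
    static long long find_last_one_squaring(void) {
        long long prev = 1, hi = 2;            /* assumes indices 0 and 1 hold 1s */
        while (probe(hi)) { prev = hi; hi *= hi; }
        return bisect(prev, hi);
    }

    int main(void) {
        printf("%lld %lld\n", find_last_one_doubling(), find_last_one_squaring());
        return 0;
    }

For last_one around 10^5, doubling spends roughly 17 bracketing probes plus 16 bisection probes, and squaring spends roughly 6 bracketing probes plus 32 bisection probes, which is the "almost exactly equivalent" observation above.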

Then again, I apparently passed the screen despite making that "mistake". Still, I think the least courtesy you can extend to interviewees is to not correct them when they're right and you're wrong. :/


> ordinary site maintenance position

Seems you don't know much what Google's SRE job is about.


They really don't. I have been called by one of their headhunters recently who explicitly refused to tell me what sort of position I would be interviewing for. He told me that:

"That is not how we work. We will evaluate your abilities and then, if you pass, offer you a position on a team that we deem best fitting your skills."

Needless to say I have thanked him for his time and declined. I am not going to fly to another country to be grilled with stupid coding interviews only to be offered an entry level job on a team I am not interested in.

Another such thing was an invitation from Amazon's HR for an "accelerated testing session" where I was expected to go for a full day of coding tests (together with many others) and then they would pick who they invite for real interviews later, where you may learn what sort of position they might offer you. Again, no idea what position/job you are interviewing for, and a wasted entire vacation day - for their convenience. A system clearly targeting 20-somethings straight out of school. No, thanks.

The questions from the original article are familiar - but it is often external staffing agencies doing these pre-screens today. Google used to do it in-house with actually technically very competent HR staffers (I did a few phone calls with someone in their California HQ back in 2002ish), but now when I get contacted by them every now and then it is always an external headhunter.

The staffing agencies' employees tend to be very technically incompetent. Basically, they often have no idea whatsoever about the technical requirements of the position they are trying to fill. They only match keywords in the CVs in their database (often LinkedIn profiles, etc.) against the keywords in the job description, then they spam everyone that matches with an excited mail about having a "perfect match job". The matches are usually on completely generic stuff like "C++" or "Python" that everyone has on their CV, so in most cases the "dream job" is anything but - in a field the person knows nothing about or is not interested in.

I have been literally hounded for weeks by a headhunter once for a position that I had zero qualification for (Windows/.NET stuff - I was mostly a Unix guy back then). It finally turned out that she wanted me only because I spoke/understood the Czech language. And she fully expected me to move to a "sweatshop" that company had in the Czech Republic, trying to do a job I knew nothing about and paying less money than I was making as a teaching assistant at a university at the time. Some people are just nuts.

The phone screens are the same story - the headhunter has a script provided by their client with a bunch of keywords they are looking for in the answers. They are basically playing bingo with the candidate's answers, ticking off the "correct" keywords. Don't expect them to actually understand what they are asking. They can't - this week they are recruiting a Google engineer and next week they would be trying to fill a civil engineering position and a week later perhaps a chemistry lab technician.

I believe this is exactly what happened here. I have been in a similar situation before myself (not with Google). The hiring managers are complaining about how hard it is to hire talent, but why are they then wasting everyone's time with incompetent HR agencies, pointless phone screens that filter out even good candidates, and stupid coding tests? Ask for references (I will be happy to provide them), ask to see some code at the interview, check my public code (GitHub for example), hire for a trial period. But give me a break with this ridiculous testing/screening nonsense. Nobody else except software engineers seems to have to put up with this type of crap.


Both sound like IT Hunger Games ;-)


This blog is exactly what an SRE interview is like.

I breezed through these kinds of questions with the recruiter since I'm younger and have a fairly fresh CS background.

Then, my first SRE staff interviewer primarily asked how I would build a data center on the moon. I work on the FreeBSD kernel and TCP full time. I know what BDP, window sizing, head-of-line blocking, etc. are, way beyond what a typical SRE would, and how the communication latency would cause major issues. That confused the questioner. I can't think of anything else I'd have said wrong; my background is systems engineering and I know more about power distribution, HVAC, and data center design than I care to. The lady was skeptical of my answers, and it felt really humiliating, even though I would rate myself more knowledgeable than my questioner, because of the candidate/interviewer positioning and the failure.
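
(To make the latency point concrete, here's the kind of back-of-envelope number involved; the 1 Gbit/s link speed below is just an assumed figure for illustration.)

    #include <stdio.h>

    int main(void) {
        /* Rough Earth-Moon figures; the link speed is an assumed example value. */
        double distance_km = 384400.0;     /* average Earth-Moon distance */
        double c_km_per_s  = 299792.458;   /* speed of light in vacuum */
        double link_bits_s = 1e9;          /* assume a 1 Gbit/s link */

        double one_way_s = distance_km / c_km_per_s;   /* ~1.28 s */
        double rtt_s     = 2.0 * one_way_s;            /* ~2.56 s */
        double bdp_bytes = link_bits_s * rtt_s / 8.0;  /* bandwidth-delay product */

        printf("one-way latency: %.2f s, RTT: %.2f s\n", one_way_s, rtt_s);
        printf("BDP at 1 Gbit/s: ~%.0f MB in flight\n", bdp_bytes / 1e6);
        /* ~320 MB of unacknowledged data just to keep the pipe full,
         * far beyond default TCP window sizes. */
        return 0;
    }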

The next man, on another day, asked me a bunch of math trivia, like estimating the angle between the hands of a clock and making order-of-magnitude guesses of how many small items like marbles would fill a room. I told him I was no longer interested in working for Google and he was really startled because "he didn't get to ask me systems questions yet". Well, good luck with that.

Everyone was really sad at that point, including the recruiter. Nobody from Google has contacted me again, which is a relief. I found the entire process gross.


> even though I would rate myself more knowledgeable than my questioner

Don't get me wrong, I'm guessing you know your stuff, but you also strike me as someone who may likely have failed on culture fit down the line. Interviewers are often more sensitive to attitude than they ever are to aptitude, and for good reason: your HVAC knowledge may be irrelevant once you discover the custom designs in use behind closed doors, and a bad attitude toward learning where you're weak can be a much more fatal problem for a new hire.


At the same time, it's a useful filter for the candidate.

Last year when I was job hunting I kept getting fizzbuzz-style phone screens, even from companies who'd specifically contacted me because they knew who I was and what my skills/experience were, because they have to be sure to filter out those unqualified core committers of software they use on a daily basis.

Anyway, I got asked the "write a palindrome checker" question multiple times in those screens. I guess more companies than I thought have a Department of Palindrome Quality Assurance these days. But after about the fifth time, I just started going overboard on the question to make a clear point to the interviewers. I got a pretty good patter down where I could write out the code while describing all the random quirks phone screeners have never heard of: I'd start lecturing about combining characters, right-to-left directional shifts, the tradeoffs of considering solely Unicode code points versus graphemes, using the character database to identify categories of characters to ignore when considering whether a string is palindromic, etc.

Interviewers who took that poorly did not get my further cooperation. Interviewers who took it well (by being positive/polite about it, or even admitting that yes, this kind of phone screen is a waste of everyone's time when you already know you're interviewing someone who can code) got to talk a bit more. But I ended up accepting an offer from a place that actually worked to make their interview process better than this, and which continues to evolve it all the time.


In the time since that Google interview, I've moved into management and have built a very high performance team recognized as such by peers and the executive team.

For internal hires, I convinced people to come work for me that I had immense respect for by using casual conversation and pitching the idea and vision for a new operating systems team.

I've found this is also a classy way to hire external people. I've since hired people off freebsd-jobs@ mailing list and twitter by being upfront about the good and bad of working at this company. No trick questions, just a conversation about what we like to work on. This was easy because I had an idea of what they have accomplished by their commit logs.

Most recently I hired two women, masters students, for summer internships. This was very different because I had no idea what the candidates had done as coursework or projects beyond a simple resume. I again used casual conversation, no trick questions. I posed some real world situations, passively seeing if they understood concepts like deadlock, manual memory management, indirection, and had very good working CS/OS vocabulary. This eliminated most of the other candidates, and it was pretty clear who had slogged through their OS and networking classes without passion. I let each person tell me about projects they worked on which really excited them. One had done Linux USB driver on her own time, among other interesting things. The other had implemented a scheduler and file system on a teaching OS as part of her course work. Both worked out phenomenally and both have patches in the FreeBSD.org source tree from the 2 month internship experience. I am very proud of this, and of my team for mentoring them so successfully.

The people I hired were often confused; "That's it?" at the end of the phone or in person interview. They thought they had done something wrong because they are so used to being sweated for the sake of being sweated.

I am now convinced this is the only ethical way to build teams and hire -- start with some seasoned vets then grow new talent while refining and reinforcing shared values.

I don't really see what the stereotypical SV tech interview accomplishes. Blind leading the blind. Leadership is piss poor in this industry.


Wow.. I would like to work for you.


That may well be the case. The reason I quit the interview is because I evaluate a potential employer while they are evaluating me. I really love when someone asks "do you have any questions for me?" and I take full advantage to try and get candid knowledge of a team and, more importantly, its leadership. I determined Google would not be a good fit for me.


Disclaimer: I also work for Google, opinions are my own, etc etc.

> "i always take "transcripts" of interviews (or anything else) with a grain of salt"

I mean sure, a single instance of this might be overblown, exaggerated, or false in some way.

But there is an avalanche of reports like this, to the point where it's become widespread industry insider knowledge.

I enjoy working here, but the interviewing practices are such that I actively warn friends applying/being referred to temper their expectations of a repeatable/reliable process.

Most colleagues I've spoken to about this, including myself, have strong doubts we would have made the cut if we interviewed again - even though all are strong engineers with great perf records.

At what point do we start taking reports like these seriously? We don't have to accept every detail of the reporting as gospel, but there's clearly something here.


The problem with Google's interview methods is that they all select for a very specific type of programmer: heavily math oriented, deep knowledge of obscure Computer Science theory, but not one test on knowledge of languages, architecture, design, or actual real-world problem solving. I walked into an interview with one guy and he literally did not even say hello: he just jumped straight into some problem I had to solve on the whiteboard.

The problem with that approach is you end up with a very homogenous team of really smart, logical people, but without the balance of more creative, empathic types. Ideally, a well-functioning team will have both, and will have people from many different backgrounds and educations, because that's when you get true collaboration and innovation: by mixing unrelated disciplines.


(Standard disclaimer, speaking for myself)

1. Your interviewer didn't give you a good interview or follow guidelines. In interview training they tell you the first thing you must do to start an interview is to ask if the candidate would like to get some water / use the restroom, then break the ice before starting any questions (applicable also during phone screens).

2. Proper interviews actually are supposed to lean heavily toward a real-world problem-solving approach rather than arcane knowledge. For example, when I interview I look for rational decisions at every turn (e.g., considering boundary cases; adding a new example to help you visualize the solution should give information gain rather than be something random). My questions are not math oriented, nor do they require deep knowledge of obscure theory. Based on what questions my coworkers ask, I know at least for my team this is not a correct characterization.

What we do test for: understanding of fundamental data structures and algorithms, ability to thrive in uncertainty (ask clarifying questions! state your assumptions!), ability to break a problem down and solve it from first principles.

Good interview questions are required to have multiple solutions.

And then you have the generalization at the end about creativity and diversity; in my limited experience we seem to get pretty decent diversity and even if there is some homogeneity (we need more women and minorities) it's certainly not the kind you described. No, it's not a bunch of mathy theory wizards writing code at Google, it's way more diverse than that. Not perfect, but not awful like you're describing.


> "The problem with Google's interview methods is that they all select for a very specific type of programmer: heavily math oriented, deep knowledge of obscure Computer Science theory"

I'd be marginally okay with it if the interviews actually selected for this sort of engineer! I've seen multiple people who fit this description to a T who flunked the process, hard.

If the goal here is "pick the hyper-mathy, deep-CS types out of the crowd" I'd argue the process isn't even very good at that.


Having a high degree of false negatives doesn't mean the positive signal isn't reliable.


Off-topic pet peeve but why is "OK" now apparently spelled "okay" these days? (especially in bandwidth-limited situations such as SMS or IM). OK is not short for "Okay", OK?


"These days"? It's been spelled that way for nearly 100 years.

> Spelled out as okeh, 1919, by Woodrow Wilson, on assumption that it represented Choctaw okeh "it is so" (a theory which lacks historical documentation); this was ousted quickly by okay after the appearance of that form in 1929.

http://www.etymonline.com/index.php?search=okay


I think "okay" looks better than "OK".


But it is ...wrong. OK originated as an abbreviation so why spell out the pronunciation of the letters? Makes no sense to me.


Because the connection to a 180-year-old fad for misspelling "all correct" is so obscure and non-obvious that it was lost long ago? Do you think we should be capitalizing LASER, too?


It has always been an alternative spelling.


Not so; as you can see in my other comment's link, we can cite OK about 90 years before we can cite "okay", and more reliably than that we can cite the alternative spelling "okeh" to 1919, which establishes pretty well that "okay" was not standard then.


By talking about this you've wasted all the bandwidth your two bytes would have saved in a year.

As far as I understand, this has been common vernacular since before my lifetime. I'm not usually one to welcome evolution of the base language, but this one predates our lives; we need to let it be.


Google's interview process, and interview processes modeled on it, do not select for "math-oriented CS-conscious" engineers.

These processes select for recent CS graduates from a handful of universities where Google expends recruiting effort, and anyone not from that background mostly only gets in by blind luck or by knowing someone already in Google who can navigate them through getting hired there.


> but not one test on knowledge of languages, architecture, design, or actual real-world problem solving

Maybe they fail to do so but I do believe the goal is to test real world problem solving. However, I think they stray from specific language or domain knowledge because they want you to be able to work in different roles, since you don't have to interview again to switch teams.

From what I've read, the idea is to hire people who would be smart enough to learn any specific domain knowledge necessary, because the expectation is engineers might have to tackle problems they wouldn't have seen elsewhere. I don't really know whether that's true anymore, as my impression is that now Google just has a bunch of overqualified people.

> The problem with that approach is you end up with a very homogenous team of really smart, logical people, but without the balance of more creative, empathic types. Ideally, a well-functioning team will have both, and will have people from many different backgrounds and educations, because that's when you get true collaboration and innovation: by mixing unrelated disciplines.

Can't disagree with you there, but it's a weird assumption to say that people who are logical are not creative or empathetic. I do think that they hire for "Googliness", whatever that means, which may lead to a monoculture though.

In any case, I guess you can call me a Google fanboy. I don't agree with everything they do, but I feel like bashing Google's (or most other companies') interview process is the cool thing to do here, yet most people don't seem to have tried to understand why it is the way it is, and thus don't offer any true alternatives that meet the same goals, nor do they reject the goals in the first place.


This explains why Google is incapable of attacking any product that requires an understanding of humanity to be successful rather than just raw data. (e.g., google+, youtube comments, etc)


These are just a lack of good product management, nothing to do with the engineers. I do suspect Google's PM culture is... lacking.


YouTube is pretty successful (even if they acquired it).


It tells you more about the interviewer than the interviewee. They don't ask the questions you mentioned because they don't know them, or don't know how to best judge the answers. They ask the questions they know well, which shows you the breadth of their knowledge. It's like the saying: A-players hire A-players, while B-players have a hard time judging A-players.


B-players know an A-player when they see one; they don't hire them because they feel threatened.


I got pinged by Google about a year ago (after being narrowly rejected 9 years ago) asking if I'd be interested in re-applying. I said, "Sure, why not?".

I was immediately asked which department I wanted to join and why. I said, "Err, not sure, how about SRE?". To be told, "Oh, well that's not my area, let me ask them."

Shortly after that I got a curt message saying "Thank you for applying to Google. We have no vacancies that would suit you right now, thanks for applying, goodbye."

Somewhat bemused by the whole process (you contacted me, dude!), I went about my day.


LOL. That's happened to me before too. And I've been on the other side, where I reach out to people to ask if they'd be interested in a role on my team.

However, I immediately tell them the role that I have, to avoid the whole, "We have no vacancies that would suit you right now" answer. Seems like their recruiters should have done that up front, to save you and themselves some time.


(Same disclaimer)

If you interview frequently, at least for SWE, this is certainly not how we go about things. ghire guidelines for SWE don't allow for questions like this, or behavior like this.

Is it possible this was an SRE interview? I guess, but it really sounds ungoogly and these questions sound like they don't give great signal. I'd be ashamed if this is how we hire SREs.

Is there really an "avalanche" of reports like this? Most negative reports I hear have to do with our SWE questions which tend to be difficult.


You should be ashamed then, because these are definitely the questions used on SRE phone screens.


The questions are fine for SRE. The problem is the behaviour and the expectations of the interviewer.


The people doing these interviews are non-technical people who read off of a cheatsheet. The cheatsheet covers alternative answers, but a situation like the OP describes can never end well.


If that's true, making non-technical people conduct technical interviews is also a pretty big failure.


Thanks, I am realizing now that it was SRE. All I can say is I'm definitely a fan of how we interview SWEs and I'm sorta bummed this is how SRE interviews go. TIL.


I got these questions when being interviewed for SWE and SRE. My answers suggested to the interviewer that I should go down the SRE route. I passed on it in the end, because I felt the culture regarded everything as a technical problem (which has worked out for them so far) and Google was a company where you did things their way and didn't rock the boat.


Are SWE and SRE interview the same? I thought they were different enough job descriptions that it would require different interview questions.


They are different, but a lot of the skills are similar. SREs need to problem-solve, and while they may need to have more domain-specific knowledge, I'm not sure a facts quiz administered by someone non-technical is the best way to test for that.


I got rejected just this month and I can certify that there was no crazy bullshit in the process. I mean, I feel like you made a mistake rejecting me, but I also can imagine a valid process behind the scenes which would reject me based on my "ok but not great" performance. I do hear anecdotes from people I trust which sound crappy (being asked very specific technical questions on subjects that candidate doesn't have experience on and not being flexible about it, being rude etc).


Google turned me down a couple times before I got hired. The first two were definite mistakes (false negatives) where I should have been hired. Just treat it like a process to be optimized. You can reapply every 18 months.


> At what point do we start taking reports like these seriously

My guess is when the number of applications per position actually drops far enough that the false negative rate starts to hurt.

Until then, an interview process optimized for avoiding false positives at all costs will persist. Totally makes sense for a company worth hundreds of billions though, can you imagine if they had a few more bad hires sneak in? Oh my god, it would destroy everything.


In my (albeit anecdotal) experience, incompetence is the norm rather than an outlier in BigCo SV land.

Frankly, if you are more qualified for a position, chances are you will be rejected because your interviewers will fear for their own job security.

I've always found that type of logic strange, though. Wouldn't you want someone who was better than you currently are on your team? Wouldn't you be able to learn from them?


That only applies if you want to learn. If you just want to coast and be "the best" at something at your company, you don't look for people better than you. At best you look for people that are better at the tech than you, but who are passive or easily browbeaten, so you can claim their work as your own.


That depends on if the interview process actually does protect from bad hires.


It happens all the time that Google acquires a new company; those employees aren't going through these crazy interview processes.


Actually, acquisitions normally trigger full interviews - see "Chaos Monkeys" for a description of how this happened at Twitter/Facebook.


Is this a US thing?

I went through three acquisitions and never had any interview, besides the set of meetings for each of us to decide to go along with the acquisition or get a severance package.


I think it all depends on the requirements of the company doing the acquisition. I've gone through several myself and did not have to re-interview; however, I have heard of employees claiming Google required them to re-interview, which has got to be nerve-wracking. Considering Google's heavy focus on avoiding false positives, a lot of good people don't pass.


I was asked similar questions for an SRE position at Google a few years back, in fact, so I found it interesting and it does not surprise me.

I eventually refused the position without going on-site just because of how ridiculous the questions/replies were (and frankly, because I had another good offer elsewhere, but it did contribute greatly).

While my experience wasn't as bad by a long stretch, I can see how this is plausible. In particular, I immediately figured out that the recruiter wasn't very technically inclined, had a "here's a list of correct responses" spreadsheet to help him, and had very little time to waste.

Because I took that into account, I was always accommodating instead of confrontational (which got me more interviews, which were better/with real engineers, yay - though not great either). Had I been confrontational, pointing out mistakes and misunderstandings, I'm sure it'd have gone pretty badly.


I was asked pretty much the same questions for an SRE position at Google. Note that I only found the recruiter phone screen to be this kind of 'pop quiz'. The engineering interviews were more detailed discussions with engineers.


About 8 years ago I had the same questions.


Mine was 2 months ago. I'm surprised how stale these questions are.


I can confirm that a recruiter contacted me and asked pretty much these exact same questions when trying to recruit me for an SRE job.

My recruiter was reading these questions off of a sheet of paper, but when we had discrepancies in our answers and she would say something like "It says here an inode holds metadata", and I would respond with something like "oh, metadata and attributes are the same thing", she would say "oh, well you are correct then!"

I made it to the first phone interview but that's where the path ended. I was bummed for a little while but then remembered I prefer small businesses anyway :)


On one hand you say that these are "bog-standard" SRE questions, and on the other you say it's "super strange".

What exactly is super strange? That a non-technical recruiter asked the questions? If that's not the strange part, then surely it's believable that the recruiter would not recognize some of the subtleties involved?

That said, if this guy is the creator of GWAN, then it's entirely possible that his personality rubbed someone the wrong way and he was nixed for "personality reasons" in the only way they could.


It's super strange that the recruiter would have no understanding that alternate answers are possible, and would end the call abruptly claiming the candidate didn't know their fundamentals.

I've done this kind of phone screen for an entry level position at Google, and while the recruiter wasn't an engineer, they did have some basic knowledge of the concepts involved, and were able to prompt me with follow-up questions if I missed something or got a question half right. The questions themselves are not strange, it's the alleged attitude of the recruiter.


"What exactly is super strange? "

Sure. For starters:

1. This guy apparently did not know he was interviewing for an SRE position.

2. The recruiter was looking for very very specific answers and immediately rejected any others.

3. There was no other discussion of anything, at all.


This is exactly why I think that transcript is a bogus, one-sided take from someone who's dejected and hurt by the fact that they weren't chosen. It reeks of someone who thinks they were smarter than the interviewer.


The questions asked don't strike me as strange, but the corrected answers/explanations by the recruiter are very strange indeed.


This was very similar to my first experience interviewing for an SRE role at Google. After about 20 minutes I got tired of arguing with a clearly non-technical recruiter and politely excused myself.

My second interview, well, that was a whole different bucket of problems.


Yeah. I'll never interview with Google again. I've got friends who work in (mostly nontechnical) roles there who had very different experiences, but my interview was such a hot mess of disorganization, cluelessness and arrogance that I ended it early and told them I had no interest in the role or the company.

Two years on, I think I made the right call.


Yeah, I had a somewhat similar experience. My first technical interviewer was 15 minutes late (so it was now a 30-minute interview). Then, after being asked the typical slew of questions (what is a hash table, etc.) I was asked to implement a basic data structure (a Set). Which was easy enough, but my interviewer wouldn't let me finish writing up my implementation and, instead, insisted I focus on optimizing a particular custom method he asked me to implement. I protested (premature optimization, etc.) but ultimately went along with it. I finished optimizing but didn't get a chance to finish my Set by the time the interview was over.

I got an email a week later saying thanks but no thanks with zero explanation. I had gotten everything right, what went wrong? So I had some of my Google friends track down the interviewer and ask. Apparently I didn't continue forward because I didn't finish my Set implementation...

I've had Google contact me on occasion since then. I have not re-applied / re-interviewed with them. Their interview process is already bad enough.


I was interviewed by Google for an SRE management position and I got asked 6 of the OP's questions.


These are not director questions. I wonder if this is a case where a team didn't want to hire:

A team gets pressure to hire, but they don't want to.

A team has a great internal candidate but can't push it through without going through external candidates (expecting that no external director candidate could answer these -- which they shouldn't be expected to).

A team can get 2 for 1. Usually an H1B situation, which has the extra benefit of chaining the 2 candidates to the company. Former H1B employees love this option.

A team has a 'friend' in mind.

I honestly think this isn't a question of a dumb recruiter; it's more like a way to just push something through. The recruiter was probably taken to lunch with a high five. This is very normal. I wouldn't freak out that they have a 'dumb process'; you need to read between the lines here. The saddest part: this pawn gets a "PASSED" on his record at Google -- but it was just internal politics.


> team can get 2 for 1. Usually an H1B situation, which has the extra benefit of chaining the 2 candidates to the company. Former H1B employees love this option.

Are you saying Google pays their H1B employees half of what they pay others?


I would 100% back you up in my mind had it not been for the "Why Quicksort is the best sorting method?" question.

I hope you'll agree that there is no way a correct answer would ever validate this question.


I'm another Google employee. I really don't think that's an accurate transcription. There's a standard SRE question which is related, but different. I won't give the exact question, but you could try searching Glassdoor.

If "Why Quicksort is the best sorting method?" really was the question, then the recruiter must have asked it from memory and misconstrued it. It's certainly not a standard Google SRE interview question.


Having only seen the candidate's paraphrase of that conversation (and never having worked for or interviewed with Google), I would STILL be inclined to give that candidate a thumbs down.

To resolve difficult technical disagreements, it's important to be able to restate the other side's arguments in the light most favorable to THEM, while the candidate was entirely focused on paraphrasing the interviewer's argument in the least sympathetic way. Would you want to work with a person like that?


So what you want to see from the author is a Mao-era self-criticism stating why Google were 100% right in rejecting him.


I think they mean that if you realize you're talking to a non-techie then you should make an effort to use simple words. Your TCP hex opcode knowledge does not impress someone who doesn't know what hex or TCP is. Figuring that they must be looking for an answer along the lines of "SYN/ACK" is a skill as well. A people skill.


This is the SRE prescreen. At least it's the one I was asked in 2007, almost verbatim. Possibly too verbatim.


Also, his answer on #9 is wrong, or at least <EDIT>his explanation of the conversation is terribly confusing</edit>. With 10000 numbers, it's only efficient to create a lookup table with 8-bit integers, not with 16-bit integers.

Based on his LinkedIn profile, I don't think anyone at Google would have thought of him as a "director of engineering". Being an "R&D director" at some unknown company at 24 is not comparable to being a director at Google, and since then he's worked at his own very small company. He was probably a candidate for Senior SRE.


Whose answer is wrong? Because no one suggested a 16-bit lookup table.

His answer was to look at 64 bits at a time and do a [0] Kernighan style count. The "correct" answer was an 8-bit lookup table. Which is right is going to be highly dependent on the data and the architecture you are using.

[0] http://stackoverflow.com/questions/12380478/bits-counting-al...


You're correct, I misread what he said the recruiter's answer was.


Do you recall the question? Site's down. I recall thinking the right answer was to use POPCNT but maybe I'm misremembering the question.


The question was "how do you do bit-counting of a bunch of numbers". The two canonical answers are "lookup table" and "using bit shifts and multiplying by magic numbers".

The fact that there's a machine instruction for it does make it a bad question.


Honestly I would have said popcnt as well. Lookup table or bit shifts when I can have the CPU count the 1s? I guess I'd need to benchmark it to be sure. Either way I can't say it's a good question.


Popcnt isn't particularly well optimized in most micro-architectural implementations.


Looks that way with a quick test. But it looks like there may be a better way with SSSE3 PSHUFB: http://wm.ite.pl/articles/sse-popcount.html


Is it? It looks like on most recent Intel CPUs it's 3-cycle latency, 1-cycle throughput on a 64-bit register. An 8-bit LUT solution is going to do less than 16 bits per cycle on any recent Intel/AMD CPU (maximum of two load ports).


Hmm, much better than I remember. I guess this goes a long way to explain why this wasn't always seen in practice: http://danluu.com/assembly-intrinsics/


just prepend "cache://" to the url


Woah, I've never seen this before -- thanks for sharing!


Just FWIW, about 5 years ago someone also contacted me for a Google interview. The questions were very similar, so I'm not surprised. I don't know what's going on behind the scenes, but after a few minutes I didn't take the call seriously any more, and in the end, after 30 minutes, it felt more like a way-too-long prank call.


I gather you didn't run this by your PR folks:

> Particularly, when one side presents something that makes the other side look like a blithering idiot, the likelihood it's 100% accurate is, historically, "not great".

I get that you are happy at Google, that you want to defend your employer. But implying the guy's a liar or a fool does not help. If anything, it makes me more likely to believe that Google has something to be touchy about here.


Well, I mean, he's right: any time a story seems outrageous and unbelievable, it's often because it has been embellished at least somewhat. That doesn't mean it's completely false either, though.


"Often"? Would you be so kind as to show me your data on that? Maybe it's just this election season, but I seem to be hearing about quite a lot of outrageous things that are perfectly true.

Even if you're correct and he's merely saying something generically true about almost any concerning story, him saying it in the context of the post reads to me as a veiled accusation.

For example, suppose you posted an open-source project of yours here. If I were to comment, "Open-source projects are often half-finished, buggy messes," how would that seem? It is factually true; randomly looking at GitHub projects is enough to show that. But in context, it's an unkind thing to say because it encourages people to look at your project as one of them.


It's an interesting recruitment setup where overqualified candidates are rejected.


You can have situations where you are prepared to recruit someone with potential versus being the final article. In those situations the overqualified candidates may not compare well with what you consider the potential of the slightly under-qualified candidate, and may not have some of their other attributes. It's obviously a risk, but it happens more than people think.


Still, an overqualified candidate is a superset of a qualified one; he or she should pass the test.

Failing to distinguish an overqualified candidate from an underqualified one is a failure on the recruiter's side, not the candidate's.

The recruiter is of course allowed to say "I'm sorry, but you are well overqualified for this position". In this case he was falsely judged to be underqualified.


I realize this is a long shot, but seeing as you're actually in a pretty high position, could you try having the recruiter screen changed to be multiple choice?

It's simpler when a non-technical person is asking the questions, and I imagine it would lower false negatives. Frankly I don't care personally, but it seems like most people would like this better.


IE maybe he applied to a position labeled director of engineering, but they decided to interview him for a different level/job instead.

A "different" job, several grades lower in responsibility (and pay). Without in any way prefacing him, beforehand.

Is that the way things usually work in the Google hiring process?


I agree with DannyBee. The author of this post seems to suffer from a bit of arrogance and an inflated sense of his own abilities. It's likely he tweaked the story a bit to make himself feel better. I've seen many high-level directors do this, since at that stage in their careers they can't imagine not getting an offer from a company.


Your comment combined with this one...

https://news.ycombinator.com/item?id=12701650

...creates an alternative interpretation in my mind that's not as bad. That is, the questions were a filter attempt done wrong, in that it didn't account for stronger candidates giving better answers or give the interviewer a way to confirm them. On top of that, a simple data-entry error by the HR person or whoever forwarded his name might have put him in the wrong interview category. That's two small problems vs the huge one implied here.

Although the damage appears huge if they're filtering out candidates with his track record with the pre-screens.


Five years ago, after applying for a Unix systems engineer position at Google, I had a phone interview with more or less the same questions (probably more, as that interview was interrupted), so these are certainly questions Google recruiters ask. But I would also be surprised if such interviews take place for such a high-profile position; I would assume things are done differently in those cases and not via the "usual" way.


> These are bog standard SWE-SRE questions (particularly, SRE) at some companies, so my guess is he was really being evaluated for a normal SWE-SRE position.

This makes the most sense to me, why would a director of engineering be responsible for getting Google back online if it went down when there are SREs.


Yup, those are SRE questions, but the fact that Google didn't interview him for the position he applied for makes them out to be even bigger idiots than I had them pegged as for using SRE questions for a director role. Regardless, just having such a stupid process exposed reflects badly on Google. In my own experience, Google isn't even able to call at the scheduled time, so while they're not the worst interviewers ever, they're pretty close and very far down the ladder. Put another way, I doubt they could convince many people to even interview without their extremely hefty compensation packages.


From what I've heard, google intentionally screws with timing, who you will be speaking to, and other factors in order to try to understand how you deal with changes in circumstances.


This is blatantly false.

Source: I work for Google. Our daily schedule is packed with meetings and we try to be as on time as possible. Interviews (which are something everybody should be doing) work exactly the same; we don't try to screw people over with bad timing just to "test" them. Sometimes people miss interviews and somebody else has to show up. This is unfortunately a problem and it shouldn't happen, but sometimes it does (accidents and unforeseen things happen). It's not done on purpose.


I know you don't set up the system, so I'm not blaming you here. But this is the problem:

> Our daily schedule is packed with meetings and we try to be as on time as possible

If Google's goal were to respect the candidate's time, interviewers wouldn't have daily schedules packed with meetings. The less slack in a system, the worse the failures are. That employees try to be "as on time as possible" is a sign that everybody understands the scheduling is unrealistic.

This isn't unusual, by the way. Most hiring processes don't optimize for candidate experience. Or even for good hiring decisions. Indeed, if you take a POSIWID view of typical hiring processes, the point is to make interviewers feel powerful and to select for people willing to put up with inefficiency and suspicious power dynamics.


I remember reading an article saying that this is how they interview for product managers? Not engineers, but product managers are people who maybe need to deal with more craziness.


I admit I am not familiar with the interviewing process for the non-engineering sides of the org, but I'd still find this very unlikely and weird. We are very, very strict in our meeting and timing policies, as our meeting rooms are usually packed all the time, so we really need to be in and out at the given time. It's counter-intuitive that somebody would delay a meeting on purpose...


I interviewed for product manager at Google. It was tough, but no craziness involved: flew in, had a day of interviews, flew back.


If that's true (and I would love to know if it is), then Google should know that is a big no-no in a lot of cultures, and there are much better ways to test for it. Some still hold the broken word to be an indication of people you don't want to deal with, ever.


A counterexample can't prove that that never happens, but I interviewed with Google this year and every interaction on the phone or in the in-person interviews began within 3 minutes of the agreed-upon time. So I have a hard time believing this is a general policy.


This is not the case (I work for Goog).


You unintentionally screw with the timing? (I had the same happen)


Nobody intentionally screws with the timing like was speculated. It's possible individual meetings might be a few minutes late I guess, can't comment on that.


And you are on a hiring committee?


The hiring committee meets after all the interviews are conducted, so isn't really relevant here.


Yeah, I've heard this too. If it's true, it just shows what assholes they are when it comes to respecting people's time. Which I think is the original point of the article which comes in loud and clear: Google will waste your time; don't interview there.


When I was starting out, doing well in the Google interview process was a nice confidence booster.


Unfortunately they will just ignore the complaints and move on to the next resume in the pile.


This reminds me of the story of one of the WhatsApp founders getting rejected by Facebook. [1]

Hope this doesn't turn out that costly for Google. But I'd be happy if it does, if it is the way the interview was really conducted.

[1] https://twitter.com/brianacton/status/3109544383


Wow. If this is actually what happens, even a little bit, then Google has a huge, huge fucking problem, and it needs to be fixed.

I've never bothered to apply to Google. But if this happened to me, I'd just walk away. You don't know me, but you don't want that :-)


I think in this case, it's more like there are "independent" recruiters with their fixed Q&A sheets sitting around, somewhere, and fishing around for possible candidates to make it to a second level.


I'm amazed he knew things in such detail. I mean who would know just how long a MAC address is? Or what the actual SYN/ACK etc tcp flags are? You just need to know what they're used for, and if you need the specifics, you'll find out with a single search. He seemed to know that as well though. Kernighan for bit twiddling algos, that kind of thing.

It's a bit strange to have someone non-technical interviewing a techie. You end up with stupid discussions like the one about Quicksort. If you point out qs is one of several things with the same big-O, you'll probably also get it "wrong". But the real problem is that a guy who is just reading off a sheet can't give any form of nuanced feedback. Was the guy blagging the sort algo question? Did he know it in detail? Does he know what the current state of research in that area is? There's no way to know that if your guy is just a recruiter, but I'm sure even a relatively junior coder would be able to tell if someone was just doing technical word salad.

I wonder what would happen if ordinary people recruited for medical doctor jobs? Would you be comfortable rejecting a guy who'd been in medical school for 10 years based on his not knowing what the "funny bone" is? Wouldn't you tell your boss that you felt a bit out of that league? It's amazing you can get someone to do this without them going red in the face.


Medical Recruiter: "What are mitochondria?"

Applicant Doctor: "They're double membrane-bound organelles found in all eukaryotic organisms, commonly between 0.75 and 3μm in diameter, that generate most of the cell's supply of adenosine triphosphate"

Recruiter: "No. They are the powerhouse of the cell".


What are the components that make up a color?

"Well, it depends what color model you are using, which differentiates between additive and subtractive color mixing, the medium used (print, screen, etc.)...

No. It's Red, Green, and Blue.


"Draw the first letter of the alphabet please".

'a'

No. The right answer is 'A'. We will stop here because it's obvious that you don't have the necessary skills.


If the HN crowd allowed for levity, this would certainly be an instance of the oft-celebrated "birth of a meme".


But hang on, the primary colours are red, yellow and blue! I know, they told me so in primary school.


Not sure how much you're kidding, but... that's true for a subtractive colour system, where you start with a white sheet of paper (i.e. reflects every wavelength) and subtract colours (filter out wavelengths) by painting over the paper with crayons. For an additive colour system, where you start with a black monitor screen, and you add wavelengths, the primaries are RGB.


...and, strictly speaking, the subtractive colors that closest match the typical human eye are cyan, magenta, and yellow -- the "CMY" of CMYK printer inks. Using red, yellow, and blue as subtractive colors gets you a big enough gamut for elementary school color mixing, but it won't give you as big a range of colors as CMY.


These kinds of comments are why I love this site. The guy was obviously 100% joking but the engineers in us are bound to reply nonetheless.

It reminds me of a classic Dilbert. https://goo.gl/images/7DhC9f


… and we just add K because a separate black ink is cheaper and more precise.


I'm sorry, you're incorrect. This paper says Red, Yellow and Blue. Thanks for your time, we'll be in contact.


I legitimately had a huge argument with my (former) roommate over primary colors.

She was more artsy, I was insistent that the primary colors were RGB, she was insistent that they were RYB. We googled. We were both right in some senses.


Good job you didn't have a third roommate who worked in printing...


Hilariously enough, my fiance now works in printing and actually mixes the colors (t-shirt printing).

I haven't brought it up yet because we once had a fight over whether that thing you put outside your shower is called a bath mat or a bath rug (Both are correct in different circumstances)


CMYK would be right. RYB means you can't produce cyan or magenta or vivid purples or pinks.


I had the same argument with my wife (fortunately not huge...)


Those primary colors are real to me dammit!

/colorblind


Totally off topic (except for the fact that "correctness" of an answer can be a deeper problem than checking against a list), but when you look at how the cones in your eyes are connected to the brain to actually transfer color information, it's indeed closer to Red-Green/Yellow-Blue (as described in the L*a*b* color space). That's why we intuitively include yellow as a "primary color" even though you can just use RGB to describe it.


"They told me so in primary school" is clearly anecdotal evidence. You really need a proper citation.

Like this one: https://www.youtube.com/watch?v=yu44JRTIxSQ


wow, there's an entire school just to teach that? (:


Experienced this more times than I wish to admit.


I get it, but do doctors really know what size mitochondria are? I can't think of anything further from the realm of their day-to-day.


Everybody who is a doctor aced intro to biology because they were premeds. That means they looked at plenty of cell bio pictures with mitochondria (properly scaled to the cell size).

Doctors know a ton of random crap tangentially related to their jobs that they were forced to memorize.


Someone specialized in genetics or mitochondrial disease might. Otherwise, I'd guess probably not.


I had something like this happen to me in an interview. Miserable experience.


Perfect analogy. And you just jogged a bad interview memory - "thanks a lot."


Yay for Biology 10.


To be entirely objective, I have run into recruiters of that variety in my career. However I've also had plenty of recruiters that were able to pick up on one's deeper knowledge of a given concept and accept that as correct, even if it wasn't the "textbook answer" they were looking for.


You don't even have to give feedback in an interview. Just let them answer your question and if it's too detailed, write it down and look it up later. In fact, Google usually doesn't give back any feedback during or after their onsite interviews.


Except that, for some reason, medical professionals don't have to put up with this shit. But we do.


Personally, these questions and the way the recruiter asked them reaffirm my view that Google would not be a place I'd like to work at as an experienced software developer. First, the recruiter's lack of technical knowledge points toward a bureaucratic or management-first mindset common (necessary?) in such a large company. Second, the questions and expected answers seem biased toward just-graduated but smart engineers who don't actually have the experience to appreciate the subtleties present in complex systems. The recent lawsuits regarding age discrimination reinforce the notion that this interview setup is biased (intentionally or not) toward inexperienced and thus likely younger applicants. Perhaps that's not a bad thing per se, as Google's corporate development style likely would handle inexperienced but smart developers who can "mold" into the system better. Or it could be more nefarious: lowering salaries by hiring younger devs. In reality, probably a mix of both.


I wouldn't be surprised if the interview process for experienced engineers is unrefined. At this point in Google's lifecycle most of the qualified, experienced engineers who would want to work at Google already do. There are orders of magnitude more new-grad engineers to interview and so it makes sense that they would lack the practice and refinement on those candidates, even if those are the most valuable candidates to hire. The recruiter might have just been confused and gave him the standard list for any "technical" job that they have to use for new-grads and so cannot ask things with nuance.

When I was an interviewer at Google it felt like 90%+ of interviews were with candidates who had less than four years of experience. Probably half were fresh out of college. After the fifth candidate in a row who can't do simple recursion or algorithmic analysis (and I mean simple) you get pretty discouraged. In one phone interview I got to interview an experienced engineer with over twenty years of experience in C. He completed the question I usually have to spend 45 minutes on with a new-grad in <10 minutes. It was probably my favorite interview of all time because I actually got to discuss the subtleties and he reaffirmed that I could maintain high standards.


Google is still a tiny company relative to the US economy. They might be slightly above average on some metrics, but that’s about it. Large enough they have plenty of idiots, small enough that most smart people don't work there.

As to high standards, you are testing for things that have very little to do with someone being good at the job. High arbitrary standards often remove the most talented people who generally don't have the same background as you.

EX: Suppose you were looking for a CEO; having a college degree seems like a reasonable requirement. However, a surprising number of the best CEOs don't have one.


I'm a software engineer who really didn't know much about recursion or algorithmic analysis; I was a bio major/computer hacker. I did terribly on my first set of Google interviews, at least on the algorithmic questions.

None of that was a reasonable predictor of my future performance at Google. You would have filtered out a perfectly good candidate (and this is, IMHO, the biggest issue: Google rejects a number of people who would be great employees with its early filters). I can't say I have a better system.

The only interviews that made any sense for the time I got hired were the ones with the specific team members I was going to join. Once we got to chatting it was pretty clear I was a good technical fit for the team.

I still want to emphasize I don't have a solution to the high false negative rate in the pre-screening process.


It makes me think there's a screening recruiter, non-technical, before the "real" interview process. At big companies like Google with so many people interviewing, I can imagine these kinds of kick out questions.

Still, it shows a massive company trying to streamline some things and failing terribly. I personally wouldn't want to work at Google today. What might have been cool once is now nothing but a standard large company like IBM or Amazon. There's a great Quora post by some former Google people that say as much.


I knew all these answers too, because I was a developer in the 1990s.

There is absolutely no purpose to knowing off the top of your head how long an ethernet address is, or even what system call will retrieve an inode (his bickering over stat() "filling in" rather than "returning" was bogus, for what it's worth). The top Google search result for each of these questions has the answer. Knowing these things isn't part of being a practicing programmer; knowing how to find out is.


The question was what function "returns" an inode.

Those functions return an error code; you pass in a stat structure and the function populates that structure.

He was saying (correctly) that they don't return (in the classic C sense) the inode. They return an error code.

To me that is a big difference...

    int lstat(const char *path, struct stat *buf);

vs

    struct stat *lstat(const char *path);

2 completely different functions.


This strikes me as an entirely trivial point; the meaning of the question was pretty clear. It wasn't "what is the literal return value". Many APIs return error codes, and people still talk about them as "returning" certain values colloquially. Of course, if the OP's answer had been "it returns an error code :) but I assume you're talking about..." I would think that's fine.

Btw, that is the only thing I really disagreed with OP on, the rest seemed just ridiculous.


There's a precise distinction, and thinking about it as "returning" anything other than an error code is chummy human thinking rather than the sort of precise knowledge the test was (supposedly) looking for. The question was actually worse than asking "I'm thinking of a number between 1 and 10, what is it?" because rather than applying a random filter to candidates, it punished the candidate for having precise knowledge.


The difference is memory management; completely different. Don't ask me in a technical interview for a function that "returns" something and then, when I call you out on it, say I'm wrong when I'm not.

It's a technical interview, the question should have been technically correct. "What function passes by-reference copies of inodes?"


My worst experience of this was for a Linux admin position, when I was asked by another admin how to list connections on a machine. My answer was "lsof -i, since I've found it's easiest to pull granular information."

"lsof just prints open files... you would use netstat, not lsof."

I tried correcting him, but he wouldn't listen and ended the call soon after.

I never received a call back.


Most people don't know that lsof works in two modes: the typical one is to run it against a process ID; the rarer one is with no args, which runs it against everything on the system. However, I observe that lsof needs to run as root to print the same information that netstat -tanp returns for a regular user.


Yeah, you're right about needing root (edit: for processes not owned by the current user). Though, if you have the access, it's a very, very worthwhile tool to learn:

  ~ $ lsof -a -n -c chrome -iTCP:443 | head -2 #sanitized output
  COMMAND   PID USER   FD   TYPE DEVICE SIZE/OFF NODE NAME
  chrome   1234 me   12u  IPv4 12345      0t0  TCP 127.0.0.1:12345->127.0.0.2:https (ESTABLISHED)


Yes, but we are all chummy humans with our chummy human thinking :) We are generally gifted with the ability of shorthand, context, and understanding that surpasses a purely technical understanding of words, and this is exactly the kind of situation where that helps.

I personally wouldn't want to work with someone who wasn't able to understand what I mean (in such an obvious case, at least) and wasn't able to answer to that meaning.

(Of course, I'd also personally prefer someone who would point out that this is inaccurate, but do so in a charming and off-hand way, to make me feel comfortable. A high bar, maybe, but for a director that's definitely a bar they should clear).


Yes, but the recruiter should have said "oh, you're technically right (the best kind of right, per Futurama), next question".


On the other hand, you are interviewing a programmer. Their daily life revolves around technicalities where the difference between returns and fills matters a great deal. Expecting them, all of a sudden, to throw that out the window and parse technical questions in a hand-wavy way is ridiculous.


I can't disagree with you enough. Engineers should always strive to be as clear as possible about what they're doing. The word "return" has a precise meaning in the C language. If the recruiter was unaware of that, that's their bad entirely.


Actually, OP replied exactly as he should, because after the first few exchanges you can tell how anal the interviewer is, and as such you have to answer accordingly. Some easygoing interviewers might say that it's about right, some might say it's technically right, etc.

Remember, you have to walk through the interview in the OP's mindset, and how he took the interviewer's analness into consideration.


I think the problem was that he didn't update his knowledge of what an inode is after the interviewer told him. He still thought it was an integer value, as opposed to a metadata structure. If inode meant inode number, it would be reasonable to assume there was a function that would return it.


Here's the first sentence of the DESCRIPTION section of the stat(2) man page on my Linux systems:

    These functions return information about a file.


Somebody sack the writer ;) - so, OK, I doubt I'd argue the toss if it were an interview situation. But when writing a comment or documentation I think I'd take inspiration from the documentation for either of these two fine systems.

OS X: (OK, so I lied about the "fine" part)

    The stat() function obtains information about the file pointed to by path
Windows, here the VC++ CRT - fantastically poorly described, if you ask me, though of course you shouldn't be using any of this POSIX shit on Windows, so if it confuses anybody enough to make them go and find FindFirstFile then it can only be a good thing:

    Get status information on a file


In C, what stat does is how you return multiple values from a function.

lstat has the same signature as stat: it passes the stat structure back through an output pointer.


But not if you're a literal lunatic who thinks that because you literally used the word return with an error code that's what it returns. And. Only. That.


To be fair, the rest of the questions were demonstrably pedantic, and I would completely expect a question like this to be on the bill of fare:

   Q: What does lstat return?
   A: A struct
   >> Wrong, it returns an error code
That type of technical specificity about what well-known functions return is absolutely something I've heard people use as an interview question. Knowing it returns an error code (rather than the value you want) seems like a good indicator of "has actually coded in C" (whereas many other languages return the value you want and raise an exception if it had an error).


>In C, what stat does is how you return multiple values from a function.

IIRC, you can return structs from functions in C. You have to access the values in the returned struct with dot notation, of course, like point.x.

Pre-ANSI C may not have had this, but later it did. I remember reading it in the 2nd (ANSI) edition of the Kernighan & Ritchie C book, and have also used it myself in some programs.

https://www.google.co.in/?q=can+you+return+a+structure+in+c

Edit:

Might want to consider the cost of copying, depending on the perf requirements, size of the struct, whether the function call is in a tight loop, etc.


returning multiple values was the key part of the sentence


What's bogus about his comment on stat(), exactly? It literally returns an int and fills in the struct that it takes as a parameter. If he had no idea what it did, he could check Google or the man pages, which are going to give him some variety of the same answer, such as the two examples below.

Linux:

    int stat(const char *path, struct stat *buf);

    stat() stats the file pointed to by path and fills in buf.
Mac OS X 10.11:

    int stat(const char *restrict path, struct stat *restrict buf);

    The stat() function obtains information about the file pointed to by path.  
    Read, write or execute permission of the named file is not
    required, but all directories listed in the path name leading to the file must be searchable.
It may be pedantic, but stat definitely doesn't "return" an inode. One of the problems with technical interviews like this is you often have no idea what the interviewer is really looking for. Some might just want to know what you would use to get the information about an inode, and another might be seeing how you describe it as a test of your knowledge of pointers or something. Often it's impossible to know, and it could easily have been a trick question where the right answer is "there isn't one, but you can use stat to get this information, as it takes a struct pointer and will place the information into that struct". Of course you know that better than most or you wouldn't have made Stockfighter.


Technically stat neither returns nor fills in the inode, because the actual structure of the inode can vary by filesystem, and the inode will contain more fields than struct stat.

In a recruiting situation, if the recruiter is going all "right and wrong" on the interviewee, they should know these dirty details.

I also disagree on the ethernet address length. You know how long IPv4 addresses are, you know how long IPv6 addresses are, so why would it be so extraordinary to also know how long MAC addresses are?

I think it could be useful to find out in an interview whether the applicant knows stuff because they have actually implemented low-level code and gained an intricate understanding, or whether they just used some high-level APIs and were never interested in more detail than "I have a handle right here, it does all I need". For some positions that would be an important distinction.

Personally, if I were interviewing people, I'd hire the guy that explains something to me that I did not know, but that I find interesting and would have attempted to understand, too.


> his bickering over stat() "filling in" rather than "returning" was bogus, for what it's worth

While I agree, I have seen interviewers where if he answered "fstat" they would have come back with "Wrong! fstat passes the structure back by reference, it does not return it!"

With this style of interview I can't blame him for thinking it might be a trick question and then trying to explain why he didn't say those other functions as an answer.


But what if Google is down and you are tasked with diagnosing it?


If they are trying to hire for a position that will somehow be on the line for what happens when all of Google is down, I for real hope that they get some higher-power questions into that interview.

But, obvs, the answer is that you Bing it.


This is the funniest thread I've ever seen on HN


I lost it at this one. As if there is some single point of failure that's going to bring all of Google down, and some intrepid director of engineering has to inspect some TCP packets by hand to fix it.


This is where I threw up my hands too. The Director of Engineering does not need to know the difference between SIGTERM and SIGKILL, or how many bytes are in a MAC address. I guess it's a nice bonus if he does, but he'll be spending 10 hours per day in meetings talking about roadmaps, shielding his team from the execs, and removing productivity roadblocks. "Third engineer from the left" is doing the packet inspection--ask HIM about SYN and ACK.


Reading more closely, it sounds like they are not interviewing him for a director of engineering position; it just sounds like he thinks his current role, CEO-who-writes-code of a very small software company (http://www.gwan.com/about), qualifies him for a director-of-engineering-level position. He's probably being interviewed for an SRE team lead or thereabouts.

Why he's being interviewed for that position is a different question entirely, and I can imagine Google being totally right or totally wrong.


How can you imagine Google being totally right here? The disconnect between the questions being asked and the interviewer's lack of knowledge made the interview a waste of time no matter WHAT role they are interviewing for.

Take, for example, the sorting question. "Why is QuickSort the best sorting algorithm?" The answer being looked for was, "It has the best Big O."

And this is wrong. Its average case is O(n log(n)). Its worst case is O(n^2). Which do you call its big-O? Moving on, the average case of O(n log(n)) is matched by a wide variety of sorting algorithms. How do you choose one?

Here is a better answer.

QuickSort is a very simple to implement algorithm which achieves the lowest average number of operations on a randomly sorted list. Which is why it is so widely adopted despite sometimes being very slow.

However Timsort appears to be the fastest general purpose sorting algorithm for the mix of random and partially sorted lists seen in practice.

When I tend to notice that sorting is slow, generally that's a larger workload where some type of merge sort would be appropriate.


I can imagine Google being totally right because I can imagine the interviewee not accurately remembering the conversation here. (I expect, for instance, that he did not write down the interview as it was happening.) In fact, conditioned on the assumption that Google is right that this guy's experience is better suited for SRE than director-level, it is pretty likely that he did not understand the questions being asked / thought the questions were beneath him / etc. and therefore wrote them down inaccurately.

For instance, perhaps the interviewer asked "What makes quicksort a good sorting method," instead of "What makes quicksort the best sorting method"—a very small difference in phrasing. In that case, the answer of "It's not always the best, or even suitable" is still technically true, but much more wrong. (And an answer like the one you started with, "Its average case is O(n log (n)), its worst case is O(n^2)," would have been enough to pass... but sitting on the phone and arguing about storage topology is itself a failure.)

As I mentioned in another comment https://news.ycombinator.com/item?id=12702130 , my (five-year-old, faulty) memory of Google's SRE phone interview is that they asked another question here with a very small but important phrasing difference: "What is the signal sent by the kill command" instead of "What is the kill signal". If you make that change, the interviewee's answer of "SIGKILL" becomes wrong, and the interviewer is right to insist on SIGTERM (which would otherwise make no sense). It is a quite literal game of telephone.

(Again, I can also imagine Google being totally wrong and the interviewer mangling the questions.)


> they asked another question here with a very small but important phrasing difference: "What is the signal sent by the kill command" instead of "What is the kill signal". If you make that change, the interviewee's answer of "SIGKILL" becomes wrong, and the interviewer is right to insist on SIGTERM (which would otherwise make no sense).

But... the kill command is the command to send arbitrary signals. It sends them all.


You are right that several of these questions could be due to his misunderstanding the questions asked at the time, answering the wrong one, and then remembering what he thought he was asked. But it is beyond my imagination to reconstruct a plausible conversation that could result in the one recorded without there being considerable ill will on both sides.


Here's a thought experiment: read the article, replacing the interviewer-side questions with ones that make them sound more plausible. (This is the side that we should believe to be less accurate, if only because the interviewer isn't reporting their questions.) Pretend you're the interviewer, and ignore the internal monologue.

For question 5, you asked about an inode, and were told about an inumber, and got back an answer insisting that the inode was an index.

For question 6, maybe change "inode" to "information in the inode". The interviewee still has not figured out the distinction between an inode and an inumber.

For questions 7 and 8, apply the changes I suggested.

At what point do you decide that the interviewee is hopelessly arrogant and not worth your further time? And how do you get them off the phone gracefully?

Maybe around question 10, when they're quoting bits to show off and not saying the words that would actually let them communicate with other engineers like "SYN" and "ACK"?

No ill will is required on the interviewee's side, unless you consider refusing to waste time on bad candidates "ill will".


Your imagination is better than mine. :-)

The conversation that I'd have to reconstruct has a very combative interviewee. Which would also fit said interviewee deciding to write up the article that I read. Which would mean that Google dodged a bullet.

I find it interesting to note that his site is down. There are a lot of possible causes, but it isn't good advertising for his webserver software.


> Its average case is O(n log(n)). Its worst case is O(n^2). Which do you call its big-O?

I know this is irrelevant to the larger point of your post, and I'm sure that you know this already, but the worst case is the big-O. This is just another reason why "big-O" is not the most helpful thing to discuss in practice.


Technically, no.

Big-O is a way of categorizing the growth of mathematical functions. Those functions can represent anything. It is wrong to talk about the big-O of an algorithm without specifying what you are measuring. Be it average operations, worst case operations, average memory, worst case memory, amortized average time, amortized average memory, and so on.

It happens to be that when we talk informally, we're usually talking about the worst case we are willing to think about. Quicksort's worst case is a sorted set, so we think of that as O(n^2). But then we turn around and cheerfully call a hash lookup O(1) because its O(n) worst case is incredibly rare in practice.


Apologies. Throughout my CS undergrad I had only been given the impression and understanding that Big-O measured the worst case (an upper bound, no worse than), Big-Theta the average case, and Big-Omega the best case (a lower bound, no better than). Looking into it more now, I see that there are some more subtleties I either missed in class or was never taught.

Thanks for correcting me!


The subtlety here is basically that big-O and big-omega and friends are ways of characterizing functions, and functions map one input to one output. "Running time of a problem of size n" is not a function; it has a range of possible values for a given n. "Maximum running time of a problem of size n" is a function. That function itself, an² + bn + c for some constants a, b, and c, has lower and upper asymptotic bounds.

I thought you were right at first but then realized what was going on. This is a pretty subtle point and mostly uninteresting for well-understood algorithms like quicksort. But one slightly less subtle point is that big-theta isn't average case, it is the combination of big-O and big-omega, i.e., bounded from above and below (possibly with different constant factors) by the same asymptotic behavior.


Ask Jeeves? Being able to find it in a book might be useful, but when Google is down I bet they would like to get things up and running as soon as possible.


Then you do what people did before Google when they didn't know some trivia: Use a book.


I bet Google has their own internal "backup Google", for just such an occasion.


yes it's called "Stevens" https://www.amazon.com/TCP-Illustrated-Vol-Addison-Wesley-Pr... although you might also check TAOCP.


Is that a trick question? Open APUE, ``apropos inode``, ...


I don't think these questions are unreasonable as a spam filter. Yeah, they're trivia, but if you had actually worked in that space, I'd be surprised if you didn't know a lot of that trivia just by virtue of exposure.

The rejoinder, of course, is that it's probably misguided to structure your recruiting around a spam filter.


The length of an ethernet address is a trivia question. It's a good way to score a board game. Filtering out candidates based on it is lunacy.


I agree with the first and second sentence. As to the third: what would be your thought process as to someone who claimed to be a network programmer on the phone but couldn't answer most of those questions?


And here it is.

I do plenty of interviews, so I do have some sympathy for the idea that some overwhelming majority of applicants simply cannot perform even the most basic coding tasks but are somehow trying to sneak in anyway. At the same time, I can't escape the suspicion that a lot of these stories are actually from badly designed or badly calibrated interviews gone off the rails.

I absolutely believe that the article we're discussing here gives a fair view of a Google phone screen, since I've been through it too and even got asked some of the same questions. The only reason I "passed" was that I recalibrated way, way down to meet the expectations and the technical understanding of the recruiter I was talking to (who did very clearly seem to be reading off a prepared script). This was surprising since Google had reached out to me to ask me to apply, which one would think indicates a confidence in basic technical skills, but if I hadn't caught that and adjusted how I was interpreting and answering the random trivia they threw at me, I likely wouldn't have passed the screen and would have been labeled just another impostor trying to sneak into a job I'm unqualified for.

So when I hear someone else talking about all the "unqualified" applicants they get, I can't help wondering how many were really qualified applicants talking to an unqualified interviewer using an unqualified interview process.

(disclosure: I don't work for Google, don't ever intend to work for Google, and in fact hung up on a later screening call out of frustration with the way they ran their process, which at least finally got their recruiters to stop spamming me)


I'm a network programmer and I forgot exactly how long an Ethernet address is. (I would have guessed 6 based on memory, but I wouldn't be 100% sure.) It's an opaque structure to me; if I need the size I'll sizeof it, which is clearer anyway.

Maybe I'm a bad programmer, but I don't feel like I would be a better one with that specific fact committed to memory. I dunno.


I think you can probably predict it: I would generate a work-sample test for it. For a network programmer, I might have them implement a 3WH coded directly to pcap_write() (which requires you to populate the Ethernet frame header). Like the best work-sample challenges, doing a raw 3WH is kind of fun if you haven't done it before.

My friends and I used to spend bar nights drinking over torturous interview questions (yes, I have always been this nerdy). For instance: we had a gruesome sequence of questions on how to implement the fastest possible traceroute that you could only clear if you knew about a trick using the IP timestamp option.

Later, I got (what I thought was) smarter about interviewing, and moved to more surgical questions. I'd ask candidates to debug a C program that segfaulted in malloc, or ask them to describe the utility functions they carried with them from project to project.

After taking over recruiting for a company that really needed to hire at a specific clip in order to balance sales and delivery, I'm embarrassed that I thought I was interviewing effectively with stuff like this.

You can't learn about someone's capabilities by putting them on the spot with trivia questions.


Thanks for the detailed response. Frankly, I was awful at recruiting: grilling candidates who claimed to know C++ on the intricacies of templates or whether they knew what Koenig lookup was. Caught a few fibbers but I don't understand in retrospect what the point was.


> You can't learn about someone's capabilities by putting them on the spot with trivia questions.

But that's not the point of these questions. These questions are a 5-10 minute phone pre-screen before getting to the actual interviews. They test whether the candidate has experience in a given field, not whether they can search for information or what the precise bounds of their capabilities are.

It's trivia, but it's trivia that is a reasonably high confidence proxy for experience in e.g. network programming/design (resp. other fields). If someone claims they have networking experience and don't know SYN/SYN-ACK/ACK it's in my opinion a large red flag.


These screens pass people who can answer trivia but who can't effectively code, and they reject people who have a gift for solving engineering problems with code but who can't answer trivia questions when put on the spot.

The ostensible reason they get deployed (I say "ostensible" because we all know that in reality the on-site interview consists of the same stupid kinds of questions) is to keep the employer from wasting time conducting more sophisticated interviews for candidates who have no hope of passing. But that's dumb for at least two big reasons:

* The filters obviously reject candidates who would do well on more serious challenges --- worse, they do it insidiously, because you can't tell that they're rejecting good people, only that you're seeing fewer bozos, which makes them look like they work. In reality, a new norm has arisen where the most qualified candidates get to skip these processes entirely, because we all know they're a crap-shoot and don't want to lose good people.

* Properly administered work sample challenges actually take less employer time than these stupid trivia screens do. That means there's literally no purpose to the trivia screens whatsoever; they do nothing but harm.

In my last comment, I just wrote off the top of my head a sketch for a work sample test that addresses the same concern as the dumb TCP/IP trivia quiz from the original post. It took me I think something like 30 seconds to come up with it. Think about how you'd score that (remember: the bar here is "must be more predictive than that dumb trivia question"). I'm thinking something along the lines of "run the code and see if it opens a new TCP connection".

Assuming your team has enough sophistication to build work sample challenges like this, try to justify the trivia interviews. I think you can't.


> These screens pass people who can answer trivia but who can't effectively code, and they reject people who have a gift for solving engineering problems with code but who can't answer trivia questions when put on the spot.

SREs are hired to fix outages and other problems ASAP; knowing trivia is very important then, since at that point you might be losing a million dollars per second.


What is a 3WH ?



He wouldn't need to know it for the job, but as others have said he probably would know it if he really had the experience on his resume. That's the value for me sometimes in questions like that, especially in an initial phone screen, to decipher if you're bullshitting or exaggerating your credentials.


"I knew all these answers too, because I was a developer in the 1990s."

I think you're onto something greater. Maybe Google should hire more smart people who were developing in the 1990s, as they're a nice middle ground between ambitious young folks and the been-there-done-that greybeards. Realizing such people already know all the shit they're trying to teach their developers might be a way to sneak older people into these tech firms. Haha.


Not agreeing with the recruiter's attitude, but "if Google is down you will need to know this to diagnose what the problem is" was his comment. In that case the developers should have cheat sheets printed on their desks.


if Google is down you'll have other problems (SYN, ACK, SYNACK won't help much with the DDoS that's killing it)


I've had questions that drilled down into what bytes are in a TCP/IP packet on an interview. If you say you know about TCP/IP on a resume, it's fair game. FWIW I got all the bytes but the checksum; apparently no one ever remembers the checksum byte. :)

What's completely wrong with the situation is that the recruiter was saying "wrong" when he was giving detailed answers. The right solution is to have a non-skilled recruiter take careful notes, asking the candidate to repeat if necessary, in the cases where the candidate insists they know the answer.

Then only in cases where the candidate can't answer a question should it be marked as "wrong." In other cases the answers should be run past someone knowledgeable.

I've gone through a phone interview with Google myself, and it was nothing like this: I spoke with a real, skilled engineer, and there was nothing like a "gotcha" question where I had to guess the exact term he meant. Well, except where he asked "I bet you know what my next question is" and I didn't guess "How can the algorithm be faster?" But he didn't count that against me. :)


The right solution is to have a technical person conduct this test.


This costs money. As long as Google receives plenty of acceptable candidates, false negatives are free.

(Assuming they get no false positives, which, thinking about it, is a big assumption)


My presumption was that, for whatever reason, they couldn't or wouldn't do screening at this level by technical interviewers. Obviously having a technical person do the interview is always better.

I may have gotten a technical interviewer because I can list a half dozen people who know me who work at Google, most of whom can directly vouch for me.

Not sure why the OP got an unskilled reviewer, but if they didn't list references inside the company, they may have been thrown in the "random unknown applicant" bucket.


1) His level of knowledge is reasonable for someone who does networking programming full time

2) We are getting the description from him. When I interviewed at Google, they asked me to prove that P is equal to NP, and I did, but they said that it took me longer than the allotted 45 minutes and I didn't get an offer.


P = NP if N = 1, or P is 0. That doesn't take 45 minutes.


Oh hey what are you going to do with your million?


You proved P = NP??? In a little more than 45 mins or did you misunderstand the question?


If I were asked that question, my answer would be that obviously P != NP because the former doesn't have an N at the front.

The pain of being tossed out onto the pavement would totally be worth it.


What if N = 1?


might get you bonus points. It's a clever answer.


If you think it's actually clever, you're overthinking it.


His point was that, hearing only one side of the story, the OP could claim whatever they wanted.


Maybe they didn't hire you because your claim that you proved P=NP in 45 minutes is ridiculous.


People that have been around know such things. At one point, Richard Stevens (RIP) was god. Every programmer had a copy of TCP/IP Illustrated, Advanced Programming in the UNIX Environment, and Unix Network Programming. If you wanted to do anything with networks, you had to write your own servers; you had to understand the details. The breadth of knowledge was wide and the depth was just as deep. Ask around on HN, and you will probably be shocked by how many know such things. :-)


And then, the world developed robust, well-maintained, open source libraries that do all of that for you. Now, in 99% of commercial software engineering, you don't actually need to know any of those details, in the same sense that you no longer need to know exactly how many transistors and diodes are used to build the adder on the CPU. It simply no longer matters in almost all cases. Your time is much better spent moving on to higher level problems.

If you're interviewing for the rare 1% case where the job will actually require you to tinker with a TCP stack, then by all means ask about those details. For 99% of programming jobs today, it's irrelevant trivia used as nothing more than an ego measuring device.


None of that matters, until it does :).

I feel like I've seen 2-3 articles in the last month alone that have rediscovered head-of-line blocking and UDP. I'd say 1 in 10 engineers I talk to even know what cache-aware data structures are.

The thing that separates someone who can just string together whatever they find on npm to people who build real systems is this deep understanding. You're not going to need this for your standard LoB apps. However if you're in the business of building software you're definitely going to want people like this.


There's a vast middle ground between stringing together npm and rewriting the TCP stack. Most jobs that exist today fall firmly in that middle ground.

Trying to find the TCP stack engineer to build your app is like hiring a petrochemical engineer to do oil changes. If your pockets are as deep as Google's, yes, you can do that, but it's by no means necessary.


"Now, in 99% of commercial software engineering"

Whatever this thing... this... stringing together 100 random GitHub projects to get a webpage that kind of works about 80% of the time is called -- please let's agree not to call it "software engineering".


You are jumping to conclusions too fast. There are plenty of good software engineers who work with high-level languages and technologies on a daily basis. You just have to know your tools, no matter what you do.


"And then, the world developed robust, well-maintained, open source libraries that do all of that for you. Now, in 99% of commercial software engineering, you don't actually need to know any of those details.."

And that, ladies and gentlemen, is how we ended up with npm.


This is a false dilemma. Npm-level programming and being able to rewrite the whole operating system from scratch are not the only two options.


Yep, I was doing network programming back in the late 90's and you pretty much had to have a good grasp of winsock and linux network calls and responses because the level of abstraction was much lower.

That said, I can't recall any of that mattering in the work I've done for quite some time, and most of it is a Google search away.

That interview was just weird; it was like asking a master engineer who works on electric trains about the boiler pressure of a steam train.


Is TCP/IP Illustrated still the gold standard of books on the subject? I read it fifteen or twenty years ago and would like to finally own a copy (I borrowed it from the library), but not if there is something better.


The amazing part here is that so many people still remember all those little details. I still know how it works, but honestly I forgot all the details a long time ago.


A lot of it depends on what you do day to day. A year ago, I knew what a MAC address was, but in an abstract way. Today I automatically know it's 6 bytes long because I look at them every day.


UNP is excellent! Haven't opened it in a long time though.


I was interviewed once by someone who would cut me off after just a sentence or two. He asked a fairly open question like, "Tell me about hash tables." I knew I should have just said, "Lookups are constant-time," but I started off with a description of their behavior instead, and sure enough he cut me off and said, "Lookups are constant-time." Some people are just not very good interviewers, but also some tech people are too rigid about their rules, with no exceptions, like they can memorize but not think. They say interviews are a two-way communication, and that one certainly was. :-)


> ... like they can memorize but not think.

Which is really bad, when they're trying to hire people for a job where thinking is the fundamental skill...


My father is a civil engineer and late in his career, he interviewed for a senior position at a construction company. He figured out early in the interview that the interviewer didn't know what he was talking about and confronted him with that. The interview came to an end, and he was put in touch with a senior engineer at the company. He didn't take the job for other reasons. I don't think this kind of rubbish would fly in any other industry.


A civil engineer is a professional engineer. I would assume your father is a certified PE [1].

This rubbish flies in our industry since we are not professional engineers.

[1]: https://en.wikipedia.org/wiki/Principles_and_Practice_of_Eng...


Nonsense. It's because your industry is relatively new and hasn't established the same institutions.


A professional engineer is not someone who professes to be an engineer. S/he is a "professional" per certification by "established" "institutions" within an "industry".

> Nonsense.

s/\./:


This is interesting. Is this similar to things like the bar council for lawyers? I don't know if there is something similar to this in India where I'm from.


> I mean who would know just how long a MAC address is?

Uh, my main languages are PHP and Python (high level stuff) and I'm a student (not someone with 10 years of experience) but I knew that. 3 bytes for the vendor block, 3 bytes for the device.

> Or what the actual SYN/ACK etc tcp flags are?

Yeah the actual bytes, who ever uses that? A MAC address I've seen plenty of times in my life as hex, and I've seen the TCP setup flags being exchanged plenty of times in Wireshark and looked up the hex once when I was implementing TCP from scratch, but I still wouldn't expect anyone to know that.

> You just need to know what they're used for

Agreed on that. I wouldn't blame anyone for not knowing the size of a MAC address, I just didn't think that one is that obscure.
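To make the trivia concrete, here's a minimal C sketch (my own illustration, with made-up example bytes and a function name of my choosing, not anything from the actual screen): a MAC address is 6 bytes, split into a 3-byte OUI and a 3-byte device part, and the SYN/ACK handshake is just a few flag bits in the TCP header.

  #include <stdio.h>
  #include <stdint.h>

  /* A MAC address is 48 bits: the first 3 bytes are the vendor's OUI,
   * the last 3 identify the device. (Example bytes below are made up.) */
  static void split_mac(const uint8_t mac[6])
  {
      printf("OUI:    %02x:%02x:%02x\n", mac[0], mac[1], mac[2]);
      printf("device: %02x:%02x:%02x\n", mac[3], mac[4], mac[5]);
  }

  /* TCP flag bits as they sit in the header's flags byte. */
  enum {
      TCP_FIN = 0x01, TCP_SYN = 0x02, TCP_RST = 0x04,
      TCP_PSH = 0x08, TCP_ACK = 0x10, TCP_URG = 0x20
  };

  int main(void)
  {
      const uint8_t mac[6] = { 0x00, 0x1a, 0x2b, 0x3c, 0x4d, 0x5e };
      split_mac(mac);

      /* Three-way handshake expressed as flag combinations: SYN, SYN|ACK, ACK. */
      printf("handshake: 0x%02x, 0x%02x, 0x%02x\n",
             TCP_SYN, TCP_SYN | TCP_ACK, TCP_ACK);
      return 0;
  }

Compiled with any C compiler, it prints the OUI/device split and the 0x02, 0x12, 0x10 flag bytes of the three-way handshake.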


> I'm a student (not someone with 10 years of experience) but I knew that. 3 bytes for the vendor block, 3 bytes for the device.

That is why you know the answer. Come back in 10 years and let us know if you still know it. Things that might be mainstream in a computer science class are rarely used in practice. And if they are, they can easily be looked up.

I used to know the exact effective distance of a CATV cable when I was a student. Useful? Sure. Something I need to remember for the rest of my life? Definitely not.

As another example, in 17+ years writing OSI layer 7 programs, I have never once needed to use a MAC address.


> Come back in 10 years and let us know if you still know it.

I've never taken a CS class, and I still know it. I've just stared at enough packet dumps and debugged enough issues that it happens to stick.

I think it's a stupid question to ask as a screener, as it tests familiarity with trivia. It's the kind of thing I'd be happy to see that somebody knew, but it's never something I'd downcheck them for not knowing. It could just mean, as in your case, that their deep knowledge is somewhere else.


> I've never taken a CS class, and I still know it. I've just stared at enough packet dumps and debugged enough issues that it happens to stick.

We are all a product of our environment. And there is certainly a chance in 10 years the OP will still know how many bytes are in a MAC address. I only mentioned computer science based on the "student" part of the quote.

Personally I've been spoiled by working on higher-level stuff, so if I do look at a packet dump I usually filter the packet headers out. But again, we're a product of our environment, and asking questions that a reasonably skilled engineer might go their entire career without knowing the answer to, except in that interview, is a bit suspect and will likely disqualify people who would otherwise be great for the job.


This reminds me of the time when I liked physics very much. I used to memorize 10 digits of the speed of light and feel very proud.


I don't have the length of a MAC address memorized (because why would I need to?), but I know what one looks like, so I can pretty quickly work it out in my head. A hex digit is four bits...

Maybe 10 years from now I'll have forgotten everything about the hex/binary/octal representations of numbers, but I certainly hope not!
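(Spelled out, in case it helps: a MAC written as aa:bb:cc:dd:ee:ff is 12 hex digits, and 12 x 4 bits = 48 bits = 6 bytes.)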


> What you think might be mainstream in a computer science class are rarely used in application.

Uh actually my classes didn't teach me that. Side projects did.


Kudos to having side projects. That's awesome.

The point still stands though. First, it is much easier to have side projects when you are in school, so the causal relationship might still be there, just one step removed. But the more important point I was making was that whether it's school, a side project, or even a project at work, if people go 10 years without using something, they forget it.

As an aside, you may want to drop the "Uh" in front of your sentences. It conveys a certain tone that I'm not sure you intend.


> Uh, my main languages are PHP and Python (high level stuff) and I'm a student (not someone with 10 years of experience) but I knew that. 3 bytes for the vendor block, 3 bytes for the device.

Thanks for the refresher! I knew that too when I was in college. Good luck remembering that 5 years from now :)


Thanks for the snarky remark.

I did not learn this because some course demanded I learn a textbook by heart. I'm not at that kind of theoretical university. I knew that particular thing because I look at a lot of packet captures.


Still know that 20 years down the line, but then I do networking stuff routinely. If the candidate was interviewing for anything network related (the article is down atm), it's fair game.


Yeah, you're a student; you probably just took a class on this stuff. You'll forget it soon enough if you don't use it.


He's one of the best programmers you can get (in the world) when it comes to networking and friends. Gwan is less well known because it's not OSS, but I believe it still achieves unmatched performance.

This interview must be some kind of a mistake. It's like testing Richard Feynman's knowledge with questions about the multiplication table (where your answer sheet itself has wrong answers).

It's so stupid that it's almost not possible to laugh.


>Gwan is less known around because it's not OSS but I believe still achieves unmatched performance.

ERR_CONNECTION_REFUSED


Gwan's performance is the real deal. It's also the least stable piece of software I have ever used in my entire life.

That last sentence is important.


> I'm amazed he knew things in such detail. I mean who would know just how long a MAC address is?

Not surprised by the questions. Most are [somewhat] common and not particularly challenging.

I am surprised by the stupid counter-answers from the recruiter though. This guy should not be giving phone interviews.


I think the interviewee, after realizing the shortcomings of the tester's understanding and of whoever authored the "right answers", should have interrupted or ended the probing after a few red flags and politely suggested something along the following lines...

"Look, I get the gist of what this screening is trying to do, but please write down my answers and feed them to the technical staff who authored this Q&A, have them verify their misunderstandings, and have them get back to me. I am very confident you have some subtle details mixed up and could benefit from a strong candidate who understands such details with a higher level of fidelity."

Guess it's another loss chalked up to recruitment's focus on avoiding false positives over false negatives.


>...but I'm sure even a relatively junior coder would be able to tell if someone was just doing technical word salad.

In interviews, I've had people give answers I'm not familiar with. The easy thing to do is to ask them to explain how their answer works. You get to vet whether it seems legit, and as a bonus see how good they are at communication in a genuine teaching moment.


"I'm amazed he knew things in such detail. I mean who would know just how long a MAC address is? Or what the actual SYN/ACK etc tcp flags are?"

I was more impressed by how simply he explained many of them in real-world terms, especially countering Big O. That, plus what he's already built, indicates he's exactly the kind of person to direct the sort of projects they're building. That they filtered him out with this garbage Q&A speaks volumes about Google's ineffective hiring. Plus, it indicates what kind of people might have made it into the organization.


Who would know just how long a MAC address is? Jeez man, these things are literally written as 6 bytes separated by colons. Asking a network engineer this question is about like asking a postman how long a zipcode is.


I'm just reaching here, but is there a chance at all that the test wasn't really about whether or not he knew the correct answers but more that he knew the correct answers and was able to simplify them to the extent that a non-technical user could understand and compare them? I have a feeling that Google is far more interested in someone being able to get their point across than someone that just wants to sit there and argue about whether or not an answer is right. Just based on reading his responses, I got a condescending vibe and a vibe that this guy always has to be right and would work terribly with people of different levels of skills. At a Director-level position, that kind of skill is the most basic skill you need to have.


That's not the problem: a non-technical recruiter cannot assess correctness. Even if the simplification can be done (which I disagree with requiring), the answer will be rejected because it's not a literal match. That is the problem.


Yeah I agree. I think one issue is also the test. For a lot of these things, as with most things, there is a short answer and a long answer. I think by the second or third question I would have picked up on the fact that the recruiter didn't really know what he was talking about and was looking for the short textbook answer. Seems like the author refused to or couldn't do that.


That's not the point of the test. The point of the test is to see whether or not the person attempted to get on the level of the person they were talking to. I have a feeling that the interview would have kept going had the author not started to argue. They're looking for someone that can translate, not someone that will talk down and argue just so that they can be "right".


I am a bit confused.

I did not see any argument here in what the article describes. The recruiter clearly had little clue about what is right and wrong. And the way the recruiter assessed the answers by throwing out right/wrong seems ruder to me than the author "wanting to be right".

Please do not speculate based on something that is not present in the article.

I have done similar interviews before, and the recruiters I worked with did not show the same level of incompetence as this one. When I wanted to be more specific on details, they would suggest that it was enough and move on, not like this recruiter, who just threw out a "wrong".


I'm not speculating. This all took place during a phone call, so the post is completely the interpretation of the author with regard to how the recruiter answered the questions. For all we know, the author just paraphrased everything as "That's wrong" to make the recruiter look like a simpleton so that they themselves wouldn't look silly for not passing. We have no other information except for one side that happens to be the author's side. Others have commented that they took this same test and were told afterwards that the person doing the interview was a psychologist who wasn't testing technical skill. That's where my speculation is based.


> this guy always has to be right and would work terribly with people of different levels of skills

Are you talking about the recruiter?


No, I'm talking about the person that wrote the post. Since this was a phone interview, this is a paraphrase of what happened, written by the post author. The whole thing smacks of "I knew way more than that person, they were clearly an idiot".


It strikes me as more along the lines of "I knew way more than that person, and they were completely oblivious to the fact that they didn't know this stuff, AND they were in a position that required them to know stuff."

I'm fine working with non-technical people (or who don't know any given field), but I wouldn't be fine working with those people if they were insistent that they did know about these things about which they actually had no clue, or if they were in a position where they really need to know this stuff.

I have coworkers who don't know how to use the command line, but they aren't engineers and they don't try to tell me what commands to run when I pull up a console, so it's fine. If they keep insisting that I should use "dir" and that "ls" is wrong, that would be a problem. If they were the CTO, that would be a problem.

When we hire engineers for customer support, the non-technical operations guy interviews them, but he always has at least one engineer do a portion of the interview because he knows he's not fully qualified to judge someone's technical chops.

The OP isn't being condescending just because someone didn't know stuff. It's because someone didn't know stuff but acted like they did. You want to be insufferable? Insist you know better than the experts in a given field.


I can't fathom a scenario where a tech person would need to dumb down something like SYN-ACKs and inodes for a non-technical person. It's one thing if you are trying to explain performance trade-offs to non-technical users or colleagues. But not lower-level protocol details.


If this was the point of the test, he should have been told this was the point of the test. Gotchas make poor interview questions.


Don't think so, or would you consider O(n) notation something a non-technical user would want to follow?


I don't think that's the point, though. The answers and questions are meaningless to the test. It's how the person addresses the questions and answers that matters in the test. Someone else posted that the person interviewing the author is typically a psychologist in this test. That, to me, means that the technical correctness of the answers is not relevant and that an actual technical engineering screening comes after it's determined that the person is a culture/personality match.


Then the recruiter's final "you should learn about ..." reply would be a bald-faced lie.


[flagged]


I flagged your comment because you basically created an account to shit on somebody.

Or to put it in terms you'll understand, clicking around on your comment history a bit, we get a pretty solid impression of what kind of guy you are.


> I wonder what would happen if ordinary people recruited for medical doctor jobs? Would you be comfortable rejecting a guy who'd been in medical school for 10 years based on his not knowing what the "funny bone" is? Wouldn't you tell your boss that you felt a bit out of that league? It's amazing you can get someone to do this without them going red in the face.

The medical recruiter asks to see the diploma. The doctor shows his degree and gets the job.


"Diploma says DO, not MD. You don't get the job."

https://en.wikipedia.org/wiki/Comparison_of_MD_and_DO_in_the...


Given that osteopathy is pseudoscientific dogma, that's not an unreasonable outcome.


Eh, that's really oversimplifying it. Yes, DOs are taught some fringe-ish things. But the rest of their education is very comparable to that of an MD. I expect that in the long run many of these quacky aspects of the osteopathy curriculum will vanish.


"who would know just how long a MAC address is?"

I've been looking at them on NICs, adding them to DHCP databases and using them for simple authorizations in iptables (and prior equivalents) since, oh, 1992 or thereabouts. They weren't new then.

It would take me a moment to count them in my head, that's all. "aa, bb, cc, dd, ee, ff -- that's six bytes".


It's actually pretty sad. It seems to weed out the smartest folks so that you only hire those mediocre enough to not understand these problems within the hiring test.


Actually, most people with a networking background would know this kind of detail (and more). It's the relevance (and appraisal of answers) that is shocking.


Anyone who's done any networking will know how long a MAC address is.


It tends to be incredibly helpful to know details like the length of a MAC address and what the TCP flags mean if you're looking at an unformatted memory dump.

One example of how this can be handy is when doing low-level NIC bring up, I've read PCIe TLPs from a logic analyzer to correlate them to network traffic.


I bet you spent more than 30 minutes (typical phone screen length) looking at TLPs. Spending 2 minutes to google DLLP/TLP packet structure is not unreasonable.


s/know/have access to/g


"I mean who would know just how long a MAC address is? Or what the actual SYN/ACK etc tcp flags are?"

This isn't even my domain, but I remember some of this stuff. I had to dig in to this area years ago, and was knee-deep in that level for several months debugging and configuring stuff. Some of it stays with you, even if you're not there any more.

For someone applying for a director of engineering, I'm kind of split as to whether this should be required "off the cuff" knowledge. Would certainly help, but seems it would depend on the culture of the company - how hands-on they expect director-level folks to be - some companies seem to want that, some don't.


> I'm amazed he knew things in such detail. I mean who would know just how long a MAC address is? Or what the actual SYN/ACK etc tcp flags are?

It's what happens when you get old. There were commonly C compilers on 8, 16, 20, 24 and 32 bit architectures, some big-endian, some not, and you couldn't google things in 20 seconds back in those bad old days when old farts were building the networks that would be used to connect us all to Google.

(And he has apparently been in networking since)


> I mean who would know just how long a MAC address is? Or what the actual SYN/ACK etc tcp flags are? You just need to know what they're used for

Depends on what you actually do in your job, I suppose. If you use that information regularly or troubleshoot networks regularly you would soon enough memorize it.

> and if you need the specifics, you'll find out with a single search.

Someone doing network programming _does_ need the specifics.


> I mean who would know just how long a MAC address is?

Hmm, let me think. Somebody working with routers for 10+ years.

> ..what the actual SYN/ACK etc tcp flags are?

Same.

I mean, these things are just as familiar to me as grade-school multiplication by now. I bet there are things familiar to you which I couldn't even be bothered to memorise and instead had to turn to a search engine for.


Having been responsible for writing network code on embedded systems, you do end up memorizing lots of the structures and constants from the specification because you're constantly looking at them "on the wire" to make sure it's right (or to debug it if it's wrong).


If you've had to troubleshoot network corruption at scale, you memorize packet details real fast :)


Understanding SYN/ACK, and what they are used for, can be quite useful in troubleshooting networking issues, so I can see why this would be useful in an SRE or other role that deals heavily with communication.


I would have failed the question. I don't remember how many bytes in a MAC address. That's something that can be easily looked up.


It happens because great developers do not do recruiters' job :)


Same things I thought. His answers are amazing!


I'm amazed people actually believe the account is accurate. The guy failed the interview, then went and wrote a blog post to lash out at Google (and to try to mess up future interviews for them). So of course he wrote it to make himself look good and the questioner look bad.

We will stop here because it's obvious that you don't have the necessary skills to write or review network applications. You should learn the Linux function calls, how the TCP/IP stack works, and what big-O means to eventually qualify if you are interviewed at a later time.

Come on Hacker News, I know you hate recruiters, but do you really think that happened?


It's true this is probably not exactly what happened, but Google does certainly ask these questions in phone screens and the interviewers read the correct response off of a cheatsheet.

The fact that several Google people posting here literally can't believe this is true shows how fucked up their process is.


The fact that several Google people posting here literally can't believe this is true also may mean it isn't true. Not that the questions weren't asked by a recruiter with a cheat sheet, but the quality of response and the attitude of the recruiter has been grossly exaggerated.


Without actually hearing the transcript verbatim, it's hard to give much enlightened perspective here, but there are a lot of "hur hur, dumb recruiter" responses here. What I will say, in general, is that figuring out what the "right" answers are for what is obviously a technical phone screen by a non-technical person with answers on a piece of paper is also part of the challenge. This is a Director of Engineering interview. Understanding context & navigating "real people", having soft skills etc. is meant to be part of the job description. Feels like this gentleman couldn't turn off the hardcore engineer who's technically right about everything yet never seems to get anyone to listen to him.

Giving the hexadecimal representations of the 3-way handshake... really? You may have gotten a dumb recruiter and you may think you're smart, but from my perspective, you answered the questions in a pretty dumb way given the context of non-technical recruiter, very obviously reading answers from a sheet of paper.

I've done two of these before and I've often said "Oh well, it might be down on your sheet at this thing" and the recruiter goes "Ah, yeh, that's it. Tick" and moved through 3-4 questions that in theory I might have gotten wrong. If you take the "be a dick" routine... Congrats. You won the moral war. Best of luck with your next job.


"Dumb recruiter"? What?

Nobody's calling the recruiter dumb. Everybody is calling the process dumb. A process that puts somebody that cannot answer these questions, in charge of asking them and evaluating the answers.

Having the candidate evaluate the competence of their recruiter is not part of the interview process. What the hell.


I believe in the phone screen Google uses non-technical people to ask technical questions (as engineers are a scarce resource), so they're only able to handle "right or wrong", but you can probably work your way around that by being nice - this guy seemed to be being an ass...


I've had two phone screens with them and both times they were very technical people. Then again it was some time ago and with the bigger scale they may have changed it up.

It is unfortunate, but as mentioned above, you need to just play the game until you get to the real part. It's like when I call customer support, I gotta play along with the non-technical people and get them to bump me up the chain to someone technical when I need advanced help.

The unfortunate truth is that it's unreasonable to dedicate precious engineer time to screening millions and millions of people; they'd get no actual work done. So the first layer has to be like this. You just play along for the first step, and after that it'll get much, much more interesting, trust me.

This guy seemed like the kind of person who loves showing off his knowledge and having the last word on everything. Honestly, people like this, as knowledgeable as they are, usually do poorly in a work environment.


>I've had two phone screens with them and both times they were very technical people. Then again it was some time ago and with the bigger scale they may have changed it up.

Just to clarify, at least for the SRE hiring process, you first have a single technical phone screening with a technical recruiter (not an engineer), which is literally on the phone. At least it was for me; no webcam or anything. It's a pretty short, back-to-back question/answer type of conversation, similar to what is described in the article (although the article strikes me as odd and does not match my experience). After that you have a couple (or more if needed) of "phone" (read: Hangouts with webcam and a shared doc) interviews with actual engineers, and those are more technical and require you to write code as well. Then you'll be moved to on-site interviews.

(This is for Europe at least, I imagine it'd be similar in other areas but can't know 100%).


You're exactly right. I was asked some of these exact questions yesterday. The guy should have realised what he was dealing with; the recruiters don't claim to be technical, and the questions are flagged as being straightforward pre-screen questions.


I'm glad that someone has some sense around here. I'm getting buried for saying the same thing. Everyone is making the assumption that the author of the post transcribed this interview instead of paraphrasing it. This was a culture interview, not a technical interview, and the fact that the author misinterpreted it only strengthens the interviewer's decision to not consider them further. They are very obviously not a good fit for a Director-level position at Google.


Speaking of making assumptions, you're making a lot of them.

You say "Everyone is making the assumption that the author of the post transcribed this interview instead of paraphrasing it". As a member of "everyone", I disagree.

I do suspect it's not as black and white as the article makes it out to be but the general attitude is not uncommon in tech companies. It's in fact so common it has become a bit of a meme. So I'm personally taking the article with that in mind.

> This was a culture interview, not a technical interview

Oh spare me. If a "Director of Engineering at Google", above in the thread, calls the interview "super strange" and "making [the recruiter] look like a blithering idiot", you can't start making random excuses up for Google. "It's about the culture!"


Where do you see a "Director of Engineering at Google" above claiming the interview is "super strange"? There are other (supposed) Google employees in this thread that are the source for every single one of my assertions.

Also, as a member of "everyone", how can you disagree with that statement when no one has even bothered to call out the fact that we only have one side of the story and it's the side of the story that wants our sympathies?

EDIT: I found the post you were referring to (it wasn't at the top when I first posted my responses)... The "Director of Engineering" was even saying that he doesn't buy the transcript because it's only one side of the story. That pretty much seals my point.


Your point that it's a "culture interview"? Or your point that it's probably not a verbatim transcript, which nobody argued? Or is it your point from way back that these are skills needed for a Director of Engineering, which apparently this wasn't an interview for?


My point that the author of the post misinterpreted the interview (which was apparently not at all for a Director-level position), my point that we only have one side of the story, and my point that the paraphrasing was done intentionally in such a way as to make the interviewer look bad instead of the interviewee. Or, if you want to simplify it, my point that this person is clearly not Director material, as they would like everyone to think.

Also, you're moving the goalposts here. At the time this was posted, the author of the post claimed it was an interview for Director, people were claiming that the interviewer acted exactly as written, and I, along with others, were claiming that either the author left out information, misunderstood it, or edited it.


Then do a multiple choice quiz on a website.


Using non technical people to ask technical questions is also being an ass.


I don't understand why a non-technical recruiter would be asking technical questions of a technical candidate, especially to a high-level one. Maybe for college hires, where you need to weed out an overwhelming field of candidates. Maybe Google just gets that many more applicants but jeez, I feel a web form and a minimal machine learning classification could do a better screening job.


Maybe it's more a case of a technical person asking the questions but not really caring about justifying why they should hire an applicant if the applicant doesn't say exactly what is written down.


The non-technical recruiter is asking technical questions with technical answers, but without the technical expertise to consider correct answers that are not literally the same words as the ones they have been given. That is not a good way to interview.


I more or less agree, although the real wrong party here is Google, for putting a non-technical recruiter asking a quiz as a step. This story does sound bizarre though, very unlike Google.


Why is that wrong? As a Director, you'd have to deal with people at all different levels of understanding. You may even have to deal with companies, clients, and other departments that have zero skill in your area of expertise. This seems like the perfect exercise to test someone's ability to navigate those kinds of required skills.


That's wrong. The recruiter's goal is to get the "correct" answers: if someone passes the interview without providing those answers, via some "soft skills" (it looks more like social engineering), then the recruiter has failed. Because of that, there's no point in trying to explain your answers in the hope that the recruiter will somehow agree that they match the answers on the checklist.

A more adequate approach would be to find a way to bypass this interview, by finding the right contacts who have the expertise and can make the hiring decisions.

This situation is like trying to sell a new fridge or delivery van to a waiter in a restaurant who was instructed to talk to business visitors while management is away. He was indeed put in charge, but he can do nothing for you or, worse, will communicate your offer to his boss in the wrong way, so you have to escalate: find the management contacts, reach them, etc.

I think this story makes sense as an illustration of how not to hire people.


What's wrong? I think this is exactly why this test exists. They don't care if you get the "correct" answers at this stage in the process. This is a glorified personality test that, in my opinion, the author misinterpreted as a technical exam. Directors at Google are not going to be the people that know the answer to everything and talk down to people. They're the people that have technical skills while, more importantly, having the personality and people skills to actually direct people and communicate with people of varying skill levels.

You and a large number of the very technical people in this thread are the exact types of people that Google would, more than likely, try to avoid for a position like this.


Well, it's just your wishful thinking that it's that kind of interview, not reality, and personal attacks on me won't help you prove your point. I have software engineering management experience in multinational companies and I have hired other managers: there are much more effective ways to find a person with good soft skills than this kind of remote screening with a purely technical checklist. Done this way, it's simply too costly: first, you need a really smart recruiter with good soft skills himself, so that he can expose the candidate's weaknesses and strengths. Then, there would have to be a very well designed checklist that allows you to derive the candidate's mentality from answers to purely technical questions. That's almost impossible, I'd say.


It's not my wishful thinking. Others in this thread have confirmed that they took a similar test when interviewing for Google, and some of them actually got the job. One user even mentioned that the person doing the interview was a psychologist. I'm not attacking you. I'm simply saying that you're just like the author of the post. You assume, because the author says so, that this was a technical assessment, when Google employees in this thread seem to be confirming that it is not. Your management experience is irrelevant to a basic failure to recognize this for what it is. This was a phone call. It's not like the interviewer was making these deductions about the interviewee simply by reading their answers on paper.


I don't disagree, although I'm sceptical that's intentional on Google's part. I think you'd have been fine if you applied some common sense to the situation and were able to get off your moral high horse about what's correct vs. answering the questions he needs you to answer to get to the actual proper interviews.

Dale Carnegie wouldn't have approved, I'm sure.


What makes you skeptical, though? If anything, there are two things that make me almost certain of it:

1. The author of the post says that this was a phone call. That means that this, more than likely, is not a transcript of the call, but a paraphrase. The entire tone of the post lends itself to the author thinking that they're "correct" and that the interviewer was just a rude, monosyllabic simpleton.

2. The interview ended immediately after the author started to argue. Instead of trying to relate to the person and simplify their answers after the first few super-technical answers weren't accepted, they trudged on with the attitude of "this person has no idea what they're talking about and this is stupid" rather than "I'm clearly overshooting the mark here, maybe I should try and simplify the answers".


"This seems like the perfect exercise to test someone's ability to navigate those kinds of required skills."

I disagree. The Q&A process isn't indicative of almost any skill on the job except patience when your time is being wasted in a formal process. He'd have to have memorized every trivial, algorithmic fact plus its textbook (not real-world!) answer, with no further knowledge or answers. Such a candidate is not valuable in any function at Google unless they're trying out for an IT version of Rainman. Not even for HR, since they read from a sheet instead of memorizing it themselves. ;)


You seem to have misunderstood. The "required skills" in question are communications skills, not technical skills. This wasn't a technical interview with an engineering team member as they seemed to think. This was a personality/psychological examination to make sure that their personality and communications skills match up with the culture and personality at Google. Directors typically don't do the low-level, high-skill technical work at companies like Google. They need to understand it, but, first and foremost, they need to be able to communicate with people of varying technical skill levels. This Q&A process, as you called it, is completely indicative of a person's communications skills.


Let's apply your interpretation. In this case, they are evaluating the social skills of someone who will direct projects carried out by bright engineers. They will also interface with management about those projects. There are a lot of skills involved, with associated interviewing strategies, certifications, etc. that might be employed.

Instead, the interviewer asks algorithmic questions, gets great answers, explains they're not on his sheet, and rejects the person. This is the total opposite of the kind of social problems an engineering lead or project manager deals with. Plus, the requirement to keep guessing until your answer matches a sheet doesn't reflect how goals or requirements are handled.

If this was assessing social or management skills, then it's the worst method I've seen to assess it. It still is a horrible result.


But most of these answers weren't even pedantically "technically right", let alone dickish; they were pretty simple and straightforward answers. For instance, the recruiter asks what the KILL signal is, and then says the right answer is SIGTERM. If the answer itself is wrong, then what can you do?

Not sure where you're getting this "be a dick" routine thing from, but when I read through the transcript, it was clear that the recruiter was looking for an excuse to reject the author, and nothing else.
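To be concrete, here's a minimal C sketch of the fact in question (my own illustration, nothing from the article's answer sheet): SIGTERM and SIGKILL are distinct signals with distinct numbers and semantics, which is exactly why "the KILL signal is SIGTERM" makes no sense as an answer key.

  #include <signal.h>
  #include <stdio.h>

  int main(void)
  {
      /* SIGTERM is the polite, catchable request to exit (the default
       * sent by kill(1)); SIGKILL cannot be caught, blocked, or ignored. */
      printf("SIGTERM = %d\n", SIGTERM);   /* 15 on Linux/x86 */
      printf("SIGKILL = %d\n", SIGKILL);   /*  9 on Linux/x86 */
      return 0;
  }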


The right answer? It is hard to believe that there is only one right answer to these questions. There are different levels of answers to any given question. Think about how you might explain the second law of motion: F = ma in scalar form vs. F = ma with vectors. The same thing applies here; the three-way handshake can be explained at many levels. Anyway, I understand that if you want to get hired you have to compromise, but I also understand why this upsets many of us.


Best of luck to Google, more like it.


How were the answers given to the recruiter being a dick? There was no room to be a dick, given the recruiter was looking for verbatim scripted answers.


I'd be willing to bet that you were an engineer that eventually moved into management.

Most technical people, surprisingly, don't have the tact to discern these two scenarios.


I agree with you on this, the job description said it was just as much management as engineering. Management means being able to tolerate irritating situations without becoming an ass.


I've been at Google for five years as a SWE and I've been interviewing for 3 of those. I'd fail this pop quiz.

This strikes me as bizarre and inconsistent with all the practices I'm aware of. The idea that we'd ask anyone this stuff, let alone director candidates, strains belief.


I was asked similar questions when I was hired as an SRE nine years ago. The recruiter stopped after a few correct answers, iirc. There was probably at least one I didn't know / guessed wrong on, but I don't remember. They also asked me to rank my knowledge in several areas on a scale of 1-10, and I think the questions focused on areas in which I'd chosen higher grades. Following conversations with engineers also focused on those areas.

As I understand it, this is meant to be a shibboleth a non-technical recruiter can use to spot an experienced software engineer / sysadmin in a quick conversation ("pre-screen"). That's a hard thing to pull off. They can't ask someone to design a system, diagnose a problem, or write code because they're unqualified to grade the answer. Instead, they ask some simple canned questions. The questions may not test essential, first principles sorts of knowledge, but if someone can't answer any of them it's a bad sign. The questions should have a small family of correct answers that recruiters can recognize, and the recruiter should just see that a candidate can get some of them right before scheduling a phone screen with a Google engineer. If the transcript is accurate, this process failed.


I had exactly the same kind of quiz at the beginning of the year on a Google phone screen interview. Some questions were exactly the same. I passed, apparently, but declined the in-person interview. The recruiter over the phone was way more technical, though, and accepted answers not strictly matching his response sheet (and was able to discuss the technical details).


The only way this makes "sense" is if you already have the candidate (or pool of H1-B candidates) you want in mind, but have to prove you opened up the position to the general public first.


My company posts available positions in a common area as part of an H-1B-related initiative, in order to verify that no US techie wants the job before they go looking for the candidate. They were offering around 0.6 of the typical salary given the position and location. Shameful what is happening in tech right now, absolutely shameful.


I interned at a large company that stapled stacks of job descriptions labeled "H1-B OPPORTUNITY" on bulletin boards outside the elevators. They listed salaries that were super underpaid, and everyone I worked with was clearly an H1-B employee. It's disgusting how shameless some of these companies are.


I know some companies do this, but this is Google. There's no incentive for them to hire H1-Bs if an equally qualified American citizen is available, since they are going to pay an equal salary.


They don't have to offer the same salary, just a salary in the same range. That range can be pretty wide ($20k+).

Employees on an H1-B visa have drastically less job mobility than US Citizens. This creates a power advantage for the employer.

>but this is Google

Google has, in the past, illegally conspired to prevent other companies from recruiting their employees. This lowers wages and reduces employee mobility. Clearly there's incentive because they have literally broken the law in the past to achieve these results.


> Employees on an H1-B visa have drastically less job mobility than US Citizens. This creates a power advantage for the employer.

Yet Google pays the lawyers needed to get you a Green Card as fast as possible.


It's not that simple. There are quotas by country. For someone with, let's say, a Bachelor's or even a Master's degree from certain countries, money alone won't get them a GC soon. The wait time is several years AFAIK.


Yet a Green Card does not give an employee anywhere near the same level of job mobility as a US Citizen.


A green card allows you to live and work in the US without employer sponsorship. There is complete mobility, on par with a US citizen.


It's not as bad as previous poster states, but it's not quite as simple as you make it seem either. A green card holder forfeits their residency if they leave the US for "more than 6 months", or if border patrol people feel like they've abandoned their residency for any reason. This doesn't affect most employees, but if you're a consultant working on-site in another country for extended periods, or simply travel often, you have to do way more work to get everything cleared. And even then, there's no guarantee you won't run into problems.


I have some friends on H-1Bs who work there and at other top tech companies; trust me, there's no discrimination in salary. With the extra legal fees, I think it's a burden for them to have people on temp visas.


Google typically recruits people from Europe in European offices, and the same goes for offices in Asia, etc. The H-1B process is so painful that they do everything to avoid it, and only resort to it if they cannot find the candidates they are looking for in the US (which happens often on the scale of a company as big as Google).

Some people responded saying that they might still do this to get underpaid employees from abroad... which is just silly. The salaries are exactly the same whatever your country of origin, and companies like Google are not the ones you should attack if you want to make a point about H-1B abuse.


This strikes me as an interview with a recruiter. This is almost definitely not a legitimate Google interview.

Maybe it was for a vendor/contractor position.


Google recruiters ask these exact questions in phone screens.


To add to the chorus: I had an extremely similar interview (including a few verbatim questions) during my phone screen for an SRE role at Google five years ago.


I only have a sample size of one, but I blew through a phone interview with Google about a year ago -- and the guy giving it was very technically skilled. We talked about the subtler aspects of the questions as we went.

They offered an in-person interview, of course, but I declined when I found out that they were only hiring for Google Payments in Boulder. I'm sure the job has interesting aspects, and maybe my imagination just isn't up to the task, but I have a hard time figuring out how I wouldn't go insane with boredom working on a system that just moves money around, not to mention frustration from working with the extra regulatory/process restrictions that must be in place to keep compliance up...


Wait. You didn't know what position and location you were being interviewed for? That kind of arrogance from a company can only exist if the job seekers are desperate or perceive any job at Google as nirvana.


I think you're misunderstanding how the process works. It's not like they wait until they're ready to hire you and then tell you where you're going to work -- they ask you where you want to work. After I had passed all of the interviews, they gave me time to consider the different locations and pick which one I wanted to work at (I could only pick a location that had available positions, of course). After I had accepted the job offer, about a month before I started, they contacted me again with a survey that asked me about my interests and skills, which helped them place me on a team. Since I'm working at a smaller office, there were only two team choices, but the larger offices (i.e. Mountain View and Seattle) will give you several choices. On top of that, you're free to change teams after as little as a year. So I would hardly call any of that 'arrogance'.


I know someone working in that area--apparently lots of string parsing. It's vitally important (moving money usually is) but not very exciting/cutting edge (moving money usually isn't--after all, it's an old technology).


Important? Absolutely.

But I do games, apps, machine learning, IoT, hardware drivers, and deep dives into broken code that no one knows how to fix. Fancy accounting software doesn't strike me as interesting, even if it involves string parsing. :)


Likewise, identical situation, identical thoughts/response.

I want to follow up on this and see what the deal is. It just strikes me as fundamentally wrong.

I'm honestly hoping this is not actually a Google recruiter doing this; if it is, that's just broken.


These are pretty much SWE/SRE questions, not director questions.


This seems like a pre-screen where the recruiter is trying to assess the viability of a candidate themselves. But I'm assuming that based on experience with recruiters elsewhere; I have no idea if Google has a higher bar for their recruiters.


I've had similar phone screens previously, both at Google and Facebook. The depth/specifics of some of the questions seem maybe a little greater, but it's not too far off what I remember from my screens, to be honest.


As another data point, when I interviewed at Google a few years ago I had a phone interview with some of these exact questions.

Also, judging by the smugness of the people I interviewed, the "We'll stop here because it's obvious you don't have the skills..." part doesn't surprise me either.


Wow, I had almost the very same set of questions for a junior SRE role! The difference is: I graduated ~1 year ago and therefore have a lot less experience than this guy.

It was kind of funny: the recruiter called me to "go through some of the questions to see what I will encounter in the actual interview". I remember, when asked which Unix function accepts connections on a socket, I could only answer "dunno", because I had never used or done that. Even funnier if you consider that they came across my profile and called me in the first place, and there is no place in my CV where I claim such "low-level" knowledge. I also remember that inode question, "What is stored in an inode?" Again, "idk", and the answer the recruiter gave was "metadata". Yes, of course, metadata, goddammit, what else? (Can an answer really be that simple???)

After the interview I felt quite devastated because I did not expect to have to come up with a solution for processing an array of size 10,000 on a call with the recruiter. I wasn't (and I'm still not) sure if this was only preparation or an actual interview. In the latter case, I was sure that I had failed. Surprisingly, however, I was invited to an actual engineering interview some time later.
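For what it's worth, "metadata" is technically right but hides everything interesting. Here's a minimal C sketch of what an inode actually holds, as exposed by stat(2) (my own example; the default path is just a placeholder, nothing from the screen itself):

  #include <stdio.h>
  #include <stdint.h>
  #include <sys/stat.h>

  int main(int argc, char **argv)
  {
      /* Path is just an example; pass any file as the first argument. */
      const char *path = argc > 1 ? argv[1] : "/etc/hosts";
      struct stat st;

      if (stat(path, &st) != 0) {
          perror("stat");
          return 1;
      }
      /* Everything below lives in the inode: type/permissions, owner,
       * size, link count, timestamps. The file NAME does not -- it lives
       * in the directory entry that points at the inode. */
      printf("inode:   %ju\n", (uintmax_t)st.st_ino);
      printf("mode:    %o\n", (unsigned)st.st_mode);
      printf("uid:gid: %u:%u\n", (unsigned)st.st_uid, (unsigned)st.st_gid);
      printf("size:    %jd bytes, %ju link(s)\n",
             (intmax_t)st.st_size, (uintmax_t)st.st_nlink);
      return 0;
  }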


Maybe they wanted to test your personality and whether you generally have a clue. Admitting to not knowing something is usually not a deal breaker as long as you get the important ones right and show the right attitude.


The meta-question is, "Do you have the social savvy to give the conventional answers when being judged by someone who doesn't know what he is talking about?"


Yah I get the feeling you're supposed to "act stupid" so you can get to the next interview.

Definitely not an ideal way of getting candidates - you are selecting for people who know how to manipulate screens (and thus increasing the risk of getting a bad candidate), or rejecting otherwise knowledgeable people who just don't have the time/inclination/"social savvy" to pretend to be stupid.


Why not? Especially in a managerial position, you will occasionally need to build a rapport with people who are convinced they have good technical knowledge, but don't. Being able to talk to them productively instead of picking fights is a skill worth selecting for.

If you can't figure out that the first person who's interviewing you has answers on a sheet of paper and you're supposed to parrot them until you get to the second person, how are you ever going to figure out that the first person you're selling to has some business requirements on a sheet of paper and you'll never get to the second person until you parrot those?

"Oh, we're not actually using Docker, we're using rkt, which is a compatible reimplementation of --" "I'm sorry, I've been told Docker is a requirement. We can't use your Cuber Netty thing until you support it. Bye!"


It's totally dependent on the type of person being interviewed. There is a difference between "knowing how to work in a team", "knowing when to be civil, when to push, when to go to battle", and wanting to parrot answers just to get to the next step.

If it were me, I'd try to engage with the recruiter and make them go completely off book. I'd ask them about their career and try to find a different job for them (instead of reading stuff off a sheet of paper), or, if that's what they are content to do, try to escalate and get them to reveal their "client" or "person they report to" (which in recruiting is a no-no). I think I have the confidence and social skills to try to do that (I've talked past border officials and various recruiters, and having been in a tele-job where I had to follow a script, I know exactly where a script reader is most likely to go off book if I ask something at exactly the right time). I am in no way qualified for a director of engineering position, but I could very easily get past this telephone screening, precisely because I've been the person asking this kind of question and using this kind of script. If I got past this screening, I'd be wasting the time of the person next in line for the interview.

Of course, it is a valid strategy to get an interviewee to follow along, but it's misguided - using the entirely wrong tool (scripted questions/answers) for doing the job (finding someone with managerial and people talent).


Have an up-vote. This is the most plausible explanation. They're testing his social skills and how he deals with people who are less technical than he is, not testing his technical knowledge.

EDIT: To add to this, I've seen this tactic before in an interview. The interviewer asked me a pretty softball technical question, I nailed it, and then he said, "No, you're wrong, it's [OBVIOUSLY INCORRECT ANSWER]." He was clearly trying to gauge how well I handle someone who thinks they know what they're talking about but actually doesn't -- which describes a surprisingly large number of people in the office.


The most plausible explanation is that the guy failed the interview, felt bad about it, then wrote a version of it to make himself look good and Google look bad, to punish them.


I find it more plausible that Google had hundreds or thousands of candidates and wasn't willing to interview all of them directly, because most are usually crap, so they let some incompetent contractor do a pre-screening.


I really wish our industry could create some generally trusted benchmarks of skill that we could take once and then be done with. As things are, we have to prove basic programming skill with every employer. You'd think this wouldn't be necessary with more than a decade of experience and several degrees in computer science from eminent institutions, but apparently it is. I'm fine with employers asking very particular questions about the domain of work, but we shouldn't have to prove sanity all over again every time.


Really doubt that based on those questions. Recruiters are human and fallible. They form opinions about people, sometimes incorrect, and act on those opinions all the time. This phone screen is the final step before they lose control of the candidate process to the technical team.

Recruiters do reject candidates and create false negative situations when it comes to positions that have a lot of candidates and very few openings. For Google, that would be every position, especially engineering.


That would be totally dishonest.


> Being able to talk to them productively instead of picking fights is a skill worth selecting for.

Perhaps, but if the story is true, then it's wishful thinking to assume Google tried to do just that by putting a moron or someone acting like one in the recruiter chair. That way you risk hiring a quick talker who can talk, joke or laugh his/her way out of a wrong answer. If technical skills don't really matter, it's fine though.


This isn't the only interview. But if you can't pass this one, technical skills won't excuse that, yes.


Doesn't make sense given the end of the interview, unless the recruiter was being completely disingenuous. The recruiter recommended that the interviewee get some education in the topics he asked about.


While this recruiter probably failed at their task, I do agree that there is skill in communicating technical information to non-technical people, and it is a valuable skill. I wouldn't be surprised if the author of this blog post acted a bit curt or acrimonious to the recruiter, and stubbornly insisted on being "right" rather than communicative and agreeable.

Know your audience.


You mean the wrong answers. Great, so in a technical interview for Director of Engineering (not legal, not HR), Google expects you to know when to lie and give the wrong technical answers and when to give the right ones, because this is essential to the Director of Engineering position? What, are they hiring someone to be complicit in their next wage-suppression scheme?


At that age, it becomes way more satisfying to educate somebody, rather than try to force your way in.


Hahaha, that is funny/tragic. Outsourcing the hiring of a director of engineering to a moron with a cheat sheet! Your answers were great, though, and indeed on the systems I have tried, the popcount() computation with the bit shifts you suggested is faster than using an 8-bit lookup table (though the SSE4 POPCNT instruction (https://en.wikipedia.org/wiki/SSE4#POPCNT_and_LZCNT) is faster still).
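
For anyone curious, here is a rough C sketch of the three approaches being compared (illustrative only, not the exact code benchmarked above; the hardware path assumes the GCC/Clang builtin):

    #include <stdint.h>
    #include <stdio.h>

    /* 1. Bit-twiddling (SWAR) popcount for one 16-bit value. */
    static unsigned popcount16_shifts(uint16_t v) {
        v = v - ((v >> 1) & 0x5555);             /* 2-bit field counts */
        v = (v & 0x3333) + ((v >> 2) & 0x3333);  /* 4-bit field counts */
        v = (v + (v >> 4)) & 0x0F0F;             /* per-byte counts    */
        return (v + (v >> 8)) & 0x001F;          /* total (0..16)      */
    }

    /* 2. 8-bit lookup table, indexed twice per 16-bit value. */
    static uint8_t lut[256];
    static void init_lut(void) {
        for (int i = 1; i < 256; i++)
            lut[i] = (uint8_t)(lut[i >> 1] + (i & 1));
    }
    static unsigned popcount16_lut(uint16_t v) {
        return lut[v & 0xFF] + lut[v >> 8];
    }

    int main(void) {
        init_lut();
        uint16_t x = 0xBEEF;  /* 13 set bits */
        /* 3. Hardware POPCNT via the compiler builtin (GCC/Clang). */
        printf("%u %u %u\n", popcount16_shifts(x), popcount16_lut(x),
               (unsigned)__builtin_popcount(x));
        return 0;
    }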


While I think these things are super interesting trivia, I really want to ask the question: "Why does a Director of Engineering for Google need to know this?" That is not only a rather unusual question (within the scope of daily tasks for a programmer, these tend to be an important minority for a project), but it's one that's best settled empirically and within the context of your execution environment.

There are a lot of different answers to the question depending on where you're doing it and how you want to impact the machine.

It's a bizarre question to pose to Director level, because the proper response even for a technical director should be, "The answer to that changes quickly, we should measure and check what our environment's latest capabilities are and if they are reliable."

I mean I'm all gung ho to program stuff, but I think that'd be a massive misapplication of my time with that title in my current job, and given my other responsibilities I'd only do it more slowly than someone with dedicated focus.


I had almost exactly this same engineering test when google interviewed me in 2006. It was terrible, and left a bad taste in my mouth. Given the complexity of the work I was doing at the time, the entire thing seemed ridiculous.


The inode question gave me flashbacks to my interview with Amazon. They wanted me to explain what a hash function is. I kept giving answers for about 3 minutes explaining hashing, common algorithms, reasons to use it and places it applies.

Recruiter: "I was looking for you to say it's a fingerprint"

So I guess I was wrong, because despite explaining them in decent detail, I didn't use the one keyword.


On my phone screen a Google recruiter asked me "how much is 2^24", and I knew the answer by heart and answered immediately.

So he asked "how did you figure this out so fast?". I told him I didn't, I just remember all the "important" powers of 2. He said "well... that's not what I was looking for, I wanted you to calculate it, but... I guess a candidate who memorizes powers of 2 is a positive sign?". I passed.


"16 million colors" if you get what I mean.


Haha, I got that question too. Instead, having worked with MPLS for over 5 years, I knew by heart that an MPLS label has 20 bits, which translates to 1 million labels; times 2^4, that is 16 million. The recruiter asked me how I figured it out so fast, I explained it to him, and that recruiter had no idea what I was talking about. I did not pass.


Well, I answered the rest of the questions perfectly and exactly like he expected, so it might have helped.


I've found the best way to answer those is "buzzword diarrhea"

First thing I do is try to get them to ask it a different way. Then I just spew out every related buzzword possible, as if it's a question: Are you talking about .... oh, you mean .... I can usually get it in a few tries.


I'd have failed too, because I use a very particular way to describe a hashing function.

A hashing function is actually a sorting function. It's supposed to take an input space and sort it in an unpredictable and evenly distributed way across the output space. What's more, neighboring points in the input space, no matter the sort used to determine proximity should not result in neighboring points in the output space.

Fingerprinting is just an emergent value that comes from choosing fixed length hashes, and the fact that the mapping from input to output is stable.
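
A tiny illustration of that last property, using 64-bit FNV-1a as the example hash (the specific hash is just for demonstration):

    #include <stdint.h>
    #include <stdio.h>

    /* 64-bit FNV-1a over a byte buffer. */
    static uint64_t fnv1a64(const void *data, size_t len) {
        const unsigned char *p = data;
        uint64_t h = 14695981039346656037ULL;  /* FNV offset basis */
        for (size_t i = 0; i < len; i++) {
            h ^= p[i];
            h *= 1099511628211ULL;             /* FNV prime */
        }
        return h;
    }

    int main(void) {
        /* Neighboring inputs should land far apart in the output space. */
        for (uint32_t key = 1000; key < 1004; key++)
            printf("h(%u) = %016llx\n", key,
                   (unsigned long long)fnv1a64(&key, sizeof key));
        return 0;
    }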


Why do you use the word "sort" rather than "map" here though.


I'd guess because sort implies an ordering, which, when total, implies a distance.


only some hash functions are fingerprints. Other hash functions are the exact opposite of a fingerprint.


I know some hash functions are not meant to create unique fingerprints (they're to pick a bucket to put/look in), but what do you mean by the opposite of a fingerprint?


The goal of a fingerprint hash is to convert an input space of "large" values to an output space whose values are much shorter (typically fixed size), such that two similar input values have effectively random outputs, without spending the CPU cycles of a cryptographic hash. This permits a wide range of optimizations (a document can be fingerprinted and looked up by its fingerprint to see if it's in a cache, has associated data, etc.).

A non-fingerprinting hash function (well, one example of one) is something that, for similar inputs, produces the same output (similarity being defined by some distance metric). See, for example: https://en.wikipedia.org/wiki/Locality-sensitive_hashing and https://en.wikipedia.org/wiki/MinHash

Confusingly, many functions that are used for similarity detection are called fingerprints (https://en.wikipedia.org/wiki/Acoustic_fingerprint), but I consider that a distinct use of the term.
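
To make the similarity-hash idea concrete, here is a minimal MinHash sketch (a toy illustration; the mixer and seeding are simplified, and real implementations use proper hash families):

    #include <stdint.h>
    #include <stdio.h>

    #define SIG_LEN 64

    /* splitmix64-style mixer, varied by a per-slot seed. */
    static uint64_t mix(uint64_t x, uint64_t seed) {
        x += seed * 0x9E3779B97F4A7C15ULL;
        x = (x ^ (x >> 30)) * 0xBF58476D1CE4E5B9ULL;
        x = (x ^ (x >> 27)) * 0x94D049BB133111EBULL;
        return x ^ (x >> 31);
    }

    /* signature[s] = minimum of the s-th hash function over the set. */
    static void minhash(const uint64_t *set, int n, uint64_t sig[SIG_LEN]) {
        for (int s = 0; s < SIG_LEN; s++) {
            uint64_t m = UINT64_MAX;
            for (int i = 0; i < n; i++) {
                uint64_t h = mix(set[i], (uint64_t)(s + 1));
                if (h < m) m = h;
            }
            sig[s] = m;
        }
    }

    int main(void) {
        /* Two sets sharing 8 of 12 distinct elements: true Jaccard = 8/12. */
        uint64_t a[10] = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10};
        uint64_t b[10] = {1, 2, 3, 4, 5, 6, 7, 8, 100, 200};
        uint64_t sa[SIG_LEN], sb[SIG_LEN];
        minhash(a, 10, sa);
        minhash(b, 10, sb);
        int same = 0;
        for (int s = 0; s < SIG_LEN; s++) same += (sa[s] == sb[s]);
        /* The fraction of matching slots estimates the Jaccard similarity. */
        printf("estimated Jaccard: %.2f\n", (double)same / SIG_LEN);
        return 0;
    }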


"The goal of a fingerprint hash is to convert an input space of "large" values to an output space where the output space values are much shorter- typically fixed size and two similar input values have effectively random outputs"

But this is also the goal of a non-crypto hash function like those used in a data structure, no? Basically mapping a large space of inputs to a smaller space of outputs.

I would say the cryptographic hash is one that has "desirable" security properties: things like not being able to recover the original input message from the hash, a tiny change in the input causing a substantial change in the output, or it being very unlikely that two inputs will produce a collision.


both types of hash convert an input space to an output space where (typically) the input space is much larger.

What matters is whether similar inputs get mapped to the same output. For the case where you want to minimize the probability that two inputs which are highly similar land in the same bucket, you want a crypto hash, although those are expensive, so you want a cheaper approximation, which is exactly what fingerprint hashes do. The problem is that as the number of input values approaches sqrt(output space size), you're going to start getting collisions, and ideally you want those collisions to be evenly spaced.

In the case of a similarity hash function, you want the opposite, the closer things are in some metric space, the more likely they end in up in the same bucket.


"For the case where you want to minimize the probability that two inputs which are highly similar land in the same bucket, you want a crypto hash"

That's what I was trying to say:

"a tiny change in the input causes a substantial change in the output."

But I probably didn't articulate that very well. I think we are saying the same thing.

You mentioned:

"The problem is that as the input values counts approach sqrt(output value size), you're going to start getting collisions"

Can you elaborate on the significance of the square root here? Does this relate to the load factor?


https://en.wikipedia.org/wiki/Birthday_problem Basically, you get 50% chance of collision around sqrt(size of table).
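
A quick numeric check of that bound (sketch; assumes uniformly random hashing into n slots):

    #include <math.h>
    #include <stdio.h>

    int main(void) {
        double n = 1u << 20;   /* 2^20 slots; sqrt(n) = 1024 */
        double p_none = 1.0;   /* probability of no collision so far */
        for (int k = 1; k <= 4096; k++) {
            p_none *= (n - (k - 1)) / n;  /* k-th insert misses k-1 occupied slots */
            if (1.0 - p_none >= 0.5) {
                printf("~50%% collision chance after %d inserts (sqrt(n) = %.0f)\n",
                       k, sqrt(n));
                break;
            }
        }
        return 0;
    }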


Ah right, a probability theory classic. Thanks, cheers.


Thanks for the elaborate answer! I knew about things like acoustic fingerprints but not that people use the term "hash function" to describe something that indicates similarity.

Could something like Hamming distance be called a hash function too? It's not mentioned on its Wikipedia page.


No because that works between pairs. It's a comparison method, not a mapping.


Ah I see. Thanks!


It's worth reading and understanding the ideas behind LSH and Hamming distance. There are some... fundamental mathematical relationships there that are still being... hashed out.


recruiters need to have a few levels of keywords they scan for.

level 1: foo, bar, baz
level 2: frobnitz, barfoo
level 3: 42, etc.

If someone is using words from level 2 that work together in the ways laid out, they're probably beyond level 1 and wouldn't use the word 'fingerprint' (in this case); they're giving more detail (and probably better detail) than what was being listened for.


This idea sounds like a slightly more brilliant test than this candidate was faced with.


I got an interview with Amazon, but failed the first round, because I did not do well on the reasoning part of the online test. You couldn't skip questions, so I spent too much time on some of them and had to rush the ones at the end. They failed me despite getting the coding part 100% right.

Oh well.


Can you elaborate on what types of things you were asked to reason about in this test?


Sorry, I only saw your comment now. They were asking stuff like finding patterns (like in IQ tests). Then they also had ones where you had to read text and then say what you would do with it based on certain requirements. You couldn't skip questions so I spent way too much time on the number ones.

I cannot obviously tell you the exact specifics because of the NDA.


I am curious what your opinion is of these? I personally find this practice loathsome. Was this something that happened after speaking to a human or was this the initial screening?


It was the initial screening and I was very upset that they rejected me since I did (very) well on the coding part but poorly on that reasoning test. If I have a good GPA at a Uni with a degree in Computer Science, then they can certainly assume that I can reason and there is no need to put me through a very strictly timed "IQ test."

Maybe I dodged a bullet there and Amazon would have been a bad place to end up anyway.


Yeah, that would be my thought as well: 'do I really want to work for a company that makes me take a timed IQ test?' Probably just as well, as you said.


> Given the complexity of the work I was doing at the time, the entire thing seemed ridiculous.

I had the same a couple of years ago, at the time I was a Rails developer at a small startup. The first screening they asked me some basic technical questions (I guess it was a recruiter), then the second screening they asked me to walk through some data structures and algorithms off the top of my head.

I knew they would do this, and hadn't looked at any CS algorithms since university, so got a copy of Programming Pearls and studied through that every evening for the week before. I picked a few that I hoped would come up, and it turned out that's what they asked for. I think first a linked list (maybe doubly linked?) and then a tree sort.

I surprisingly passed, but the attitude of the interviewer really put me off. He said he knew Python, Java and Go - none of which I had used - and wasn't too happy when I said I wanted to use Ruby (which the first guy I spoke to said was fine). Then throughout the interview it seemed very much like he was fighting against me and trying to prove me wrong.

After that I couldn't be bothered any more, I didn't really want a job at Google, it's just a recruiter contacted me and I decided to try it out. I guess this style of interviewing must work for Google, but it's just not the way I like a company to introduce itself to me. It just seems like they are approaching it with so much ego, as if I would be privileged to work there, but to me a job should be mutually beneficial.


They asked me the same questions back in 2012. I was applying for a Java developer position.

The person asking the questions said right from the start that he is not technical (he was a psychologist) and that he had a cheatsheet in front of him.

I passed that telephone call and failed the process later on another call (on the lightbulbs and the 100 floor building)

Really surprised to see that 4 years later they still do this.


This is really weird and should not be happening. I am a software developer and do a lot of interviews including phone interviews.

These kinds of questions are not expected, but I would say the light bulb one is not super horrible as long as it is the discussion that is valued and not the answer. I have gotten weirder questions for sure when interviewing.


It makes sense if they want a firewall from the public, a legally defensible position, while behind the scenes they hire via other channels (close friends).


I didn't hear about the light bulb drop question until this thread, and was surprised how in-depth it can go.

Can you really "fail" an interview on a question like that? I assume the person conducting the interview is reviewing your process of coming to an answer rather than getting the most correct answer - right?

By most correct I found this article that broke down the question/answer: https://pointlessprogramming.wordpress.com/2011/03/11/2-ligh...


In theory, that's basically the idea of most questions asked during interviews (except maybe very basic CS ones). The idea is that they will look at how you approach the problems. If you already know it and spew out a memorized answer, it's much less interesting to them than if you've never heard it before and you approach it with a clever way (even if you don't get the answer).

But yes, in practice, YMMV.


> They asked me the same questions back in 2012. I was applying for a Java developer position.

I've had similar issues with other companies. Network developer? Great! Let's ask some assembly / Java questions!

Uh... if you people are that stupid, I don't want to work for you. Thanks. <click>


That's really interesting that he said he was a psychologist. I am probably giving them too much credit, but maybe they are screening for personality with this test and not for technical knowledge. If you handle a frustrating and difficult situation gracefully, perhaps how many answers you got "correct" is irrelevant?


    > They asked me the same questions back in 2012. I was applying for a Java developer position.
OMFG.


I can see both sides here. Google does want to make sure even higher up technical hires have good technical skills, which I think most developers would agree is a good idea.

But their way of measuring this (the standard way) is bad, because it ignores the nuance a more experienced person has (seen so clearly here).

On the gripping hand, I can't help but feel the candidate did demonstrate a big failure to communicate, which is an important skill in itself.

For example, listing SYN, SYN-ACK, ACK in hexadecimal is great for showing off, but is legitimately a bad answer to the question – as evidenced by the lack of understanding in the questioner. I also think some social graces might have got them further (e.g. "Oh yeah, sure. Quicksort is O(N log N) on average, and is generally a reasonable sort to choose, but I wanted to mention some other factors that are worth considering").

At the end of the day:

- it's a hoop. Jump through it and get a fish, or don't.

- calibrate for your audience! This is a very important technical skill (and this test was unintentionally correct in its result IMHO).


Completely agree with you. Seems more like a personality mismatch, than a lack of technical skills.

Seems like both parties were at fault here:

- The interviewee, for being a little brusque about his answers and, despite understanding the _intention_ of the question, giving potentially off-putting answers that are difficult to check over a quick phone call.

- The Google recruiter, for asking questions she/he was a little out of her/his league to ask. If you don't understand the nuances in and around the questions you're asking, you are not qualified to evaluate someone's responses.

Wouldn't worry about any one particular interview; there are too many great opportunities in tech right now to let one bad interview get you down!


It is scary if this is how Google hires engineering directors. Passing this test would select all the bad candidates! The answer to many technical questions is "it depends". If you have a director that "knows" the right answer, they won't be very likely to encourage their engineers to experiment and find the best solution to the specific problem at hand.


A brilliant professor of mine liked to say "The Answer of the Engineer: it depends".


That is nearly the answer I give to every question. But when I ask for more detail, no one likes it. They all simply want "answers".

Well, much like most things in life, there is no such thing as a concrete answer.


I keep pushing the message that getting a "correct[tm]" answer to a technical question is probably not what you should be looking for. Problem-solving skill trumps rote knowledge every time.

I recently wrote a somewhat lengthy post on the subject, after realising how bad some companies are at this: https://smarketshq.com/notes-on-interviewing-engineers-a4fa4...


Is this for real? Does Google really hire interviewers that stupid? The list of questions is really dated. They're all pre-1990. The answers are dated, too.

A Linux inode is the file system's representation of the base info of a file, from which the file's data blocks can be found. It also carries file metadata, but its real function is as the root of the file's block index tree. The format depends on the file system. The internal identity of a file is an inode number, not an inode. What you get from 'stat' is the file's metadata, which mostly comes from the inode data structure on disk, but isn't necessarily in the same format. The formats were the same back around UNIX V7, but there have been some changes in the last few decades as file systems improved.
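
For illustration, here is roughly what stat(2) surfaces from the inode (a minimal sketch; the field names are the POSIX <sys/stat.h> ones, while the on-disk layout is filesystem-specific):

    #include <stdio.h>
    #include <sys/stat.h>

    int main(int argc, char **argv) {
        struct stat st;
        if (argc < 2 || stat(argv[1], &st) != 0) {
            perror("stat");
            return 1;
        }
        printf("inode number: %llu\n", (unsigned long long)st.st_ino);
        printf("mode:         %o\n",   (unsigned)st.st_mode);
        printf("link count:   %llu\n", (unsigned long long)st.st_nlink);
        printf("size (bytes): %lld\n", (long long)st.st_size);
        printf("blocks:       %lld\n", (long long)st.st_blocks);
        return 0;
    }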

If you need to count bits in a word, the first question is whether your CPU has hardware to do that. NSA always liked population count instructions, which are useful in cryptanalysis, and that's why most supercomputers had them since the 1960s. Now they're finally in Intel x86 CPUs with SSE4.2 (added around 2006), which has a "popcount" instruction.[1]

A MAC address for Ethernet is six bytes. There are other hardware layer systems, and Google probably uses some of them. Fiber Channel fabric uses only a 3-byte address, for example.

Hash tables are not O(1) lookup. Lookup cost grows as the table fills up; it's near O(1) only with a lightly loaded table. There's a space/time tradeoff in how full you let the table get before you expand it.

Quicksort is average O(N log N), but the worst case is much worse, which is why nobody uses pure Quicksort any more. You can beat O(N log N) with a distribution sort. The first sort to do that was SyncSort, the first patented algorithm. It's a distribution sort with self-adjusting buckets.

Who wrote this interviewer's answer sheet?

[1] http://wm.ite.pl/articles/sse-popcount.html


As many people here have already commented, these are some screening questions asked in the first informal chat with a recruiter. I got the same questions this week, but they were presented as a warmup, which I now realise might have been disingenuous. Thank you for the detailed answers, though – as always, your comments are extremely interesting and informative.


> Hash tables are not O(1) lookup

In terms of algorithmic time complexity, it is O(1)


No, as the table fills up, you start to get clashes, where two keys hash to the same value. There are various ways to handle that - rehashing and trying again, linked lists - all of which push you above O(1). See [1].

[1] https://en.wikipedia.org/wiki/Hash_table#Collision_resolutio...
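
A back-of-the-envelope illustration of that degradation (sketch; uses the standard uniform-hashing approximation that an unsuccessful search in an open-addressing table costs about 1/(1 - load factor) probes):

    #include <stdio.h>

    int main(void) {
        const double loads[] = {0.25, 0.50, 0.75, 0.90, 0.99};
        for (int i = 0; i < 5; i++) {
            double a = loads[i];
            /* expected probes for an unsuccessful search, uniform hashing */
            printf("load factor %.2f -> ~%.0f probes\n", a, 1.0 / (1.0 - a));
        }
        return 0;
    }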


Site is down. Does anyone have a mirror?

Edit: http://webcache.googleusercontent.com/search?q=cache:rPrtrh1...


Those hex codes should come in handy XD


7. what is the name of the KILL signal?

Me: SIGKILL which #define is set to 9.

Recruiter: no, it's "TERMINATE".

Me: SIGTERM (15) is different from the KILL signal (9).

Recruiter: that's not the answer I have on my sheet of paper.

You know, I think I've been asked this question on a Google phone screen - and I think the question is specifically "What is the signal that the `kill` command sends?". The answer is in fact SIGTERM, not SIGKILL; if you want SIGKILL you need to specifically say `kill -KILL` or `kill -9`. If you insist that it's SIGKILL, you're just technically wrong. And if you can't understand the question, you're missing very important skills; this sort of confusion will cause actual production problems.
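
To make the distinction concrete, a small C sketch (the kill calls are commented out so it is safe to run; the point is that kill(2) takes an explicit signal, while the kill(1) command defaults to SIGTERM):

    #include <signal.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        pid_t pid = getpid();  /* stand-in for a real target pid */
        printf("SIGTERM = %d, SIGKILL = %d\n", SIGTERM, SIGKILL);  /* 15 and 9 on Linux */
        /* Polite request first; the target may catch, handle, or ignore it: */
        /* kill(pid, SIGTERM); */
        /* Last resort; cannot be caught, blocked, or ignored: */
        /* kill(pid, SIGKILL); */
        (void)pid;
        return 0;
    }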


Good catch, and a good indication that the author is paraphrasing and not exactly getting all the details right.

(I don't doubt this happened, but we're only hearing one side of the story)


Frankly, this reads like a post from /r/ThatHappened. For those unfamiliar, it's a subreddit to share social media posts from people telling obviously made up stories with the purpose of making themselves look good (e.g., brave student stands up to intolerant teacher, receives standing ovation from peers).

The interview may have happened and the questions may be accurate but the story is very exaggerated to make the person look good (and the interviewer look bad).


This definitely fails a basic "smell test", I don't think the author has the most honest intentions at all.


I'm not going to go into specifics (I like my job and want to keep it), but the versions of these questions quoted in the blog, especially where the questions seem stupid and wrong, are not an accurate representation of actual SRE prescreen questions. It's possible that the recruiter somehow garbled them, or perhaps the blog's author is misremembering them after a stressful and frustrating conversation. But if you read this and think Google asks really dumb questions, I think if you saw the "real" questions, you might come away with a different opinion (especially if you understand the nature of prescreening vs. an actual interview). A hypothetical example: let's say the complaint is about the following exchange:

R: What's a potato?
A: It's a vegetable that grows in the ground.
R: Wrong. It's brown.
A: Potatoes come in different colors, and they are vegetables that grow in the ground.
R: Wrong. It says on my sheet that the correct answer is brown.

when in reality the question was "What color are Russet potatoes?" I don't know what happened here. Something, unfortunately, went off the rails.


Well I had the first call with a recruiter this Monday about a Google SRE-TPM role, and I got the exact same questions – granted, presented as a warm up.


If you had literally the same questions, worded the same, there's something horribly broken (or you got the same broken recruiter). I think you're remembering similar questions (e.g. they asked the one about the potato) but the details make a huge difference.


I stressed the fact that I had this interview 4 days ago – I'm 100% sure I got the same questions. My guess is it's because they are really just there to give candidates an impression of what the actual interview is going to be like – this is what my recruiter said as well. Again, this is NOT the phone screen, just the first informal chat with the recruiter.


Scan the whole comments section now and you will find multiple people saying they got the exact same questions (I am one of them)


You mean, when they asked about quicksort, they used the precise wording in the blog post? Or was it worded in a different way, say, closer to the versions posted elsewhere on the internet? The difference is important. I do not believe the recruiter asked "Why is quicksort the best?", I think they asked a different question.


If you scan the whole comments section now you will find at least 8 people (including me) that say they had the same questions with the same wording.

Can you think of a particular question that you believe is different in the way Google asks it versus the OP's transcript?


Hmmm...

* A link to his LinkedIn profile

* A direct download link to his resume (well, an HTML file for his LinkedIn page; same purpose)

* Talking about how his skillsets are a "rare mix".

* A "transcript" showing how he's smarter than the recruiter for a Google DoE position

* Taking on a big company in a post bound to go viral

Yeah, this feels less like a blog post, and more like an attempt at a viral cover letter.


This guy has been at it for years. He isn't dumb but he doesn't seem to adapt to the world well.


That's not so surprising with Google, I'm afraid.

I had the exact same questions (apparently for SRE-SWE prescreen), but a slightly more intelligent recruiter (who had actually chased me for 2 years before I agreed to interview, so they were a bit more invested than the OP's guy). I went on to a phone screen and then on-site interviews, and then the hiring committee. The HC decided that coding/algo was strong, but that they needed a stronger "signal" re system design, and to my surprise, scheduled me two more sys design interviews (I did a total of 7 on-sites). I prepared by reading all the Google, Facebook, and Amazon systems papers and did 6 mock interviews with Gainlo, where interviewers all gave pretty good feedback (which led me to believe I wasn't a total idiot). When I went to my final Google interviews, I thought I did pretty well except for a couple of TCP/IP questions related to checksums and congestion control. I found those odd because all the prep materials that recruiters sent me listed knowledge of TCP specifics as optional, and this isn't my area of expertise anyway.

Oddly, one of the interviewers also seemed surprised that I wasn't a TCP/IP guru (I bet that now he's telling his buddies stories about a candidate who couldn't do the SRE equivalent of FizzBuzz or something similar).

A week later I get the rejection and a long survey containing a link to the job for which they applied on my behalf. It was an SRE position that required deep knowledge of TCP/IP and various other network protocols.

My actual areas of expertise (for the last 10 years) are 3D graphics and computational geometry.

Lesson learned: always check the description of the job that overly eager recruiters apply for on your behalf. Had I known, I would've studied all about TCP/IP and prepared for that level of detail. All that time I thought that I was applying for a general SWE track, and I simply expressed interest to be matched to teams specializing in infrastructure and distributed systems later.


> My actual areas of expertise (for the last 10 years) are 3D graphics and computational geometry.

If I may ask, what put you off your former field so that you decided to move to the server-side world?


Nothing in particular. I was just looking for a bit of a change and to gain more rounded experience. Being pigeonholed into one specific area (3D/CAD/CAM C++ development) for many years isn't that great for future career prospects, especially in a market where most mainstream job titles seem to be split into a front-end or a back-end development category, and my experience doesn't seem to put me in either...


I had a very similar experience with them (around Java and Python dev questions) a few years back. Completely turned me off applying to Google, or replying to the numerous pings I get whenever they want to expand headcount in Europe.

(I'm not going to mention other companies here, but I also had enough "technical" interviews of this kind to make me wonder if people actually know how to recruit outside their - often quite narrow - perception of what their area of expertise actually is.)

Makes me quite sad, really, because there is zero assessment of experience, talent and know-how in this kind of interview (and whiteboard interviews can be ridiculous in their own right, too, but that's not the point here - it's just that they appear to be the next step in this broken appraisal).



I am not him, if that's what you're asking.


I was referring to the author of the blog post. As far as I know there is just one person authoring GWAN web server.

(Funny thing, the site is down, maybe given the traffic brought by HN, and I assume the web server hosting the blog is GWAN :-) )


yes, it is gwan :)


Ha ha re:

> 10. what is the type of the packets exchanged to establish a TCP connection?

> Me: in hexadecimal: 0x02, 0x12, 0x10 – literally "synchronize" and "acknowledge".

> Recruiter: wrong, it's SYN, SYN-ACK and ACK; if Google is down you will need to know this to diagnose what the problem is. We will stop here because it's obvious that you don't have the necessary skills to write or review network applications. You should learn the Linux function calls, how the TCP/IP stack works, and what big-O means to eventually qualify if you are interviewed at a later time. Good luck, bye.

That's embarrassing.
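
For reference, the hex values and the names really are the same thing; the flag bits in the TCP header are (per RFC 793):

    #include <stdio.h>

    /* TCP header flag bits; many systems expose these as TH_* in <netinet/tcp.h>. */
    #define FLAG_FIN 0x01
    #define FLAG_SYN 0x02
    #define FLAG_RST 0x04
    #define FLAG_PSH 0x08
    #define FLAG_ACK 0x10
    #define FLAG_URG 0x20

    int main(void) {
        printf("SYN     = 0x%02x\n", FLAG_SYN);             /* 0x02 */
        printf("SYN-ACK = 0x%02x\n", FLAG_SYN | FLAG_ACK);  /* 0x12 */
        printf("ACK     = 0x%02x\n", FLAG_ACK);             /* 0x10 */
        return 0;
    }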


Surprising they wouldn't have given the recruiter a list of alternate acceptable answers.

Almost feels like the recruiter was conspiring against you, but why?


I feel like someone that's not unusually unintelligent should have been able to make the connections between SYN and synchronize or ACK and acknowledge. I don't know why people are defending the recruiter. The process is dumb, but so is an interviewer that asserts himself as an authority when he has to read from a sheet of paper. I don't expect him to be a technical expert, but I do expect him to know when he needs to consult a technical expert about a particular candidate's answers. If this is an accurate representation of his interviewing skills, he needs a new job.


This is actually a real danger in hiring: personal bias.


In my experience Google recruiters do have a sheet with alternate answers, but those can't cover everything.


Why the hell is a non-technical recruiter giving these tests? Google engineers or managers should be administering them, not the high-school dropouts who make up the recruiting industry. There's so much wiggle room here that having a rote-memorization-style test for engineering is completely crazy. I've never had a technical test given to me by a recruiting firm. It's always a call with someone from the company, if not the hiring manager.

Perhaps this is what they give to 'second string' applicants to make HR happy, while the guys who actually get the jobs are friends of the hiring manager or team leads. Or it's an H-1B ploy to say, "See Obama, we need tech talent. Look how terrible our domestic talent is. They score 40% on our tests!" Meanwhile Bombay Upstairs University has a wink-wink-nudge-nudge deal with Google hiring managers who accidentally leak the test on a 'forgotten' FTP site.

Everything about this is fishy. I think there's fraud here, not just incompetence. I've been the hiring stooge for shops that had already made their decisions, and it's always terrible and, frankly, hurtful. These are the signs of a non-serious 'stooge' interview.


Are you surprised, though? If a person does not have the expertise to judge the answers very well, this is what happens.


"There's an array of 10,000 16-bit values, how do you count the bits most efficiently?"

multiply 10,000 by 16. There are 160,000 bits


My sarcastic reply to your very reasonable answer is:

On the x86 architecture, integer multiplies take more cycles than a comparable solution that uses bit twiddling so you're just WRONG!

You are the weakest link, GOODBYE!


I get that but I am proposing to hard code the answer since I already know it


No no no, this is Google. You must start a MapReduce, or whatever replaced it these days.


Most people, at every level, are terrible at interviewing and hiring, and in fact a surprising number are conscious of that fact. As a candidate, you just have to know this going in, and expect to be made to jump through a lot of irrelevant/stupid hoops. Eventually you find someone who is adequately amused by your performance to give you a chance and hire you.

I don't think the recruitment industry is going anywhere anytime soon. I just wish there were some worthwhile recruiters in it (out of the dozens and dozens of recruiters I've dealt with, I've met one who is semi-competent, which is to say, not fully competent). This is an area I'd love to try to start up in; I don't think any of the startups I know about are doing it right.


Is this for real? Wow. Reads like someone spoofing the real Google interview.

For "Director of Engineering" they have a technical illiterate (nothing wrong with that, in other contexts) check for literal matches of answers? the mind boggles.

(I'm really having a hard time believing this is for real)


I've had somewhat similar experience with other big companies. People wonder why I hate to change jobs.


Believe it. The depths of corporate absurdity know few bounds.


On what data are you basing this?


There's no reason to sealion this, but here's a recent article that talks about exactly this topic: https://aeon.co/essays/you-don-t-have-to-be-stupid-to-work-h...


Possibly same as everyone else: Been applying for a few jobs over the years ;-)


This is a phone screen - the test they make you answer before they take the time to schedule actual interviews.


Director of Engineering is a soft-skills position that involves communicating with a lot of non-technical people. It doesn't surprise me at all that the initial screen would be a conversation with a non-technical person.


This reads as an indictment of the entire hiring culture of the tech industry, which it is. Hiring is one of the most important things any company ever does. It's also one of the biggest opportunities to attain a competitive advantage, and it's one of the most difficult tasks that any developer will ever participate in. And yet, there is consistently a lack of seriousness when it comes to hiring, and a reliance on bad, counterproductive cargo-cult practices. It's not just embarrassing for this industry; it has a big impact on the quality of life of developers and the quality of developer talent at companies. If you want to hire good people you need to put a lot of resources into it. And that means you can't just hand off these sorts of important tasks to interns who don't know what they're talking about. Hiring, even at the phone-screen level, is not like Comcast customer service, and any company that treats it that way is setting themselves up for failure.


These would be stupid questions even if they weren't being administered by a non-programming recruiter.


FWIW: I have interviewed twice at Google for Dir. Eng. jobs, both times going the distance. (Once rejected after a few weeks, once received an offer two months later, too late to accept.) In those interviews, I was never asked a seriously hard technical question, let alone an annoyingly academic and trivial one such as those in this post. Rather, I felt like they did a pretty solid job of understanding how I would engage with the organization and perform in the role.

Though the lack of technical rigor was not surprising in evaluating me for an administrative & support position, the lack thereof in the onsite portion was something I sent explicit feedback about afterwards.


The Google interview process is a disaster.

Incompetent HR is apparently the norm. When I finally got to my ( 8th? ) interview there, I was asked by a very senior manager whether or not I had ever used Github before.

Considering Google cold-contacted me based on the specific email address and projects I make available on my GitHub profile, it was kind of confusing. Oh yeah, he also asked me what CSS stood for.

I ended up stopping the interview early myself.


As a person who's taken this exact same test: 1) it's a pre-screen from an in-house recruiter. If you have 37 years of experience with computers and 24 years of professional experience and any references/reputation, they should have skipped you straight to a phone interview with an engineer. 2) Were you recording the interview? Because if it's an accurate transcript, the recruiter should be fired.

I assume question 7 was actually "give the Big-O of Quicksort"; if you refused, the recruiter might have assumed you didn't know it and were trying to BS through the interview. If the recruiter was too stupid to ask, once that person is fired you'll get a new recruiter. Google is (was?) notorious for contacting the same candidates every six months regardless of previous interviews.


> they should have skipped you straight to an phone interview with an engineer.

When a family member of mine was an admissions officer at a small tech school, "unsuitable" applicants (mentally ill, people who seemed like they would never get in, never be able to pay, or who would complain and tarnish the reputation of the school) used to come in often. They were given a screening test, which was waived for all "normies."

This is how power works. It is not fair. With communication media like the internet, we're starting to perceive it. The real challenge will be re-architecting our society to be more fair and clean up from the aftermath of our unfairness.


A friend of mine was offered a similar position and had a similar interview. Annoyed by the questions and the "my solution is the only right one because I have to stick to this script" attitude, he asked: "So what do you want me to do at Google? How can I help you?" The recruiter could not really answer and ended the call shortly after.


It's a freaking psychological profile test dressed up as "tech screen". Your friend's personality was effectively filtered.

IMO, the very first step in entering googleplex/facebook is accepting that it is OK to lend your talents to build a global surveillance & propaganda platform. These tests seem to be probing for other desired characteristics as the next step.


Having been through several goog interviews I find it hard to believe that the interviewer would dismiss his highly technical explanations/rebuttals in that way. Seems more likely this is his interpretation of the tone of the interview, rather than a transcript.


I think the candidate made a simple mistake: the interviewer is always right. Your job in an interview isn't to be right or to teach the interviewer. Your job is to make the interviewer like you foremost, and second, to make the interviewer think that you're qualified. And of course no one likes being corrected or told they are wrong. In my opinion, it is better to do well on an interview and decide after the fact that you're not interested than to do a poor job on the interview because you couldn't help yourself correcting the other person.

For instance during one interview I was being asked questions about a particular topic, and I started to guess that the interviewer didn't understand 100% the topic he was asking about. Rather than correcting him, I simply tailored my answers to what I thought he was looking for, not what was right. I passed the interview and got a job offer, whereas if I had corrected the interviewer the results may have differed.


> the interviewer is always right

That is an awful sentiment, and I find myself in violent disagreement with you.

A good number of my enjoyable interviews have been with candidates who clearly knew more than I did, and could expand from an interesting detail to a short ex-tempore lecture on the topic. I cherish each of those.

An interview where I, as an interviewer, learn something is a fine thing indeed.


If the interviewer is in a position where he/she seems willing to listen and learn, then by all means impress. However, correcting an interviewer is always a dangerous gamble, and it is downright foolish to keep arguing with him or her when he/she doesn't agree with you.


> That is an awful sentiment, and I find myself in violent disagreement with you.

You find yourself in violent insistence that the world is the way you wish it was, rather than the way it actually is.


In that case I am shaping the world around me.

Any company who maintains that their interviewers are always right is telegraphing that they treat their workforce as mindless cogs in a machine.

Every time I encounter a candidate who thinks differently than I do, I treat him or her as a potential source of inspiration. Occasionally I learn something, and occasionally they do.


Is this common for higher level positions at Google? I've gotten a similar kind of phone screen, but for an entry level software engineer position, and getting a few "wrong" was not a big deal. It's just so they can avoid setting up a real interview with someone who doesn't know the basics.

If the call actually went like this, it seems like you just hit a new/not very good recruiter.


"Is this common for higher level positions at Google?"

No, and in fact, it's so far outside the norm i'm not even sure what to make of it. Like I said elsewhere, my best guess is that he was really being evaluated for a much lower level TL/M position in SRE or somewhere.

(The detailed linux questions are usually a giveaway that SRE is involved)


> It's just so they can avoid setting up a real interview with someone who doesn't know the basics.

They're (allegedly) interviewing for a Director of Engineering. If google can't even google the guy and look at his track record, code, whatever, they don't deserve to have competent people working for them.


I know, that's why I would be surprised if this is standard practice for high level positions. But even if it is, this is unusually bad execution. My recruiter had at least surface level knowledge of the concepts and was able to ask follow-up questions if I got something half right, he wasn't just looking for verbatim responses.


This reminds me of the infamous Nigerian emails. They are so obvious for a reason - avoiding false positives [1]. I can't quite fit that scheme to senior-position recruitment at Google, though.

[1] http://uk.businessinsider.com/why-nigerian-scam-emails-are-o...


> This reminds me of the infamous Nigerian emails. They are so obvious for a reason - avoiding false positives

Unrelated to the main topic, but I wonder if email providers could effectively fight those scammers by having chatbots automatically reply to their emails. That would give them a bunch of false positives to sort through and possibly make the whole thing uneconomical for the scammers.


You would think that someone posting prescreen questions in anger would at least bother to edit the transcript so that they gave the right answers.

An inode is not an identifier. This super-awesome-knowitall has not bothered googling "define inode" before posting this.

If this really is the transcript then not only did you answer questions wrong, you also argued with the recruiter.

I have worked with people who used to be the best where they came from, who are not used to being wrong. This guy did not handle it well.

If this really is a true transcript, with internal monologue, then Google has avoided accidentally hiring someone arrogant who can't accept being wrong, or is at least not open to being wrong.

The reaction -- posting prescreen questions in anger in retaliation -- only reaffirms that Google did the right thing.


This right here is why Google hit the wall and stopped innovating. Which is sad because they seem to be trying really hard.


So much this. They would have missed Einstein & co.


I've been hit up by Google recruiters many times. I'll never interview there. The last time one of their recruiters hit me up he pretended it was for a Google X company, and then pulled a bait and switch on me, sending me a bunch of form e-mails on "interview prep" and how to expect their process would take 6 weeks minimum.

No thanks. I'm not about to put myself through a 6-week interview process at 32 years old to go work for an ad company. I'm past the point in my life where I care and need to justify myself by trying to get a big tech company job. Already done the big tech thing.

It's not all it's cracked up to be.


Isn't a Director of Engineering meant to be an interface between engineers and the rest of the organization? Perhaps this was more like a smoke test for the political insight such a role calls for, not the ostensible test of technical skill it was presented as. After the first couple of wrong answers, people with the right social skills to manage others and represent their interests to the broader organization would stop striving to prove their technical chops, clarify what kinds of answers the test is looking for, and adjust their subsequent answers accordingly.


I have been in similar roles (Software Architect, CTO) where I had to explain technical concepts and judgement calls to non-technical but impatient (or pissed off) people (CEOs, VCs).

But I knew in advance they were not technical.

Also, a CEO would not challenge my explanation of what algorithm to use for sorting, be real :-)


You can know the technical level of your audience and adjust your explanations accordingly, yet still completely fail to achieve your intended result because your focus on what you want has blinded you to your misunderstanding of what they want.


I think this is exactly what they're looking for and what half of the people in this thread are missing. Someone else mentioned that they also had this interview and remembered these questions and that the person doing the interview told them he was a psychologist. Why would they be doing a technical interview with a psychologist? This was a skills interview for sure but it wasn't the skills that most of these people are assuming.


> Why would they be doing a technical interview with a psychologist?

Because every psychology major I knew from college either went back and got a master's in a different subject, or is working at Home Depot. They'd kill for a $12/hr job.


It's possible but Occam's razor leads me to think it was just a clueless interviewer.


I actually believe they rejected him because he didn't show proper people/social skills. The tone of the post, setting aside the part about his resume, looks like it was written by an angry teenager.


This hiring test makes me cringe. How is this any better than taking a multiple choice test in college? These questions asked should invite back-and-forth conversation like the author is providing, especially when the company is seeking out highly qualified candidates.


Early in my career I got asked a question about database reporting on sales items and totaling the number by a given date. This was one of the hardest questions I was ever given, because I had to "write" the SQL out over the phone and was not given a schema, so I had to make one up as I went. I was then asked how I would go about writing one that included $0 sales days. At the time I didn't know the answer to this -- just that querying for something that doesn't exist is not straightforward in SQL. I explained that I didn't work much with reporting, so I had not encountered this solution before, and gave a list of possible answers:

* Generate a complete list of dates in memory and see which do not exist.
* Create a temporary table to match it up that way.
* Do post-processing on the report.
* Research prior art.

I was not offered the job.

Years later I was given the task of creating a "data warehouse" to enable easy reporting by business analysts so they could stop bothering the engineering team. So, it being the first time I had done this, I read up on different techniques. I solved the problem the previous interviewer proposed by having a table of dates with attributes on them (is weekend, is national holiday, day of week, etc.) and having all dates foreign-key to this table.

However, because I couldn't think of this solution within a few minutes -- despite it not being in my background -- I did not get the original job from years before.

In many ways, I'm thankful the original interviewer passed upon me. I was able to get a different job that valued being able to think of creative solutions AND being able to research prior art so we don't invent a badly designed wheel over and over again.

I suspect the author of this article will experience the same feeling with time, and it is a real shame companies are valuing rote memorization and keyword matching over real problem-solving skills.


To be charitable here: is it possible that this person did not work for Google? Historically, I've known third-party recruiters play fast and loose with their affiliation with the company they're recruiting for. This literally could have been any recruiter in the Valley trying to feed candidates to Google.


No, I have been asked the same questions by a person with an @google.com email address who clearly said he was a psychologist and not a technical guy. His LinkedIn info at the time was "Google Recruiter".


What the heck is a "Quicksort big-O score"? I've never heard anyone use the noun score when talking about complexity analysis.


I assume they want the average O(n log n), but who the heck knows given the script from above.

I get the feeling this: http://bigocheatsheet.com would come in handy


Big-O technically means "worst case", which for Quicksort is O(n^2), although usually (in normal cases) it runs better than that. Other sort algorithms can guarantee not worse than O(n log n).


Depending on how the pivot is picked, Quicksort can actually be implemented in O(n lg n) worst-case time.

EDIT: I was going to link a proof for this but it's surprisingly hard to find. IIRC, the idea is to use the median of medians algorithm ([1]) to pick the median for the pivot, and deal with values equal to the pivot by alternatingly placing them in the left and right partition, or alternatively just keep them in a third partition in the middle.

[1] https://en.wikipedia.org/wiki/Median_of_medians
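
For concreteness, a plain quicksort sketch with a naive middle-element pivot (illustrative only); swapping that pivot choice for median-of-medians is what buys the O(n log n) worst case discussed above, at the cost of a much bigger constant factor:

    #include <stdio.h>

    static void swap(int *a, int *b) { int t = *a; *a = *b; *b = t; }

    static void quicksort(int *v, int lo, int hi) {
        if (lo >= hi) return;
        int pivot = v[lo + (hi - lo) / 2];  /* naive; median-of-medians would go here */
        int i = lo, j = hi;
        while (i <= j) {                    /* Hoare-style partition around the pivot value */
            while (v[i] < pivot) i++;
            while (v[j] > pivot) j--;
            if (i <= j) { swap(&v[i], &v[j]); i++; j--; }
        }
        quicksort(v, lo, j);
        quicksort(v, i, hi);
    }

    int main(void) {
        int v[] = {5, 3, 8, 1, 9, 2, 7, 4, 6, 0};
        int n = (int)(sizeof v / sizeof v[0]);
        quicksort(v, 0, n - 1);
        for (int i = 0; i < n; i++) printf("%d ", v[i]);
        printf("\n");
        return 0;
    }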


That's incorrect. The /mathematical/ definition means that it's the upper bound. I.e. yes, the set of all functions of O(N) is a subset of the set of all functions O(N^2). So you could say Quicksort is O(N^100) and still be technically correct.

However, the big-O notation does NOT specify anything about worst/avg/best-case complexity of a given algorithm. That should still be defined in the analysis.

You mixed up those two slightly different concepts.


Must have been a while; I thought you went with average, but I guess it's been 20 years. I've mostly lived in cache optimization since college.


"Order" would be the commonly used term, right? At least in my native language.


> Recruiter: that's not the answer I have on my sheet of paper.

That's not an interview. That's a waste of your time and an insult to your intelligence.

This reminds me of the time Google rejected the creator of Homebrew because he couldn't invert a binary tree: https://twitter.com/mxcl/status/608682016205344768


Mediocre people can't evaluate exceptional people. What mediocre people think is knowledgeable can't be exceptional knowledge, by definition. If you want to find exceptional directors of engineering, you need an exceptional person worthy of being director of engineering to evaluate them.

In seeking victory, not going beyond what everyone knows / is not skilled.

Victory in battle that all-under-heaven calls skilled / is not skilled.

Thus lifting an autumn hair does not mean / great strength.

Seeing the sun and moon does not mean a / clear eye.

Hearing thunder does not mean a keen ear.

So-called skill is to be victorious over the easily / defeated.

Thus the battles of the skilled are without / extraordinary victory, without reputation for / wisdom and without merit for courage.

- Sun Tzu Ch. 4 (Denma translation)


[flagged]


I... disagree? Like? The questions were OK questions, it was the answers that were asinine and clearly not meant to gauge anything except the ability to skim wikipedia pages about computer science.


This is also the test for the Technical Program Manager position. You must prepare rigorously to get through the initial conversation with a recruiter. The recruiter I spoke with seemed more open to me explaining why my answer was different than the recruiter OP dealt with.


Here's the good news: Google will get the directors of engineering they deserve.


I had this almost exact same Google phone screen, in 2010, for an SRE position. I'm 100% certain this is not a director-level test. I thought it was incredibly simple.

The second interview (phone interview with an engineer) was challenging. The third was on-site, and I failed, but was still a good experience. I'm a "small startup" type anyway.


I realize they have enough applicants not to care, but constantly reading this kind of thing about Google makes me never want to apply to work there.

It sounds like they took the worst parts of a startup, graduate research lab, corporation and bank and put them all together, but they have enough advertising $ that it doesn't really matter.


The problem is Google doesn't need you. Google gets an absurd number of applications per day. If they shave off 90% of cold applicants this way, they will not hurt for more applicants. A company that large will gain its best people by referral.

-----------------------------------------

my interview with google was very short.

them: what is 2 ^ 37?

me: can I use a calculator?

them: no

me: then we are done here.


Did they want an exact answer? Getting closeish without a calculator is not too hard: 2^30 is about a billion, so 2^37 is about 128 billion.
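
The mental shortcut (2^10 is roughly 10^3) checks out; a trivial sketch:

    #include <stdio.h>

    int main(void) {
        unsigned long long exact    = 1ULL << 37;              /* 137438953472 */
        unsigned long long estimate = 128ULL * 1000000000ULL;  /* 2^7 * (10^3)^3 ~ 2^7 * 2^30 */
        printf("exact: %llu, quick estimate: %llu\n", exact, estimate);
        return 0;
    }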


You don't need a calculator for that, it's: 10000000000000000 ...

(binary obviously)

I could say pretty quickly it's somewhere around 12 billion decimal but an exact answer on paper would take me some time...

The amazing thing is that everyone thinks they're hiring the best people while it's pretty much never true. Generally speaking, a place that pays well, treats people well and has challenging work will have better people than a place that doesn't, and hiring practices' ability to filter beyond that is negligible.

OOPS 128 billion :)


The standard Google phone interview.

I got almost the exact same questions. My interviewer seemed more capable though.

I also wasn't as technically correct as this gentleman so my slightly less correct answers were more "right".

A sad turn of events, given that I would have loved to answer the same way he did here; these are much better answers than mine. But this is just tech recruiting; it's a crapshoot at best. It's much better to just take jobs at places that don't insist on trivia competitions.


Hehehe - that's really funny. Why did they even need a "body" behind the test? If all the person is doing is reading from a script, it could have been just web-based.


I have interviewed many people, in large and small companies, and saw how interview processes get built in the wrong and right way.

First, here are two reasons why interview processes suck. One is that no one wants to help recruiting, or HR, do their job. Helping them is treated as an errand and done in a robotic way, so you end up with an idiotic checklist that is an embarrassment to great engineers. The second is that when the team finally realizes the idiots joining are hurting them, they become more motivated to support the recruitment process. Then, when you let a group of people devise such a process, you get either an average process designed to leave out the human parts (shift bytes in an array) and fit each person's interview style, or a crazy, idiotic process made by people mentally jerking off to each other (implement Raft, in assembly). A team here can be a company or an actual software development team.

The right way to do it, is this. I believe a single person, who cares about developers, who is passionate about developer experience - should build his/her team's interview process. He/she can take feedback from the team, but that person should eventually build the process and make the calls.

In this specific case I want to believe there is an accurate answer, and a correct answer, and they wanted the correct answer, or at least to see him negotiate his way to the correct answer even though it is subpar (although his accurate answers are impressive). That being said, I'm giving Google credit here. There is a small chance it might just be one of the first options I've described here.


Terrible interview questions; that's why only technical people should be giving technical interviews. Most answers are more nuanced than the straight, quantifiable answer they have on a sheet of paper.


I had a somewhat similar experience when I was doing an on-site interview for an internship at a different large tech company.

There were multiple interviews with short (15-minute) breaks in the middle. Most people asked a mix of technical and behavioral questions, but one of the interviewers did things a little differently.

After we briefly introduced ourselves, he immediately asked me a technical question. I usually talk through the problem and ask for clarifications, but he mostly gave me non-answers. I had been asked the same question by another company a week or two earlier, so I started writing out code on the whiteboard. At a few points I asked whether or not I should handle an edge case, and every single time he said "yes".

When I was done writing all of the code, it took up pretty much the entire board. He asked me - "Why do you have so much code on the board?" and I responded "because I'm handling a lot of edge cases". As I recall that was pretty much the end of the interview.

I'm generally pretty good at interviewing, so I took whatever he threw at me and just thought he was a little off. It's not hard to imagine someone who isn't great at interviewing or has imposter syndrome doing worse in that interview just because of the way the interviewer acted.


I would agree with the "metadata" answer being correct.

"Each inode stores the attributes and disk block location(s) of the filesystem object's data." [1] A file's "attributes" are independent of its storage. One could also argue the notion that a FS object's ACL are attributes of the file.

[1]: https://en.wikipedia.org/wiki/Inode


Except that anybody who actually does understand what an inode is and what it's for would say exactly what OP said.


No, because OP is describing an inode number, not an inode.


What always bothers me with most interview questions is the focus on hard technical stuff that's easy to acquire, forget, and relearn. I care much more about a candidate's ability to write maintainable code. The hard challenges we face in our jobs are rarely hardcore technical issues; they're usually about keeping up a fast pace without drowning in the legacy code we wrote last week.


Beyond even that, Google is in sore need of some creative people with business sense. They continually create and cancel products. And the products they create in general just don't fly. Besides email, search and ads, they just flounder about confusedly.


The interviewee gives the right answers, but in a more precise, more low-level, even pedantic way.

Clearly he knows much more about the questions than the interviewer, but he's mistaken the nature of the interview.

This really is a preliminary phone screen, and he should have recognised the interviewer's lack of experience early on and just played the game.

I'd be tempted not to go ahead with him just for his lack of emotional intelligence.


For those doubting the interview, I was asked many of these same questions for a mid-level role by the recruiter just a couple of months ago. It's real. Passed the phone screen, but decided not to continue.


Why would a person interviewing for a director position let a recruiter do a tech screen? Wouldn't it at least be respectful for the hiring manager to do this evaluation? Something is very off here.

Or does Google use the same "we have a generic interview and if you get through we will then decide which team to place you in" crap for director level hires as well?


Google almost certainly receives many thousands of applications for the position. It sounds like this particular "prescreen" may be badly designed, but after a prescreen you'd certainly expect to talk to more technically-literate and -competent people.


This well illustrates the problem with "CS Trivia" style interviewing. It's billed as "exploring your background and how you think, not quizzing you for specific answers," but almost always devolves into an adversarial ego-measuring contest to see who can recite the most trivia from memory.

Much of the content is minutiae that a working software engineer doesn't ever need to know and would just look up in the rare case where they did (how many bytes in a MAC address? Seriously?).

It's particularly bad when, as here, the candidate knows more than the interviewer does. The interviewee gives exact, nuanced answers while the interviewer only knows approximately correct answers that don't quite match. The interviewer then feels that their ego is vulnerable, that they are facing a serious potential competitor for social status at the company. They then summarily reject the candidate's correct answers as a defense mechanism.


I took this exact same test for an SRE (site reliability engineer) position at Google earlier this year.

I failed in the same way the author did, but the recruiter helped me so much that I could proceed to the next level.

Obviously she was aware that this test was broken and that candidates would occasionally pass it by sheer luck.

Other later interviews were equally broken in the same ways.


The irony of "Director" roles is that they're supposed to provide "Direction".

This requires gently pushing Engineers out of their comfort zones and into new territory.

Often a Director's specialized knowledge comes at the expense of defying conventional norms.

Filtering for candidates that can answer CS trivia will just get you Directors that know trivia.


Wow, that's especially terrible. The engineering interview pre-screen I had at Amazon was better than that. Can't imagine why a company would do this for ANY technical position, much less such a senior one.

I did have an HR pre-screen one time where they sent me a SQL literacy test (it did not specify the SQL dialect). One of the multiple-choice questions had all correct answers, in different SQL dialects, with no "all of the above" option. I corrected the test. I did not get a call back.


Google and all the other mega-tech corps do a similar tech phone screen where they're reading off an answer sheet. (powers of 2, sort these functions, linux kernel calls, etc).

Sounds like you dodged a bullet; this sort of thing speaks volumes about how the company is run internally and how you'd be treated there.



Perhaps they are looking for a pointy-haired-boss type rather than someone who actually knows his thing.


Sorry, you got too many right, you're not right for this role!

Actually, come to think of it, I did have an interview experience like that, just a few years ago. At the time, my title was VP of engineering, but for a small startup. A bigger company was interested in hiring me for a VP-level (or maybe director level, I can't remember) position, and phone screened me. After about ten minutes of talking, he said, "are you sure you wouldn't be more interested in our technical architecture team? You sound like a good fit for our technical architecture team".


Thing is, mortals like me know very little about workplace psychology compared to large-scale operations such as Google. I have an inkling of an idea because I was repeatedly subjected to the same problematic environment too many times in my 10-year career as a simple software developer with no lead titles.

Off the top of my head:

There is always someone incompetent getting paid more than you. I think this means they are willing to spend money on someone else so you work harder to reach his/her status like a horse going after a carrot it will never reach.

There is always some kind of boss who will not indulge in technical details in order to understand how long a project would really take before pitching the idea to a customer. This is their way of "getting things done" or "making hard things a reality".


He seems to be pretty proud of GWAN, but it doesn't appear to have stood up well to HN traffic.


Could be his software sucks, or could be that he put it on a single node somewhere on a tiny VPS and he hit some sort of limit by his provider. Not fair to draw any conclusions imo.


Have had similar phone screen experiences for various roles, and sometimes the screener seemed to understand that I had a lot more knowledge/experience than they did, and trusted my explanations, because I'd talk to them some more and explain the answer (their answer) in more detail back to them. Some of them may have thought I was completely BSing them, but a few seemed to grok that I was more than qualified based on the answers and put me through.

Also took a computer-based test at a recruiter once, re: PHP, and it was... 20 questions. One of the questions was wrong, in that the syntax of the question as posted was incorrect. Pointed that out, and got a "thank you, we'll call you" kick out the door.


A "technical" recruiter asked me "how comfortable are you with 'tomcat' programming language". I said I am not at all comfortable with "tomcat" programming language because I have to invent one. He was furious and hung up.


Once in an interview for a web developer role I was asked to list the layers of the OSI model.

As I rattled off the layers the guy looked super surprised and asked how I remembered them and I said "pretty dicks never touch shitty people's assholes."

Needless to say, I got the job.


When I interviewed at Google for an Ads Solutions Engineer position, I was asked some of these questions as well. (I got 8/10 and was pushed to the next round.)

I assume they have a massive list of questions for various positions, with a minimum cutoff (7/10 I'm assuming, as a few of my friends got 6/10 and were not moved forward), and that these are asked of all applicants without consideration of experience/skill/previous work (as well explained by Max Howell [0]).

[0] https://twitter.com/mxcl/status/608682016205344768?lang=en


Are they hiring a reference manual or a director of engineering?


But can he invert a binary tree?


He failed in the first sentence:

> I started coding 37 years ago (I was 11 years old)

they wrote him off right there


The Google person's response to the answer in question 5 would have pretty much ended the call for me, with a "can I talk to an actual engineer, as you are doing poor keyword searches on my answers". I do admit to hearing the "I will stab you" voice in my head on that one[1]. This reflects really badly on Google.

Plus the answer on question 8 is the best worded answer to using big-O improperly I have seen.

1) http://www.urbandictionary.com/define.php?term=I%20will%20st...


If this is true, then it's not surprising as usually the person asking the question is not up to speed on the details of the question being asked.

I recall a question: how do you stop a socket from overflowing?

I said: what do you mean? If you write to a socket and it can't accept the write, the write will fail. Sockets do not have any option to stop this.

Naturally I failed the question, but I asked for the correct answer and was told the answer is simple: just use a cache.

Nice answer, except sockets do not have caches.

Correct answer: add a cache to the socket to stop it failing. But that was not the question.

While that answer is obvious, it was not the question asked!!


I would have stopped you half way through. You obviously know enough to move on.


Sometimes I wonder if this approach has hurt Google's ability to innovate.


Nah. Having 57,000 employees did that.


Interviewing is a two way street. You have to be high enough in the food chain to be able to reject companies. Otherwise, you are going to just face such meaningless things over and over.


I'm surprised when people are surprised that Google's interview process is obnoxious. That's pretty well established by this point, both by Google employees and others.


There are literally quite a few people in here, including Googlers, who can't believe Google asks these questions.

(I don't necessarily believe the interview went down as reported, but the questions and the interviewer using a sheet with approved answers definitely are genuine)


Knowing the right answer, or when "it depends", is not the expectation. You need to get inside the mind of the interviewer and figure out what he/she thinks the right answer is, or guess what's probably written on the answer sheet. Think: "What answer would a know-it-all 23-year-old Stanford graduate give?" That, and not necessarily the right answer, is what will get you the job offer.

This is also called "cultural fit".


I was with you until the end. Instead of cultural fit, I would call it "a stupid way to hire".


Well, I had a similar experience in my last (out of 9) interview for a SWE position, and that interviewer was simply not accepting answers other than his, even though they were sometimes better. Welp, I will start my interviewing marathon at Google again next week, hoping that this time I meet the right interviewers (which is the only way you can actually get in, not by knowing stuff).


I interviewed with Google about 10 or so years ago for a manager post. I was not selected after the 3rd phone interview.

The potential job was about predicting the number of servers required, or something like that. I was recommended by a friend's friend who apparently worked there. Google contacted me the day they saw my resume and did two quick phone interviews verifying my basic background.

In the third interview, which was highly technical, they asked a bunch of techie things, including what an inode was. Whatever answer I gave about inodes was apparently OK, as the interviewer proceeded to ask me how I would repair a faulty inode.

Since I did not know the answer, I honestly said I did not know and joked that I would "google" for the answer. My pathetic attempt at humor did not sit well with the interviewer and I was told that I did not pass the current interview.

Fixing inodes or other hardware/software problems is certainly something that needs to be done, but I did not buy the interviewer's assertion that everybody at Google knows how to fix inodes. For making predictions about server needs, failure rates and the time needed to repair failures are just one factor to consider.

Whether a prediction manager needs to know the low-level details of fixing inodes is questionable in my mind. I just assumed that Google interviews a staggering number of people and rejects a large portion of them for the smallest of reasons (like how astronauts are selected). It also seems they hire people for one thing, but that does not preclude them from deploying them in totally different positions. Otherwise there was no reason to ask me techie questions.

What surprised me was how fast they moved, how upfront they were about the overall interview process, and how many of the questions they asked seemed non-essential from my perspective.


I had very similar interviews at Palantir and Yelp, albeit Yelp was understandably just a quick phone screen with a recruiter. However, the "trivia" interview at Palantir came from a forward-deployed engineer on the DC-based team I was interested in working with, and it was one of my last hour-long, onsite interviews of the day. Seemed like a big red flag at the time.


> the "trivia" interview at Palantir ... Seemed like a big red flag at the time.

Besides the bigger red flag that it was Palantir, you mean.


No, at the time Palantir was my top choice at Berkeley, and a lot of other people's too. They spent a ton of money on recruiting my class and I went there for dinner/lunch several times.


Well, I think it's clear that the author is qualified to move beyond this stage of phone screening. So the interviewer was at fault for not passing him, mainly due to a lack of technical knowledge imo, and possibly an incomplete cheat sheet.

That said, I question how he's phrased some of the question/answer pairs. For instance, this is a phone interview so the questioner can't capitalize KILL in the signals question ("what is the name of the KILL signal?"), which makes me wonder how the question was actually phrased. It does strike me that SIGTERM is what the 'kill' command will send by default, which could have been the intent of the question.

He obviously still should have passed, though; all of his answers indicated a good knowledge of the subject, and that's even if you consider them to be ultimately incorrect. I guess it's possible that OP's tone didn't ingratiate him to the interviewer, but that's impossible to say with the information we have.


These are pretty much the questions I got asked on the recruiter screening interview for SRE/SWE.

But I kind of agree with the sentiment of "uhg, srsly, are we still doing this?". I answered a few of those with "not sure, I'd Google that". The recruiter's reply was "but what if Google's down". Well, there's always Bing.


I've seen a lot of these types of blog posts on here, about interview practices at the large tech companies. Because of these posts, I have no interest at all in seeking an interview with Google, Facebook, or Apple.

Maybe these types of interview practices are designed to reduce the applicant pool by scaring people away from even applying.


Pretty sure Google's interview questions are under NDA.. Brave!


This sounds like a phone screen -- I've never signed an NDA for phone screens, or at any stage of the interview process for that matter.


I think the person interviewing in this article should have seen that the person doing the interview was looking for short answers. It is not a perfect interview process, but I am not shocked he did not get it, as you can see his annoyance grow as the interview goes on.

He could have answered, "because Quicksort is N * lg(N)". Instead he opted for a long answer about how it depends on how the algorithm is implemented. It either shows he is completely unfamiliar with big-O, or he chose to give an annoyed answer. He could also have answered that there is a family of algorithms, such as merge sort, which are also N * lg(N).

A bit skeptical of the transcript as well. Knowing a MAC address's size in bytes off the top of your head? Almost every other person would first think: okay, how many hex characters is it, and how many hex characters per byte? But he has this knowledge immediately?


The process seems geared towards hiring people right out of school since they'd have no expectation that such a candidate would be able to answer such questions.

Am I right in thinking that it's easier to become a Googler by starting a company that gets acquired than by submitting to such a broken interview process?


Am I right in thinking that it's easier to become a Googler by starting a company that gets acquired than by submitting to such a broken interview process?

No, not at all. At least it used to be so that when Google acquired a company, they ran all your engineers through their interview process, and if more than half didn't make it, they stopped the acquisition and the deal was off.

The justin.tv/Twitch people blogged about how this effectively made them a billion by stopping an early acquisition until they were worth much more.

(Or at least that is what I remember reading - I can't find the article right now, so maybe it wasn't Twitch)


I assume the easiest way to get into Google/Apple/Facebook is an internship.


I would like the original blog poster to state for a fact that the questions they posed are exactly the ones that were asked. In particular: the quicksort one. It doesn't jibe with my memory of how the question was asked (it was more like "what is the running time of quicksort").


That's pretty similar to what I was asked over the phone in an interview for API support (I forget exactly what the title was) for Google Maps, I believe. I got the SYN, SYN-ACK, ACK answer; questions about sorting algorithms; and stumbled a bit on random Linux questions, even though I use it daily as my desktop OS and have administered various Linux web servers for years without issue. I guess I use the man pages too much. The recruiter didn't cut my phone interview short, but did tell me that the technical interview was going to be WAY harder and that I needed to study my Linux, low-level networking, and algorithms. It was a good interview experience to have, but left me feeling like I'd never have the knowledge to pass a technical interview there.


New theory: maybe it was a psychological test, to see how you would react in a difficult situation.


Did anyone bother to read up on this 'business' of his? I just spent some time in the Wayback Machine[1], and it strikes me as pretty odd: unbreakable encryption systems (which are not disclosed), NSA backdoors everywhere and general paranoia.


Recruiter: wrong, it's SYN, SYN-ACK and ACK; if Google is down you will need to know this to diagnose what the problem is. We will stop here because it's obvious that you don't have the necessary skills to write or review network applications. You should learn the Linux function calls, how the TCP/IP stack works, and what big-O means to eventually qualify if you are interviewed at a later time. Good luck, bye.

Whether he got interviewed for the wrong position by mistake or not -- how can Google begin to think that putting technically illiterate people in charge of "vetting" obviously highly senior people... could ever possibly be a good idea?


Hiring on their scale is generally unsolvable - anything goes.


I do some interviewing for my company sometimes. HR people suck at it, but they don't ask technical questions of any kind.

So it'd be strange to have such deep questions asked by a non technical person for such a high position.

Secondly, I was interviewed by Google once. The interview was great (even though I failed it).

Third, during this interview, I signed a kind of non disclosure.

On the other hand: I was also interviewed by Microsoft. The interviewer was non-technical, and asked at least two tech questions (the difference between a struct and a class in C++, and what volatile means). And, finally, Roy Osherove has a similar tale about Google.


This is hilariously depressing. Makes me never want to bother with the big guys.


This is 75% similar to a phone screen I passed (somewhat surprisingly, in hindsight) in 2003-2004. I don't think this is a director-level test, it's just a coarse engineer-track filter.


Have any current Googlers here had an experience where they knew someone personally, could vouch for their skills, but still could not get them in, due to them not passing the interview process?


This is what happens when a non-technical person conducts a technical interview.

Though I'm surprised this happened for a Google interview. Their recruitment workflow should not have allowed this to occur.


This seems a bit strange. I know some senior Google folks who spend a ton of time interviewing. Could it be that these are maxed out, and that HR people are forced to do the initial screens? In that case it's best to just feed them the simplest accurate answers possible. Similar to taking the SATs.

This type of process does imply something about their recruiting funnel. HR people asking technical questions suggests they've historically passed too many technically deficient people to overworked senior engineers.


How are the TCP/IP stack, MAC addresses and low-level Linux commands "up to date coding skills"?

Not saying they are not important, but you could have asked the same questions 20 years ago.


#9 is especially stupid because it's so context-dependent. SSE4 gives you a popcount instruction, for example, which would be easily the fastest way to do this, if available.
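For illustration, a rough sketch of that approach (assuming GCC/Clang, where __builtin_popcount() compiles to the hardware POPCNT instruction when built with -mpopcnt or -msse4.2, and to a software fallback otherwise; this is my own example, not anything from the interview sheet):

  #include <stdint.h>
  #include <stddef.h>

  /* Count the set bits across an array of 16-bit words.
     __builtin_popcount() is a GCC/Clang builtin; with POPCNT enabled it
     becomes roughly one instruction per word. */
  static size_t count_bits(const uint16_t *a, size_t n)
  {
      size_t total = 0;
      for (size_t i = 0; i < n; i++)
          total += (size_t)__builtin_popcount(a[i]);
      return total;
  }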


Yes, but without that instruction, the algorithm mentioned by the recruiter really is the quickest way. I coded a chess algorithm in the past and this was exactly the method top open-source chess engines used. But imho it is hard to figure that out without prior experience with this problem.


I coded chess algorithm in the past and this was exactly the method top chess open-source engines used.

Your statement is rather vague in time, but for example Stockfish did certainly use the hardware intrinsic at some point. Some of the top closed source engines were using SWAR approaches mixed with loops (when the expected population is 0).

The answer is very dependent on the exact HW architecture and the cache pressure in the surrounding algorithms.


No, the algorithm mentioned by the recruiter is among the slowest ways.

I recently tested different approaches. I've been working on some code that downsamples a large set of 1-bit voxels to get shades of gray on the edges. For that, I had to count gigabytes of those bits as fast as possible.

Advanced manually-vectorized SIMD code worked several times faster, esp. on the hardware that supports SSSE3 or XOP instructions.

And even when the hardware doesn't have SSE4, doesn't have SSSE3, doesn't have XOP, the SSE2-only backup plan is still faster than lookup tables. Here's the code: http://stackoverflow.com/a/17355341/126995


Yep. Went and tried the lookup method against a 5-step parallel shift-and-add method (which is the fastest bitwise way I know of without hardware support), and the lookup is ~5% faster than the bitwise way.

https://gist.github.com/monocasa/1d44a03cbd0170bfffc6a4a5c37...


Your code has 6 shifts, 6 adds/subs and 6 ANDs.

You can do it with 4 shifts, 3 adds, 1 MUL and 4 ANDs.

Your code is simply suboptimal.
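Presumably something like the classic SWAR popcount from Hacker's Delight / the Stanford bit hacks page, which for a 64-bit word uses exactly 4 shifts, 3 adds/subs, 1 multiply and 4 ANDs (my sketch of that well-known routine, not the parent's actual code):

  #include <stdint.h>

  /* Classic SWAR popcount for one 64-bit word.
     Op count: 4 shifts, 3 adds/subs, 1 multiply, 4 ANDs. */
  static uint64_t popcount64(uint64_t x)
  {
      x = x - ((x >> 1) & 0x5555555555555555ULL);                           /* 2-bit sums */
      x = (x & 0x3333333333333333ULL) + ((x >> 2) & 0x3333333333333333ULL); /* 4-bit sums */
      x = (x + (x >> 4)) & 0x0F0F0F0F0F0F0F0FULL;                           /* 8-bit sums */
      return (x * 0x0101010101010101ULL) >> 56;                             /* add the bytes */
  }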


For a 64bit quantity? I'm curious to see your algorithm in actual code.


Which is why you ask follow-up questions instead of just giving the optimal solution for UltraSPARC and rejecting what would be the optimal solution for other CPUs.


If I ran Google, I would just send unsolicited offers for a three-month trial contract to people who fit the profile of a talented software engineer with a friendly personality and who recently started searching for jobs. Yes, I know, "that's why you don't run Google," but I don't think it's that crazy. For example, I may or may not be a good fit, but even if I am, the chances are slim that I'll ever have the confidence to apply.


So… someone care to explain to me where I'm going egregiously wrong, apparently? Because, at least on my machine with the code I spewed out quickly, the Kernighan way of counting bits is ~6x slower than a lookup table (including generating the table itself).

http://sprunge.us/IIEH

The Kernighan-way code seems faster for very sparse arrays (i.e. only one bit set per uint16), but slower otherwise.
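For anyone who hasn't seen the two methods being compared, roughly (my own sketch, not the code at the link above):

  #include <stdint.h>

  /* Kernighan's method: each iteration clears the lowest set bit, so the
     loop runs once per set bit -- cheap on sparse words, costly on dense ones. */
  static int popcount_kernighan(uint16_t x)
  {
      int n = 0;
      while (x) {
          x &= (uint16_t)(x - 1);
          n++;
      }
      return n;
  }

  /* Byte-wise lookup table: constant work per byte, independent of density. */
  static uint8_t bit_table[256];

  static void init_bit_table(void)
  {
      for (int i = 1; i < 256; i++)
          bit_table[i] = (uint8_t)(bit_table[i >> 1] + (i & 1));
  }

  static int popcount_lookup(uint16_t x)
  {
      return bit_table[x & 0xFF] + bit_table[x >> 8];
  }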


You weren't doing anything wrong. He discusses two methods: https://graphics.stanford.edu/~seander/bithacks.html#CountBi... and https://graphics.stanford.edu/~seander/bithacks.html#CountBi... The second option will probably beat a byte lookup table. GCC switched from a lookup table to similar code. If this still isn't fast enough, try using SIMD registers and the Harley/Seal method based on carry-save adders (http://www.hackersdelight.org/hdcodetxt/popArrayHS.c.txt). If you like this sort of bit twiddling, you probably would enjoy Henry Warren's Hacker's Delight.


This wasn't a technical interview, it was an asshole detector. And it worked.

This test was written by somebody who knows the answers as well as this interviewee. It might have even been administered by somebody who is also in on the game. A tech prima donna like this is team cyanide. I once had to escalate to the VP of Engineering to get one such engineer off my project. The entire work atmosphere improved after he was removed.


quicksort is O(n^2) which is definitely not the "best big-O" for sorting.


Quicksort is O(n log n) average and O(n^2) worst case. Heapsort is O(n log n) for both average and worst case.


Its worst case is O(n^2), sure, so heapsort would be better if you know that your data would hit quicksort's worst-case scenario every time.


"how do you count the bits most efficiently?"

What does this even mean?


If you have a collection of data, and you want to know the number of 1 bits, and you want to do it with a minimum of resources... what is the process?

For example, standard bit-shifting and masking the lowest bit to increment a counter is one way to do this. Possibly there are other, faster ways, such as using a lookup table (a byte or more can be "counted" at a time). Of course, because so many people were doing this, Intel added a popcnt instruction which is probably more efficient (faster) than either of the above, at the expense of more CPU real estate, heat, etc.

Turns out counting 1 bits in a dataset is a super-important problem that shows up in a lot of situations.
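The "standard bit-shifting and masking" baseline mentioned above looks roughly like this (a sketch of mine, just to make the comparison concrete):

  #include <stdint.h>

  /* Naive baseline: test the low bit, shift right, repeat --
     one pass per bit position of the 16-bit word. */
  static int popcount_naive(uint16_t x)
  {
      int n = 0;
      for (int i = 0; i < 16; i++) {
          n += x & 1;
          x >>= 1;
      }
      return n;
  }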


What I think he means is counting the number of set (1) bits in the array.


Thanks, the question makes more sense now :)


Quicksort is O(n log n) average case, and O(n^2) on already-sorted arrays.


Just a nitpick: There are versions of quicksort that perform well, O(n*log n), on already sorted data. But there is always some worst case scenario where it can be O(n^2).
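To make the pivot point concrete, a minimal sketch (mine, not anyone's production code): with the pivot taken as the first element, already-sorted input produces maximally unbalanced partitions and O(n^2) behaviour; a random pivot avoids that in expectation, though any deterministic scheme still has some worst-case input.

  #include <stdlib.h>

  /* Minimal quicksort on ints (Lomuto partition). Replace the random pivot
     with `lo` to see the O(n^2) behaviour on already-sorted input. */
  static void quicksort(int *a, int lo, int hi)
  {
      if (lo >= hi)
          return;
      int p = lo + rand() % (hi - lo + 1);   /* random pivot index */
      int t = a[lo]; a[lo] = a[p]; a[p] = t; /* move pivot to the front */
      int pivot = a[lo];
      int i = lo;
      for (int j = lo + 1; j <= hi; j++) {
          if (a[j] < pivot) {
              i++;
              t = a[i]; a[i] = a[j]; a[j] = t;
          }
      }
      t = a[lo]; a[lo] = a[i]; a[i] = t;     /* pivot into its final slot */
      quicksort(a, lo, i - 1);
      quicksort(a, i + 1, hi);
  }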


On #9, is it 16-bit or 64-bit values? Sounds like you may have been answering a different question. Isn't the standard answer to use a 65536 entry lookup table?

Edit: oh, are you reinterpreting the array of 10000 16-bit integers as 2500 64-bit integers? But then what operation do you use on each?

Either way, if you find yourself arguing with the recruiter, it's probably a bad sign.
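A sketch of that 65536-entry variant, just for concreteness (my own illustration; whether it actually wins over a byte table or SWAR depends on how much cache you can spare, as other comments note):

  #include <stdint.h>
  #include <stddef.h>

  /* One lookup per 16-bit word, at the cost of a 64 KB table. */
  static uint8_t bits16[65536];

  static void init_bits16(void)
  {
      for (uint32_t i = 1; i < 65536; i++)
          bits16[i] = (uint8_t)(bits16[i >> 1] + (i & 1));
  }

  static size_t count_bits_table(const uint16_t *a, size_t n) /* n = 10000 in the question */
  {
      size_t total = 0;
      for (size_t i = 0; i < n; i++)
          total += bits16[a[i]];
      return total;
  }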


I thought the reason it was generally accepted that the 8-bit lookup table is optimal is because it can fit in the L1 cache.


(Very) embedded developer speaking here. What's this L1 cache you speak of?


It's level 1 CPU cache, the fastest (and smallest) cache.

It caches both instructions and data.

https://en.wikipedia.org/wiki/CPU_cache


It was a joke. Some embedded processors lack a cache.


GP is making the point that not all CPUs have a cache.


> Isn't the standard answer to use a 65536 entry lookup table?

Yes, it is, but is it really such common knowledge?


Thinking more about this, it's also possible that the candidate failed at understanding what the screener wanted, and giving it to them. If the question was "what signal does the kill command send by default" and the candidate hears "what is the name of the kill signal", that's an indicator about important communication skills.


I think he took his website offline.

You can read the cached version here: http://webcache.googleusercontent.com/search?q=cache:53TICuY...


It's not, it just got overloaded. Guess G-WAN can't handle Hacker News. Funny that.

In case you missed it: G-WAN is software he is selling, which is supposedly two or four times faster than everything, including nginx and Varnish Cache.


You can fire this Director of Engineering.


Their loss.

I am glad you kept a log of all the questions. I think the problem exists with a lot of tech companies out there. In my short career, I have been asked to solve mathematical problems, which is absolutely fine, but I write web apps that are not even remotely related and don't perform any complex calculations.


I was recently asked in an interview (for a position as a programmer at a 3D printer startup): "When we have to update our software, we have to update thousands of machines. How do you see that reflected in the _code_ of our _error recovery_?"

Am I stupid, or is that a hard to answer question?


> Google pagerank: the ultra-secret mathematical formula demonstrating that sponsored search results rank higher than reality can.

I am not sure why he was rejected, but this sentence makes it clear that he isn't mature enough to be a director, so I doubt that Google misses him.



Q: What is 2+2?

  Me: 4
  Recruiter: Wrong.  2 + 2 = 5 for very large values of 2.


Wow, I need to apply for a director job at Google. I knew about 8/10 things to the level that the recruiter knew about them with my 20+ year old C knowledge. The guy who responded was way better but obviously overqualified!


This is exactly on par with my experience with them -- I was interviewing and a particular engineer just wouldn't let me move forward with "linked list" because he wanted me to say "queue".


Both of them come off as really obnoxious. I highly doubt this is real though.


Am I the only one who thinks that this is a terrible way to hire any engineer, Director or not? I might have hung up after the first 3 questions, saying "this does not sound like a good fit to me".


They need to get rid of that recruiter. He/she is obviously not educated enough in computer science to understand the meaning of what is being said - only the literal answer.


Perhaps the context of the interview is that you must answer using no more than three words or numbers. In this context defining a hashmap as a fingerprint or an inode as metadata is ok.


I believe the guy just talked with one of the non-tech recruiters from the agencies, not an actual Google-employed recruiter.

There are a bunch of those incompetent recruiters playing smartass...


Looks like the page is down. Is there an archived copy somewhere?


Archived version: http://archive.is/2Fj58


Another google HR "experiment" that obviously failed...


(un?)fortunately big companies like these have enough resources to have failed experiments..


Well, assuming the transcript is even remotely correct, they should start doing phone screens using algorithms. I'd guess they would outperform this recruiter.


Good to see that recruiting is broken for big positions too.


Had similar questions for an SRE role, but yes, I didn't like the fact that I had to answer these over the phone to a recruiter. Also, I was asked what 2^16 is.


I would be extremely offended if I was in his position and an HR-bot (both literally and figuratively) wasted my time with an interview process like this.


While there is a notion that devs are in demand, a company like Google has so many candidates that they can be dicks to people and get away with it.


He's confused an inode (file object) with an inode number (the index). But who cares? This is not a useful interview question.

His confusion carries over to the stat question, because he's still thinking an inode is just an index. That said, I wouldn't describe stat() as returning an inode, either... it fills a stat struct. That's all. Inodes can have more or less information than is present in the standard stat structure.
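For what it's worth, a small sketch of that distinction (the path is just an example of mine): stat() takes a filename and fills a caller-supplied struct stat with metadata fields, one of which happens to be the inode number.

  #include <sys/stat.h>
  #include <stdio.h>

  int main(void)
  {
      struct stat st;
      /* stat() fills `st` with file metadata (largely drawn from the inode). */
      if (stat("/etc/hosts", &st) == 0)
          printf("inode %llu, size %lld bytes, mode %o\n",
                 (unsigned long long)st.st_ino,
                 (long long)st.st_size,
                 (unsigned)st.st_mode);
      return 0;
  }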

If this is how Google interviews director candidates, they've really gone downhill since I last tried to interview there in 2011 or so.


Received a phone interview for the UK Google Maps team. Interviewer couldn't pronounce 'spatial'. My heart sank.

spat-e-all. spat-e-all. \o/


Looks like the other side was just a Turing machine.


It was not the answers but the cultural fit ehehe...


Looks like the recruiter failed the interview here.


Feels like overqualification for that position.


I've interviewed at google for a lower position, and recruiters don't do phone screens or ask technical questions.


If I were interviewing him and he gave those answers I'd definitely give him a second round and try to get him in.


This is epic. Why would Google not just give a written test, and not use a recruiter for this? I do not see the point.


"Powered by G-WAN" but that is down.

Based on how the interview went, Google might be on the "down" path too.


I had a Google phone interview around 2008. I remember a question asking how much 2^5 is (without a calculator :).


In summary: "Recruiter: that's not the answer I have on my sheet of paper."


> What is the opposite function of malloc() in C?

Google uses C++. Do they use malloc/free?


Google makes extensive use of third party libraries. Some are in C.


I really hope this isn't real.


I once coded a chess-playing algorithm for fun, and can confirm that the recruiter was correct on #9: you count bits by using a lookup table and then sum the results. It's the quickest way. But I am not sure if it is possible to figure this out immediately without such experience...


Ah, the appeal to authority.

It depends, but generally speaking, you are wrong and OP is right that you'd want to benchmark on the actual architecture.

a) First of all, you're probably basing your answer off of experience with 64-bit popcounts. But note the question was about popcounting multiple 16-bit words, not single 64-bit words. This isn't typically what you do in a chess program. b) The table has a cache footprint and can be pushed out of L1, which kills that approach in many real programs. c) Modern CPUs have a POPCOUNT instruction. It's slow and limited to one port on most Intel machines, though, so not necessarily always a win either. d) Lacking POPCOUNT, and with cache pressure, the SWAR approaches are good, especially if you can compute multiple results at once. With AVX2 it becomes especially interesting. e) If the expectation is that many of the numbers are zero, a simple loop will win.


>"Lacking POPCOUNT, and with cache pressure, the SWAR approaches are good, especially if you can compute multiple results at once"

Can you explain what a SWAR approach is?


I recently tested different approaches. I've been working on some code that downsamples a large set of 1-bit voxels to get shades of gray on the edges. For that, I had to count gigabytes of those bits as fast as possible.

Advanced manually-vectorized SIMD code worked several times faster than lookup, esp. on hardware that supports SSSE3 or XOP instructions.

And even when the hardware doesn't have SSE4, doesn't have SSSE3, doesn't have XOP, the SSE2-only backup plan is still faster than lookup tables. Here's the code: http://stackoverflow.com/a/17355341/126995


can you really confirm that? I'd like to see your code.


>that's what happens when AI bots discover recreational drugs. LOL


This entire transcript reads like a metaphor for democracy.


The best solution to this problem is multiple choice.


This is the best example of an unintentional DoS attack; now I cannot read it anymore :(


for this level, isn't the process itself a bit humiliating?


looks like he needs a better web server for hosting his blog :-)


anyone have a mirror ?



thank you


Can anyone access parent site? Seems we have given it the HN kiss of death.


Site's down?


http://archive.is/2Fj58

ctrl-f for "down" next time :P


Offered without comment:

The site has been flaky, if not down, for hours now.


aaaand website is offline 100%


I'm reminded of a story about Einstein in the 1930s. The fad at the time was to have a "skills test" which was supposed to place you perfectly according to your results.

Einstein took the test, and did terribly. Everyone was shocked, and asked him what was going on. His response was "The questions tested memorization. Why would I remember what the capital of North Dakota is? I can look that up in a book!"

Perhaps unsurprisingly, the tests faded out of popularity.

It's sad that 100 years later, even Google hasn't learned that lesson.


Perhaps, as of yet, Google isn't living fully by their own principles ("automating themselves out of a job"), maybe because the tech isn't quite there yet or maybe for other reasons. It remains to be seen if the ultimate solution to bureaucracy is AI/ML or if it will just be a further extension of these same mechanisms.

The problem with simple "skills tests" is they lack depth, and of necessity due to time, always will. The problem with performance tests is the opposite: they may take too long to reach a conclusion in human terms, during which time significant money is lost.

Recruiters seem like a great candidate for replacement by AI. [edit: and I should add, most middle managers too]


This is very hard to believe. In most big tech companies technical interviews are conducted by technical people, not recruiters. Especially for a senior role like Director of Eng. the experience described here stretches credulity.

Also most of these companies have policies which don't allow any feedback to be given on interview performance. In light of that the recruiter saying "you don't have necessary skills" is extremely surprising.


Many companies do a pre-interview screen performed by recruiters to filter obviously bad candidates. I have personally helped construct such screens -- though not at google.

Ensuring the recruiter understands the difference between an answer over their head and a wrong answer is crucial. What we did was to make sure the recruiters understood that answers they didn't understand needed to be surfaced to someone in engineering for a sanity check.

Often our strongest candidates would give an answer which recruiting wasn't capable of vetting. The key is making everyone aware that the answers may not fit the script, and that this is OK. Recruiters shouldn't be judging the technical details of an answer; they should be looking for "gosh I don't know I've never seen that."

Pre-screening is sadly necessary when doing high volume recruiting. There are a LOT of people out there who grossly inflate their competency.


It saddens me slightly how far down the page your comment is, as that's exactly what's going on here. I've also (and also not at Google) had a hand in constructing these. It's a nightmare on the hiring side, too.

We tried very hard to tailor our questions such that the candidate, if they knew the answer, would only give one exact answer. Ours included a download of sample data to operate on; one was a question along the lines of "count the number of lines in any .h files that contain the following pattern <human description of the pattern>"; the answer ends up being a single integer.

My experience has been, overwhelmingly, that recruiters hiring for technical positions are incredibly non-technical. This wouldn't necessarily be a bad thing, except for, as mentioned, the huge amount of winnowing required to just find candidates that are suitable to advance to a phone screen with an engineer.

> There are a LOT of people out there who grossly inflate their competency.

Exactly this.


As stated elsewhere, these questions have been very typical for what I'd call 'first level phone screen' for SRE positions at Google.

I interview with Google every couple of years for fun; I always learn a lot and have a lot of fun with the on-site interviews.

The recruiter does indeed ask these kinds of questions. I've been asked most of the ones mentioned here, some of them multiple times.

I find it extremely hard to believe that the answers the recruiter had in front of them were wrong.

So to be clear, how it has worked every time I've done the dance with Google is:

1. An initial, non-technical recruiter chat.

2. The recruiter gives me a series of technical questions with extremely clear-cut answers, as described in the article.

3. A phone screen with someone technical.

4. A second technical phone screen.

5. On-site, 5-6 50 minute interviews.


What were the 5 different subjects of those 60-minute interviews?


This wasn't a technical interview. This was a phone screen. As dumb as it is, having non-technical recruiters do a "technical" phone screen like this is increasingly common. I don't think this practice is completely justifiable, but part of the root cause is the enormous volume of grossly underqualified people who apply for any particular position. It takes a hiring company a large amount of work to vet a candidate but it takes a candidate nearly 0 effort to apply for a job.


I've never interviewed for director-level positions, but I have definitely received a few technical questions from recruiters who didn't seem technical. The questions were generally easy.

I still think it's a bad practice, though. If you absolutely insist on doing so, multiple choice is the right way to go.


It's standard practice at the Googles and Facebooks of the world to have a recruiter lob a bunch of canned questions at you during a phone screen, including Director-level phone screens. My experience was a lot different than the writer's, however. The recruiter was friendly and I was friendly back. I think I missed one or two but still went on to the next round. This guy comes across sounding like a pompous asshole and I wouldn't want him as my Director of Engineering, either.


It's just a phone screen. I got some of these questions from a Google recruiter on a phone screen, for an SRE manager position. Actually the recruiter asked me if I prefer questions focusing on system or on complexity and general CS topics, and I chose the latter. After passing it I got to a phone interview from an engineer, which had more questions and some coding. But this sounds about right for an initial screening.


"4. sort the time taken by: CPU register read, disk seek, context switch, system memory read. Me: CPU register read, system memory read, context switch, disk seek. Recruiter: right."

This is the exact question I got in a phone screen (although mine included CPU caches so was actually harder) for the lowest level SRE position at Google. That and the obvious lack of knowledge of their interviewer, who I'd expect to know these answers inside and out, point to Google lowering the bar extensively not only on their interviewing practices (where it certainly was never high as in quality) but on the actual quality of their hires. Any SRE could be Director of Engineering at Google, apparently, going by this test. I'd say that equates to not having a bar at all.


"Recruiter: wrong, it's SYN, SYN-ACK and ACK; if Google is down you will need to know this to diagnose what the problem is. We will stop here because it's obvious that you don't have the necessary skills to write or review network applications. "

Because when Google is down, it's typically upper management that fixes network issues. CEO level to directors of engineering are on pager duty most nights there.


May I assume this is tongue in cheek?


Absolutely not. The reason so many Google upper management types sport beards (case in point, Sundar Pichai), is the chronic lack of time from working nights as data-center technicians/network engineers. How else can you explain Google's up-time and low-latency? :)


Right. Who has time to shave when you're busy examining network traffic?


Just a general comment, again anecdotal. There seems to be a trend of recruiters performing bait-and-switch in SV currently.

For example, you may be hired for a technical role, but you end up doing manual work or bash script maintenance.

Another tactic (this one is pretty hilarious) is that hiring managers will continuously interview new people for the same job description, without any intention of hiring them, in order to keep the slot open. Sad, but true.


This is weird - do big companies actually ask stupid questions I can google, and let non-technical people lecture engineers on Big-O?

Hope this obviously talented person finds more meaningful employment elsewhere.


> do big companies actually ask stupid questions I can google

Well, he was interviewing _for_ google. Maybe they wrote up the test before google was online and forgot to update it.


I feel sudden brick wall awaits in a recruiter career path once the google people read that.

Not only because of that interview, but because the question then becomes: who else among the great candidates have we missed...


Having had some experience with big companies, I doubt that. They'll probably just write this guy off as being 'non googley' or whatever, and keep on keeping on.

(assuming the interview really went down this way. Which I have a hard time believing)


Doubt it.

This is a pretty standard Google first interview. Nothing out of the ordinary.

The recruiter will not suffer for following standard procedure.


Maybe they set unrealistic expectations to see if the applicant lies about meeting them and give wrong answers to see if the person acts like a smartass.


I think this is a very equitable process. If a medium-functioning retarded person wouldn't hire you, that's saved the company like $100 of time to ascertain your qualifications. This is much better than using biased metrics like problem solving sessions or personal assessments of character from people who will be working with them.


This entire blog post does not cast the author in a favourable light.

First off: It is rude to post the questions and expected answers, perhaps moreso when you just failed to pass the test yourself. That reeks of sabotage out of spite.

Then pulls the argument from authority, aggrandizing his accomplishments in an effort to salvage some of his obviously bruised ego.

> Is Google raising the bar too high or is their recruiting staff seriously lacking the skills they are supposed to rate?

Seems like Google raised the bar just high enough to weed out the unpleasant personalities. A job interview, even a phone screening, is not all about technical skill, it is also about soft skills: Is this person a pleasant and communicative person?

> Recruiter: wrong, it's file metadata.

> Me: the inode is an index uniquely identifying ...

Here is where the interview (or at least, this is probably how the author remembered it) went off the rails. Arguing with the recruiter and trying to right a perceived wrong was just plain bad strategy.

From there on out it starts being argumentative, and you can see the recruiter having none of it.

> Recruiter: Quicksort has the best big-O.

Here you can clearly tell the recruiter is just fishing for "correct" answers at this point. Just tell the recruiter what the big-O is here; don't be obtuse or difficult.

> Why not let me compare my code to yours in a benchmark?

Here it turns into a code-measuring contest. Not a very bright idea. By this point in the interview I find this author's personality very grating and needlessly argumentative. He can't do anything right by now.

> We will stop here because it's obvious that you don't have the necessary skills to write or review network applications.

The recruiter seems to agree with me, but I doubt this is what was said exactly. The conclusion may be similar though.

> Maybe Google should have stated that practice is not necessary for the job.

Maybe Google does know best what is necessary for the job. Or maybe the author knows, but he is too curt in getting his views across. Not a favourable trait.

It is hard to not get salty over a failed job interview. It takes strength of character to not burn bridges and try again in the future with a more pleasant humble approach. Take it as a lesson and move on.


>>Recruiter: I have to check that you know the right answers.

This is classic: the right answers!! Must be Google getting infested with managers who don't have a clue about what programming and computer science are.


I'm feeling a bit confident looking at the questions. Woo.


It seems easier to get Google to fund and then acquire your startup than to get hired there.


[flagged]


So apart from giving the right answers, you now have to intelligently predict when to give wrong or suboptimal answers to account for the inadequacy of the interviewer? Not that I'm blaming the recruiter here because he/she was just trying to do his/her job, but come on. If I were in his shoes, I would be pissed off not because of failing to land the job but because of being told that I gave the "wrong answers" when they were clearly right.


As a director are you not required to be an interface between the technical and non-technical?

If you can't dumb it down, you're just going to waste people's time.


Dumbing stuff down is not the same as giving the exact answer on some cheat sheet you've never seen.


Dumbing it down and dumbing it down to precisely the words written on a piece of paper are different things. If the "correct" answer were simply any explanation that the interviewer understood, that would be entirely different and considerably more reasonable.


I think he was mildly amused.


Not saying that this "ceremony" is right in the first place though. But it is easy to prepare for and go through.


I am doing neither of those things. I said it was tempting, because it puts things I've read about G-WAN into perspective (the claims I saw some time back when I was shown a heads-up versus nginx were questionable, and it's an interesting data point). That's why it's an unrelated addendum, and it's completely unrelated to the blog post at all.

I have no desire to discredit someone I have never met and whose name I do not know, much like I have no desire to have my intentions explained to me by a Hacker News commenter. I wrote, quite clearly, that I wasn't doing something. To directly assert that I am in fact doing that thing and then ascribe further malice to it is to challenge my honesty and integrity, and I'd appreciate if you'd not do that in the future because you've never met me and know nothing about me.

There is an interpretation of my first bullet that would support your conclusion, but I only put down my first bullet to establish relevance in the comment, not to connect the two things.


We detached this subthread from https://news.ycombinator.com/item?id=12704822 and marked it off-topic.

Please don't take any more HN threads into tangents of drama. I know it's annoying to be accused of something you don't feel you did, but this subthread is the kind of noise we all need to avoid here.


I never said anything about your integrity but you inferred from my comment just like others will do from yours.

The difference is the individual you're disparaging is a real person with a reputation. You, like me, are a throw away account on a message board. You have no integrity because you have no identity.


I wouldn't call an account almost three years old a "throwaway" anymore.


Underwrite his mortgage and ask him to be the godfather of your firstborn.


Looks like he really did hit a sore nerve.


The repeated charge that I am somehow disparaging and discrediting an individual was my clue that I should probably disengage. I probably should have worded the first bullet differently, in hindsight, but still.


Well, the problem here is apparently one of human relations. You are absolutely, 100% remiss to expect that, when you state "it is tempting to draw that conclusion," you are actually stopping short of drawing that conclusion. It's as bad a communication mistake as expecting Quicksort to perform well in all cases is - which is to say, it simply does not conform to reality.

Raising the possibility of the accusation and defining the basis for raising the accusation are, for almost all intents and purposes, exactly the same as simple raising the accusation, especially in an internet forum, where nuance, body language, and tone are absent.

Thus, you really didn't avoid actually disparaging or discrediting this person there - instead, you attempted it via an obtuse use of a 'sneaky' method, and you bear deserved downvotes for doing so. If that was not your intent, you may look on this experience as a bug - the language you used did not communicate your intent to your discussion partners. It's almost always valuable to gain a deeper understanding of the functions you're using, though, whether they're from English or C++! Have a great day, and talk to you some other time! :)


>The difference is the individual you're disparaging is a real person with a reputation.

So? If the criticism is invalid, then it's inoffensive. If it's valid, it's deserved.


False I think.

Invalid criticism posted on the Internet or spoken out loud might very well negatively affect someone's life even if they don't personally care.


You tell that to anyone who has ever been falsely accused of anything, and ask them if the invalidity of the accusation led to the experience of being accused, questioned, tried, and then found innocent being describable with the word "inoffensive."

(Then multiply whatever result you get from that by the propensity of the internet to hear an accusation and judge it to be true WITHOUT actually researching it/'putting it on trial', so to speak, and post your answer here, I'm sure it will make for fascinating reading! ;) )



Do you really think there's no such thing as a simple observation without intent? I'm not allowed to find it odd that a Web site about a Web server operated by the same Web server went down under a Web load, and remark upon it yet stop short of drawing or stating a conclusion because I don't have all the facts? How many times do I have to say that I have no intent, here? Apply Occam and cui bono. I don't even bloody know who the person is. What is my incentive to disparage him or her? What do I gain from that? Why would I resort to questionable, politically-charged rhetoric to tear someone down who I literally didn't know existed four hours ago? Is it really more likely that I'm out to get him or her?

Maybe the datacenter burned to the ground. I don't know everything, so I'm not going to conclude what happened, just that I find it odd the site has been hard down for several hours now. It's interesting, and it's oddly characteristic of this community to infer that I have malicious intent simply for observing something and finding it interesting.

There are many unkind interpretations of the blog post, and I felt I did a pretty good job with restraint in that section even before the later addendum. I didn't have a lot of sympathy. I didn't accuse the author of lying, of making shit up, or of any sort of malicious behavior, even before the evil addendum that everybody hates (and many, many non-gray comments nearby have done just that). Why would I suddenly change gears and attempt to destroy a reputation?

I am aware people are inundated with rhetoric like this in several forms of media due to the current political climate and other factors, but Jesus, people really need to put their knives away and start challenging their assumptions of the worst in people, or we are all royally fucked. That's letting the rhetorical climate win. Occam: either I'm a shady person not-so-subtly and rather ham-fistedly deploying rhetorical tactics to destroy someone's reputation simply because they blogged about the Google interview process, or I'm just a random dude typing things as I find them interesting. Your pick.


See, now, the thing is, Occam's razor - trained as a heuristic on my experience of the internet - would absolutely, 100% tell me that you were being a mean, stupid, malicious sack of excrement in this case, because ALL of the reasons you state re: "What reason would I have to destroy this person's reputation?" are ones I've heard time and time again as excuses from someone who turned out to be a troll destroying somebody for fun.

And that is the baseline Occam on the internet: the simplest explanation with the fewest assumptions is that the person on the other end is a troll who is destroying someone or something because destroying things is fun. Nobody needs any reason, on the internet, to want someone to go down in flames. Seriously, I have to make many more assumptions to assume good faith on your part than otherwise - the only reason I would bother applying the other heuristic, that is, not attributing to malice what can be explained by incompetence, would be that it isn't worth the skin I'd lose off my typing fingers to engage you. You seem interesting and worth engaging, which is why I'd tell you things like, "raising an accusation is functionally similar to making the accusation."


lol, site down. nice scale


lol, site down. nice engineering work!


Google doesn't ask these types of trivia questions. I don't believe that you were talking to a person from Google one bit.

Everything documented about their hiring practices, and all the anecdotal evidence from people who have gone through their interview process, indicates that you talk with a real engineer and write code during your interview.


I took a phone screen interview for SRE (site reliability engineer) at Google Dublin and the questions at the screen test were the same, almost to the word.

I don't know this guy but the questions he's talking about are genuine Google screen questions.

Questions in the later interviews (that is, if you pass the initial screen) are more complex, and involve actual coding or longer problem solving.


The phone screen is like that, albeit in my case with an interviewer who could adapt a lot more, even though he wasn't technical himself.


It does. I have already said it in another thread, but I was asked the exact same questions at Google (applying for a Java developer job).

Several other people have come out and said the same.



