Google's “Director of Engineering” Hiring Test (gwan.com)
1764 points by fatihky on Oct 13, 2016 | 923 comments



FWIW: As a director of engineering for Google, who interviews other directors of engineering for Google, none of these are on or related to the "director of engineering" interview guidelines or sheets.

These are bog standard SWE-SRE questions (particularly, SRE) at some companies, so my guess is he was really being evaluated for a normal SWE-SRE position.

I.e., maybe he applied to a position labeled director of engineering, but they decided to interview him for a different level/job instead.

But it's super-strange even then (I've literally reviewed thousands of hiring packets, phone screens, etc., and this is ... out there. I'm not as familiar with SRE hiring practices, admittedly, though I've reviewed enough SRE candidates to know what kind of questions they ask).

As for the answers themselves, i always take "transcripts" of interviews (or anything else) with a grain of salt, as there are always two sides to every story.

Particularly, when one side presents something that makes the other side look like a blithering idiot, the likelihood it's 100% accurate is, historically, "not great".


This looks like a typical pre-interview recruiter phone screen… they're looking for shibboleths that identify the candidate as a genuine computer person who took CS 101, and exclude candidates who spam every job with bogus CVs. I'd start every candidate with this screen, unless I personally knew them & was familiar with their technical ability.

  > none of these are on or related to the "director of engineering" interview guidelines or sheets
They'd be internal to recruiting, so you wouldn't see them unless you were heavily involved (doing interviews and recruiting trips isn't being heavily involved). They're for any recruiter to use to quickly eliminate bogus applicants.

  > Particularly, when one side presents something that makes the other side look like a blithering idiot, the
  > likelihood it's 100% accurate is, historically, "not great".
You can just outright call him a liar… I'd expect this to be a fairly accurate report. It looks like the recruiter misused the screen; instead of eliminating obviously bogus candidates, they used it to eliminate a candidate who may or may not get an offer (and thus a commission). They should have proceeded to the technical phone screen stage. If the guideline on the recruiter screening is: drop anyone with <100% correct, then I think that's silly.


I'd hope it's not too typical, since four out of the ten official answers are wrong, and even one of the questions manages to be wrong. (Specifically, the "why is quicksort the best?" is just completely ridiculous.)

It's one thing to blindly apply a simple questionnaire without thinking about the answers that come back, and quite another to do it with a questionnaire that doesn't even get stuff right.


I wouldn't be surprised if the recruiter just googled to find a list of questions and answers. This candidate probably isn't even on any official radar. The recruiter probably just uses this as a means to evaluate candidates before they officially call dibs on them. Google could very well be different since they do many things differently but recruiting has always been a sales position with everything it comes with, namely leads, quotas, conversions, etc.


I don't think that's what happened. The questions look too familiar to me, and I've been through the SRE-SWE interview process which is what the top-level comment talks about.


Maybe it's the whole "better to have false negatives than false positives" philosophy Google espouses?


The problem is, once you have a crud ton of false negatives, people stop wanting to apply to work for you, especially when you get excluded via junk like this. And every false negative that posts about it online... well, this post is at +1363 right now?


That's part of it, but the other part is what you mentioned earlier--leads, quotas, conversions, and don't forget diversity initiatives, inexperienced recruiters, and the fact that the first part of the funnel has to be dirt cheap to work at scale.

For what it's worth, lots of other companies seem to use almost the exact same process.


At one point at least, physical security and recruitment were about the only contractors Google used.


They do seem typical; the recruiter asked many of those same questions when I interviewed for an SRE position back in about 2004. I particularly enjoyed the bit-count question; I went back later and confirmed that the bit-twiddling approach was faster on the machines I had handy. Large lookup tables have poor cache behavior.
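
The two approaches can be sketched as follows; this is a rough illustration in Python, not the actual benchmark or machines from back then:

```python
# Two ways to count set bits ("population count") in a 32-bit word.

def popcount_twiddle(x: int) -> int:
    """Classic branch-free bit twiddling (Hamming weight)."""
    x = x - ((x >> 1) & 0x55555555)
    x = (x & 0x33333333) + ((x >> 2) & 0x33333333)
    x = (x + (x >> 4)) & 0x0F0F0F0F
    # The C original relies on 32-bit overflow, so mask before shifting.
    return ((x * 0x01010101) & 0xFFFFFFFF) >> 24

# 8-bit lookup table: one entry per byte value. Fast per lookup, but the
# table itself competes for cache, which is the drawback mentioned above.
TABLE = [bin(i).count("1") for i in range(256)]

def popcount_table(x: int) -> int:
    return (TABLE[x & 0xFF] + TABLE[(x >> 8) & 0xFF]
            + TABLE[(x >> 16) & 0xFF] + TABLE[(x >> 24) & 0xFF])

print(popcount_twiddle(0xDEADBEEF))  # → 24
print(popcount_table(0xDEADBEEF))    # → 24
```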

On the other hand, the recruiter did not tell me whether I got the right answers (or whether I missed any). It was pretty clearly a scripted initial phone screen with someone who wasn't a programmer.

Oh, and they didn't ask anything truly ridiculous like the QuickSort question.


I was asked the "what syscall returns an inode?" question (and agree with DannyBee that this is extremely similar to my successful SRE screen) and I answered stat() without the clarification because I understood what the screener was doing and the parameters in which she was operating. That context on how phone screens work is missing from this transcript, but it's also unfair to expect that sort of context from a candidate because not every interviewee is familiar with the standard Google style tech interview.

The screener is the wrong person to walk down Pedantry Lane or Hexadecimal Packets Street and that's the sort of thing you save for the actual interview. But yes, I agree that it's shitty that the incentive is to answer for the test instead of the exact truth. (I wasn't extremely supportive of the interviewee once he turned slightly sarcastic and rattled off hexadecimal bytes instead of just saying "SYN" and "ACK," though.)
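
For reference, stat() is indeed the call that returns inode metadata; a quick sketch (any existing path works, "." used here for convenience):

```python
import os

# os.stat() wraps the stat() syscall; the returned struct carries the
# inode metadata, including the inode number itself (st_ino).
info = os.stat(".")
print(info.st_ino > 0)    # True on typical Unix filesystems
print(info.st_size >= 0)  # size, mode, timestamps, etc. come from the inode
```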

As an unrelated addendum, I'm intrigued by the following four things:

    1) The author wrote a Web server and framework, G-WAN
    2) I've seen G-WAN advertise itself questionably in the past w/r/t perf
    3) gwan.com is powered by G-WAN
    4) Under Hacker News load, the entire gwan.com domain is hard down
I'm not drawing a conclusion, but it is tempting.


No software can perform better than the hardware allows. End of argument. Even if you write 100% optimized ASM tailored to the hardware and workload, you can still kill it hard with enough requests. For all we know, he hosts the website on the free tier of whatever, to show how well it performs. It being down doesn't tell you anything useful other than that the current workload exceeds its capacity.


> I wasn't extremely supportive of the interviewee once he turned slightly sarcastic and rattled off hexadecimal bytes instead of just saying "SYN" and "ACK," though.

His response:

> in hexadecimal: 0x02, 0x12, 0x10 – literally "synchronize" and "acknowledge".

What do you think SYN and ACK stand for? Could it be "synchronize" and "acknowledge"? Moreover his point that knowing the bytes is more useful when you're looking at packet dumps is valid.


The messages contain a lot more than the flags, though, so those bytes aren't enough; and he didn't mention SYN-ACK.


They're bits. SYN can be represented as 0x02, ACK can be represented as 0x10. 0x02 BITWISE-OR 0x10, ie SYN BITWISE-OR ACK or 'SYN-ACK' colloquially, is 0x12.

"in hexadecimal: 0x02, 0x12, 0x10".
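
Spelling out the arithmetic in that comment (these are only the flag-byte values; a real TCP segment carries far more than this one byte):

```python
# TCP flag bits as they appear in the flags byte of the TCP header.
SYN = 0x02
ACK = 0x10

# The second handshake message sets both flags: "SYN-ACK".
print(hex(SYN | ACK))  # → 0x12
```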


Those are just flags, the message contains much more than the flag. Therefore it is wrong to say that they just send the flags and that the flags are equivalent to SYN and ACK.


Although I can see why it's tempting to poke fun at this situation, his site could be down for any number of reasons not directly related to the quality of the code he wrote for G-WAN.


Even if I give you that, it's still 30% of the official answers that are wrong, and 10% of the questions.


> I'm not drawing a conclusion, but it is tempting.

Except that you did draw a conclusion.


You are drawing a conclusion and you're making it public to discredit this person.


It is true that gwan.com is (still) down.


"They'd be internal to recruiting, so you wouldn't see them unless you were heavily involved (doing interviews and recruiting trips isn't being heavily involved)."

Actually, this is a super-bad assumption: pre-screening questions, etc., are all public to Google internally. There are no magic internal-to-recruiting parts to the questions, and they are, in fact, listed as SRE pre-screening questions, so ...


But they appear to be changed in subtle ways from what's listed on other sites. For example, googling for: Google SRE interview questions inode

returns a few hits, including:

https://www.glassdoor.com/Interview/Linux-system-call-for-in...

which lists the question as "system call for inode data" - which is importantly different from a system call to return an actual inode.

This post says something similar: http://gregbo.livejournal.com/182506.html

"There were some questions I just didn't remember the answers to, such as "What system call gives all the information about an inode?" and "What are the fields in an inode?""

(Argh, the blog post is down, so I can't compare some of the others, but several of them seemed to have been changed in ways that made the question itself seem wrong.)

((Thanks to leetrout below for bopping me on the head with the google cache. Next bit added thanks to said bop.))

Another one: The blog post lists "what is the name of the KILL signal?", but googling for: google sre interview questions kill signal

turns up this post on glassdoor: https://www.glassdoor.com/Interview/site-reliability-enginee...

Which lists the question as "What signal does the "kill" command send by default?"

That is a much better match for the SIGTERM answer the interviewer was described as insisting on.

That suggests a few likely possibilities: (a) The interviewer misread the questions; (b) There was a horrible communication failure between the interviewer and the interviewee; (c) The interviewee failed to actually listen to the questions before answering.

I have no information with which to assign weights to those possibilities, but all of them seem more likely than "the interview questions themselves are actually this horrible" (they're not as broken as the blog post made them out to be. After writing this, I looked.)
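
The distinction behind that question matters because kill(1) with no argument sends SIGTERM, not SIGKILL. A small Python sketch of the difference (Unix only):

```python
import os
import signal

caught = []
# SIGTERM (what `kill <pid>` sends by default) can be caught by a handler.
signal.signal(signal.SIGTERM, lambda signum, frame: caught.append(signum))
os.kill(os.getpid(), signal.SIGTERM)  # deliver SIGTERM to ourselves

print(caught == [signal.SIGTERM])
# SIGKILL (kill -9) cannot be caught, blocked, or ignored at all;
# trying signal.signal(signal.SIGKILL, ...) raises OSError.
```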


I got asked some of these questions literally yesterday by Google and you're right. The wording was how you've presented it, not how he did.



Anyone asking a question should be qualified to interpret the answers in context. If they don't, use a multiple choice quiz or something instead. These questions/answers are just ludicrously out of sync.


> They'd be internal to recruiting

I managed to find them and I don't work in recruiting, they are for SRE pre-screens. The guy misunderstood most of the questions which is why he failed and then worded them incorrectly on his blog, it wasn't the fault of the questions or the interviewer.


The recruiter misunderstood the answers, and/or he is not qualified to ask those questions. Usually, when you ask someone something that is not literal, as in "what is 1+1?", you can't expect them to be literal. Take a question like "Why is quicksort the best sorting method?" The guy gave a very good answer showing that he has perspective and is able to make a valid argument. The recruiter, on the other hand, just read from the paper and completely disregarded the fact that the person he is trying to recruit is a valid candidate.

Also, at the end, the recruiter's attitude was awful. Like, what is that? He was reading answers from a paper, couldn't make valid arguments back to the candidate, and at the end turns around and says "sorry, my paper says you don't know this and that, goodbye"?


No, I meant that the interviewee misunderstood the questions/answers given by the recruiter and thus misrepresented them when he wrote the blogpost. Since he couldn't even get the questions right I highly doubt that he gave a correct representation of the recruiters attitude as well.


Which of the following seems more likely?

A recruiter who was already giving the guy the wrong interview, and whose job revolves essentially around HR and sales, made mistakes in asking a series of technical questions.

An expert with decades of relevant technical experience misunderstands and confuses basic networking and system topics.


Which of the following seems more likely?

A person fails to read a question verbatim.

A person who has been the "smartest person in the room" for decades has an inflated view of his fluency on a topic and makes mistakes in his favor when he tries to reconstruct the questions from memory.


That's a big claim. Can you share some of the actual wording of the questions?


If you work at Google you can find them by searching for it.


Maybe not such a good idea to post an internal link? :)


Do you have proof of this?


So you're saying Google's recruiters don't tell what position they are interviewing for and that they found a 20+ years experienced engineering manager holding patents on computer networking under-qualified for an ordinary site maintenance position. Well, that sounds like a dumb recruitment process.


> they found a 20+ years experienced engineering manager holding patents on computer networking under-qualified for an ordinary site maintenance position.

To be fair, I've interviewed people at previous companies that had patents and 15 years at IBM on their CV and completely failed even the most basic system / coding questions. (fizzbuzz style).

There are a lot of people that read great on the CV but then it turns out that they mostly kept a chair warm and organized meetings over the last decade without actually retaining any technical knowledge.

Not saying that was the case here, but it happens and it's probably worth checking people on their stated qualifications.


Perhaps that suggests you're giving them the wrong interview.


Well, general interviewing (unrelated to tech) contains various amounts of "are you lying on your resume" type questions. If someone walks in with a breakdown of 10 years dev, 5 years management, they should be able to at least comfortably answer system/coding type questions. As in, if you do something every day for 10 years, you don't forget all of it in 5.

I had a candidate in a few months ago that was interviewing for Software Development Manager, so he got an initial phone screen and then a face-to-face with myself and another dev on the team he'd be managing. I was impressed with how little he knew about programming.

"Name some data structures." "What does MVC stand for?" "Name some design patterns" etc. All of which were unanswerable. Generally when it becomes clear someone was dishonest about their skillset, the ability to get hired for any position becomes impossible.


How can you not know what MVC stands for? It's pretty much a buzzword!


I mean, yeah, 99% of candidates should know what that means because it is an extremely common initialism. Although, I could see how some engineer who worked on networking drivers for 10 years might not be up to date on the design patterns of frontend engineering.


That's exactly what happened to me. I was stuck in the embedded-systems world right out of college, and then one day I took an interview with Google; they were asking me questions clearly looking to hear "MVC" in my answers, but I just didn't know it back then...


Not all programming/engineering circles use the same buzzwords. For five years my mobile development groups used the concept without the acronym.


Agreed. I interviewed a few QA candidates at a previous company that used a term completely differently than we did. When I rephrased the question from defining the term to "what kind of test would you run in this situation" I got the kind of answers I would expect. It's far more important that a candidate understands the concepts needed to solve a problem, than that they have memorized a term.

Hell, someone could be able to define MVC and explain how you would use it, but have no idea how to actually implement something using it for a given programming language.


Even then it's worth remembering not every MVC is the same. Fat/slim models. Intelligent/simple views. There's lots of approaches to even a well known paradigm.


Knowing proper terminology is necessary in order to stay up to date with developments in your professional subject area. If concept X is an established concept in your professional area of expertise, and you don't even know that its name is concept X, then you likely have not read much about concept X, and consequently, are likely not up to date with current developments related to concept X. This isn't just semantics, it's about professional literacy.


I'm sure a lot of people know what MVC stands for. I don't think there's anyone on the planet who can be sure they know what it really means.


Right. Everyone and their mum knows what M and V stand for, right? Now .. C? C is tricky. Please don't ask any further questions about C, will you?


I'm not sure if this is a joke. Model and View are really clear. Controller I usually find munged in with the View and it's not always profitable or clarifying to separate it.


My kick-out questions:

"Could you write out what an HTTP request and response looks like on the board?"

I'm really surprised at how many people can't do this. If you've spent five years developing web, surely you've had to look at raw requests, either debugging using netcat or with wireshark or just looking at the information in the Chrome/Firefox debugger?

"What's the difference between a GET and a POST request?"

"What is the difference between a statically typed and a dynamically typed language?"

I had one candidate try to tell me Java was dynamically typed and Scala was statically typed. It was for a Scala position. They also said "statistically typed" instead of statically, even after I corrected them.

-_-
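
If it helps to make that question concrete, here's the distinction in a nutshell, with Python standing in for the dynamic side and Scala/Java for the static side:

```python
# Dynamic typing: the type belongs to the value and is checked at run time.
x = "hello"
x = 42                  # rebinding to a different type is fine in Python

try:
    x.upper()           # ints have no .upper(); discovered only when run
except AttributeError:
    print("caught at runtime")

# In a statically typed language the type belongs to the variable:
# a Scala `val x: String = "hello"` could never be rebound to an Int,
# and calling .upper() on an Int would be rejected at compile time.
```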


I believe 90% of my coworkers and former coworkers would be unable to answer the HTTP response question.

And 95% haven't used netcat or wireshark. I wouldn't have either, if it wasn't for some particular work related to messaging.

They're able to develop reasonable line of business websites in spite of that.

I would be extremely worried if they were unable to answer about the difference between GET or POST, or the difference between statically and dynamically typed languages, so I agree with those.


I basically lived in Wireshark for a couple of years working for a VoIP company, and I still use stuff like curl all the time, and I don't think I could walk through an HTTP request off the top of my head.


GET / HTTP/1.1\r\n and some kind of sensible response is not too much to expect someone to know. HTTP is super easy, and I see the HTTP transaction test as "did you ever get curious as to how exactly a core part of the current Internet actually works". I'm sure that there are app developers out there who can spin CRUD stuff all day and have no idea about this, just as there are curious people who couldn't stand up todomvc to save their life, but in general, all of the most talented people I've worked with knew their stuff front to back, and had at least a few areas of expertise.

CGI is also cool to learn about the workings of, since it almost seems too simple.


> I see the HTTP transaction test as "did you ever get curious as to how exactly a core part of the current Internet actually works".

Sure I did.

Then I forgot most of the details because they didn't matter, and I knew I could look them up quickly if I ever needed to write a HTTP client/server for some reason.


You would've gotten it wrong though!

You need two CRLFs (a blank line) to finish the request, plus the HTTP/1.1 standard requires clients to send a Host: header with every request.

Not saying every interviewer would care about that in an early screening process.


If you answer "GET / HTTP/1.1\r\n", I'm going to ask you if you left anything out.

Because you did: after that you have to provide a Host: <hostname> header.


But could you describe the general structure?

Yeah, expecting people to be able to write out a complete HTTP request from memory, without a reference to look at, is a lot to ask. But the general structure of an HTTP request is something so basic to web development that asking what it looks like isn't an unreasonable expectation.

Request line (method, URI), header(s), empty line, body...
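
A minimal sketch of that structure (hypothetical host; just enough to satisfy an HTTP/1.1 server):

```python
# Request: request line, headers, blank line, then an optional body.
request = (
    "GET /index.html HTTP/1.1\r\n"  # method, URI, version
    "Host: example.com\r\n"         # Host is mandatory in HTTP/1.1
    "Connection: close\r\n"
    "\r\n"                          # the blank line that ends the headers
)

# The response mirrors it: status line, headers, blank line, body.
response = "HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok"

status_line = response.split("\r\n")[0]
print(status_line)  # → HTTP/1.1 200 OK
```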


There is some truth to the saying that some people don't have ten years experience, they have two years of experience five times in a row.

Wireshark or tcpdump is a power tool, and learning to use one does show whether you gained experience with the lower levels or stayed at requirements-and-tests. (Not necessarily bad, but a good "fork" to jump off from.)


The number of web devs I've encountered who regard my ability to talk HTTP over telnet as black magic makes me sad.


In all fairness, if you can talk normal HTTP 1.1 over telnet with some service, someone configured TLS wrong ;) And if you can talk HTTPS over telnet unassisted... well, I am truly impressed.


There's always openssl s_client


Yep, and when devs watch me key in the s_client pipe to OpenSSL to dump the cert info it's like I've become Neo and entered bullet time. I guess they don't need to know this stuff, but trying to do things like editing a hosts file in OSX, flushing dns, opening an incognito tab, looking at the SSL cert through the GUI, manually comparing it to one in an editor....vs a pretty short one-liner with curl or OpenSSL and a diff....I guess I'm either biased or lazy. I also almost never get asked WTF I just did, just a "wow, thanks" at most.

A previous employer had a sysadmin wiki. We call it Devops now, but I really liked working with the plain-text files of Dokuwiki there. Confluence is good for some things, but as a notebook of shell snippets and when to use them, it's not great.


> I also almost never get asked WTF I just did, just a "wow, thanks" at most.

Asking "wtf did you just do" is responsible for probably 1/5 to 1/4 of the professional knowledge I have today. It's sad that many people ignore that people will often teach you their little tricks if you ask.


What? HTTP 1.1 doesn't require TLS.


Scala is statically typed.


> "Could you write out what an HTTP request and response looks like on the board?"

Why should anyone remember what an HTTP request or response looks like? Statically typed vs. dynamically typed languages?

Fuck.

Are these entry-level positions or for someone with 10 years of work experience? A simple Google search can tell anyone the answer to these questions, so why do you expect people to carry an imprint of them in their memory? If the problem they'll work on mandates knowing these things, it'd be pretty easy to solve with just one search. It is exactly questions like these that are worth kicking the host organization back in the butt for.

Either your interviewing process is hilariously stupid or you're just spiking it up to boost the ego here.


Static versus dynamic typing is so fundamental that I don't see how a programmer could be remotely competent without having been exposed to those concepts enough to have internalized them. It would be like an accountant not knowing what the number 4 is. Yes, you can look it up, but if you need to then how did you ever get this far?


OK, ask me that question about defining the difference and I'll argue with the question, and back up my argument with examples of how type systems are far more of a spectrum of different cases than a stark static/dynamic binary.

And then your non-engineer phone screener who's expecting the answer to match the scripted sheet will conclude that I don't know this "fundamental" thing and thus am unqualified.


This isn't a question a non-engineer phone screener can ask. Coming up with first pass filters that don't require an engineer to interpret is harder.


This is proposed as a question that an engineer would ask, not some base-level screener.


Which would be true, or rather: over-qualified.


> It would be like an accountant not knowing what the number 4 is.

It's a hypothetical no-go! Every person, even a fourth grader, knows the number 4. So why ask a question that measures their ability to remember 4, say 4, or show that they know 4?

> I don't see how a programmer could be remotely competent without having been exposed…

Share this link with them:

http://stackoverflow.com/questions/1517582/what-is-the-diffe...

Invest in people and people will invest back in your business. The interview process I follow at my workplace has just one goal: to assess whether or not it'd be great to work with this person and spend over ~50 hours per week with them.


Are you hiring fun people who know nothing about computers? Or are there actually more criteria than you let on here?


> hiring fun people…

Absolutely! This is super super important. Fun to work with, not annoying to waste time with.

> know nothing about computers

It's sad that you think this way of people who couldn't answer your questions at the expected level.

> Or are there actually more criteria than you let on here?

Yes! One way to know if they're any good or not suitable is by giving them a problem statement like so:

'Design X; feel free to choose a language that's suitable for this problem', and then maybe proceed to hint with: 'You might want to look at the advantages of static versus dynamic typing'… and then let them ask whatever questions they want, or read up, or search, or start implementing.

Observe what they do -- how fast they can get to a decision about what language and why, and how they'd make X (breakdown of steps), or whether they can dive in and start building X right there. Note, if they had theoretical knowledge of what you seek during an interview, it will naturally work to their advantage. Or sometimes not.

Of course, this process may not work for you as it does for us -- therefore seeking direct answers about static vs dynamic language may not be such a bad question after all (I get it), but expecting people to accurately remember what an http request or its response looks like may not be fruitful at all. It can throw good people off guard and ruin the rest of the interview for them.


Well, following Netflix's mantra - it is a team, not a family, that you are hiring for. Anyone can Google and find answers; that doesn't mean you would hire everyone, does it?

There are a number of basic items that a competent programmer needs to know off the top of his head. If they had to google for every single item, then their productivity goes down the drain and so does the entire team's productivity. You should fix your hiring.


It is also a ridiculously dogmatic question. Many people believe in the fallacy that static typing makes programs safer, for example, and expect to hear that somewhere in the answer.


How is it dogmatic? Sure, there's a lot of dogma around which one is better, but simply explaining what each one is and what's objectively different about them isn't remotely dogmatic.


Could you explain how static typing makes less safe programs?


I have yet to see a large statically typed program that didn't -- somewhere -- run into the limits of static typing and contain a set of workarounds using void* or the linguistic equivalent. That's code a dynamic language doesn't need.

The only code you can be sure isn't buggy is code that doesn't exist.


void* is usually not a symptom of the limits of static typing, but of the limits of the [type system] design or the human brain. You can think of it as "OK, I give up. Anything can be passed here, proceed at your own risk; the compiler will not save you here, and errors will show up at runtime". Even the memory-safe Rust does not do without such unsafe blocks. In dynamically typed languages, though, that is everywhere.

I have said this before: the safety benefits of static typing show up when you are working with at least data structures, not simple variables. Imagine you have an external endpoint or library call that is specified to return a single object and does exactly that. At some time after release, you are the maintenance programmer responsible for implementing spec changes:

  * The object returned no longer has member/property x, it is obtained by other means;
  * The endpoint returns list of such objects.
How sure are you that tests in a dynamic language cover these cases? My experience shows that tests very rarely get designed to anticipate data changes, because data is driving test design. Which is more likely for a test: a) to test whether the object returned contains keys x, y and z; or b) to check whether the object returned is_list() (see appendix)? Static typing covers such cases. Static typing is not something that magically saves people from shooting themselves in the foot, but it is nevertheless a safety tool that CAN be used. It is of course a burden if one does not intend to use it, and that is the core of the debate.

Fun thing: in the second case, if your code manages to convert the input list to a map and assigns one returned object to a key that coincides with the removed property, and map access looks syntactically the same as property access (a very specific set of assumptions, though), the bug can butterfly quite deep into the code before manifesting :)
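
A toy version of that maintenance scenario (the endpoint functions here are hypothetical), showing how the list-returning spec change surfaces only at run time in a dynamic language:

```python
# Old spec: the endpoint returns a single object.
def old_endpoint():
    return {"x": 1, "y": 2}

# New spec: the same endpoint now returns a list of such objects.
def new_endpoint():
    return [{"y": 2}, {"y": 3}]

# Consumer written against the old spec. Nothing flags the mismatch
# until this code path actually runs against the new data.
def consumer(payload):
    return payload["y"]

print(consumer(old_endpoint()))   # → 2
try:
    consumer(new_endpoint())      # indexing a list with a string key
except TypeError:
    print("caught only at runtime")
# A static type checker would reject consumer(new_endpoint()) before
# the program ran, since the declared return type changed.
```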


Static typing is basically a bunch of free type-based unit tests. You can write safer programs in dynamic languages, but you need to write and maintain a lot more tests.


You can't compare static + N tests, vs not static with M > N tests.

Compare static with N tests, vs not static with N tests. In what case would the not static be safer?


If the type system is not expressive enough and you have to get around it?

The claim that dynamically typed languages allow code to more closely follow the business logic has merit. And you could follow from that to claim that the type system could be causing more bugs (i.e., less safe).


That's a preposterous attitude. Just imagine if we took a similar approach to hiring for other kinds of jobs:

"OK, so you'd like to work here as a mechanic. What's the difference between automatic and manual transmission?"

"It's not fair to expect me to know that off the top of my head. If I need to know, I'll just do a Google search."


"OK, so you'd like to work here as a mechanic. What's the difference between automatic and manual transmission?"

Nope, more like "can you write out on the board what types of connectors are used in the car cooling system and in what order".


That's a crap comparison.

You're gonna have a hard time drawing a comparison between a line of work where you build things and one where you fix things.


Ok, how about "What is the firing order on a Chevy LS1?" ?


I would not want to work for or with him; if he asked me that in an interview, I'd walk out... if he thinks that's what a good computer scientist should know...


I'd be able to write an HTTP request by hand; I've done that quite often. However, I would not expect that to be a common skill. Looking at something, even often, does not in any capacity mean that you would be able to reproduce it from memory.


Exactly. These are memory tests, not ability tests. Beyond a very basic level, memory tests are too random to be useful.

I once aced a geography exam because I happened to read up on the economics of Nigeria just before I took it. By sheer luck, there was a question about Nigeria in the paper.

If I'd read about Zimbabwe instead I'd have been screwed.

Neither possibility provided much insight into my competence as a geographer.

Even if a job spec needs specific knowledge of key facts, you can't generalise from pass/fail memory questions to broad spectrum competence, or lack of it.

If a candidate has no idea what an HTTP request is, that's one thing. If they know damn well what a request is but can't list all the elements in a stressful interview while you're staring at them - because in fact they spent the last year working on database code, and the API stuff was the year before that - that's something else entirely.


Why is it important to know the difference between statically and dynamically typed languages? If one writes in only one of those (or one set), it is not important to him/her and doesn't specifically make him/her a worse programmer in that particular language.


Knowing the cursory difference between a statically and dynamically typed language in this day and age is not an unreasonable requirement for many developer positions, especially web development, where you're often using a mix of languages.

As always, this sort of question is a test of competence by proxy and there are usually outliers, but statistically speaking, I think you'll find a very high correlation between inept programmers and people who don't know the difference.


If they didn't know it, I'd want to dig down into whether they understand the specifics of their particular language at least.

I actually just sat down in a meeting with a dozen programmers, some of them with decades of experience, and half of them didn't know what functional programming was.


And the half that didn't know were much worse programmers than those that knew?

Your example shows that not every programmer has to know that.


General domain knowledge. We do Scala and a little bit of Python here and there. You should know the difference to show you're well rounded. Senior devs should have some experience in both types.

I have a follow up question, "What are the advantages of a dynamically typed language over a statically typed one?"

This one kinda exposes the "Java zealot" side of programming. If you love Scala and you're applying for a Scala position, you don't often think like this. Being able to think critically about the things that are harder in Scala, that would be easier in a language without strict type checking, is another good way to gauge if people can think critically.


I agree. Why the hell would you ask someone at that level basic questions like FizzBuzz? It's absurd. I also tend to shy away from asking coding questions in interviews; they don't tell me much about aptitude for critical thinking and culture fit. Skills can be taught, but culture is much harder. ... That's not to say you shouldn't throw in some questions that prove they are actually competent - just be casual about it.


I think coding questions are really important. You see their logic flow. Now stupid coding questions (in a list, find all the number pairs that add up to another number in the list) are terrible. They're complex and even good programmers need time to think about them. Fibonacci is one that people expect, so they look up all the variations and you get people who are good test takers (would ace a GRE/MCAT) but not good designers.

You want a simple question that isn't common, but that shows how they break down a problem under stress. Example: you have an input with paragraphs at 80 characters. Write a function to return the same paragraphs wrapped to 40 characters. You cannot break a word and must maintain paragraphs.
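A sketch of one possible answer in Python, assuming paragraphs are separated by blank lines (a detail the candidate would need to confirm with the interviewer):

```python
def rewrap(text, width=40):
    """Re-wrap each paragraph to the given width without breaking
    words. Paragraphs are assumed to be separated by blank lines."""
    out = []
    for para in text.split("\n\n"):
        lines, line = [], ""
        for word in para.split():
            # Start a new line if adding the word would overflow.
            if line and len(line) + 1 + len(word) > width:
                lines.append(line)
                line = word
            else:
                line = f"{line} {word}" if line else word
        if line:
            lines.append(line)
        out.append("\n".join(lines))
    return "\n\n".join(out)
```

(The standard library's `textwrap` does the same job; writing it by hand is the point of the exercise.)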

Great design questions: a word problem (You have an autoshop with staff and customers. Customers can own multiple cars. A staff member gets assigned to a car with a work order...) .. draw an ER diagram. This is actually a pretty low-stress question. It should be straightforward. If someone draws a terrible ER diagram with lists in tables and no normalization, or unnecessary relationships (or you have to keep asking them to label 1-to-n/n-to-1 relationships and they struggle), you know they're not going to be good at designing database schemas.

Another great general knowledge question: "A user types a web address into a web browser and hits enter. Describe what happens. Go into as much detail as you can." This gives people a chance to elaborate as much as they can. People can talk about DNS, HTTP request/response, cookies, load balancers, web apps vs static content...

Questions need to be geared to the job. You don't ask someone to draw an ER diagram if they're being hired to rack servers and set up VMware. Likewise, you don't ask a web developer to write a function to do matrix multiplication.


I often ask the web browser one and find it quite illuminating. Best answer so far started with something like "Well, there's a microswitch in the keyboard if it's a decent one, and a circuit that debounces the input - err, is it a USB keyboard or a PS/2 one? Hmmm... How long do I have to answer this question?" THAT is the guy you want to hire...


Last time I got that question, I started with nerve impulses.


Still not enough: If you wish to explain a web-request, you must first invent the universe.


"Describe an HTTP request? Tricky.... What's the mass of the electron in this hypothetical?"


I did actually start my answer to that one with 'Look, I'm just going to skip over the microcontroller in the keyboard and the USB protocol --- is that OK?' and was met with a calm, 'That's fine.'


I feel like there is a happy medium there.


> In a list, find all the number pairs that add up to another number in the list.

You're saying this is a bad question because it's too complicated? Am I missing something? It really doesn't seem more complicated than the paragraph question to me, but maybe I'm having a brain lapse.
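For what it's worth, a straightforward brute-force take isn't much code (assuming "pairs" means two distinct positions and the target can be any element of the list - exactly the ambiguities a candidate would have to clarify first):

```python
def pairs_summing_to_member(nums):
    """Return value pairs from distinct positions (i < j)
    whose sum is itself an element of nums."""
    members = set(nums)  # O(1) membership tests
    return [(nums[i], nums[j])
            for i in range(len(nums))
            for j in range(i + 1, len(nums))
            if nums[i] + nums[j] in members]

print(pairs_summing_to_member([1, 2, 3, 5]))  # [(1, 2), (2, 3)]
```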


The entirety of HN seems to have something against competitive coding.


My company has been giving FizzBuzz to students applying for internships, in any language they wish, with extra credit for style points.

The results speak for themselves. All the good applicants do it in no time, without hesitation, and give a perfect answer, usually with some style points on top. The ones who have second grade coding skills always have something wrong with it.

It's a good 5 minute test whether someone can code or not. It shouldn't be the only test, of course.


How do you know the people failing your interview process have "second grade coding skills"? The fundamental challenge with evaluating interviews is that companies don't hire people who flunk interviews - so there is no easy way to reliably measure the false negative rate. Does FizzBuzz ability correlate with coding ability? Maybe, but you'd have to hire people who fail FizzBuzz to definitively answer the question. I know that I use Google extensively at work - interviews don't allow you to use search.


I'm not sure about OP, but there is a tech company that has said it hires people who fail their interviews occasionally to see if their interview process is working. That company is the one that is the subject of this thread.


Can you cite your source? I haven't seen this anywhere.


I haven't read it myself, but I heard that's what Laszlo Bock said in Work Rules.


We actually do let people use Google during our code interviews. They'll use it at work, so why not.

We do watch them work though so if they just copy and paste from stack overflow and they don't understand the problem, it's pretty obvious.


It depends on the questions.

If you require using a real, compiler-correct language in a coding exercise, and the problem is not trivial, then allowing search is more than fair.

But the point of FizzBuzz is that it's such a trivial problem that it really should not require anything more than an understanding of basic programming logic and constructs.

In my (limited) experience, there were instances where the candidate could not even decide on a programming language to use, I told them to use pseudo-code and they still flunked horribly.

Aside from that, FizzBuzz is rarely a dealbreaking task in itself; it tends to correlate pretty well with overall performance. I would be surprised to see someone failing FizzBuzz and excelling in the rest of the interview (once again, in my limited experience).


These were done in recruitment events at universities and the applicants were free to access Google if they wished. Some guys even went to the computer lab to do the assignments on a computer and then return a printout of their code. And we were completely fine with that.

But really, if an applicant needs to google to solve FizzBuzz, they don't have a firm grasp of the fundamentals. You're required to write one loop, a few if/then/elses and understand how the modulo operator works. Our jobs are much more demanding than that.
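For the record, that really is the whole problem - one loop, a few conditionals, and the modulo operator. A sketch in Python:

```python
def fizzbuzz(n):
    """Classic FizzBuzz for 1..n: multiples of 3 become "Fizz",
    multiples of 5 become "Buzz", multiples of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```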


> Why the hell would you ask someone at that level basic questions like fizz buzz?

Because there are people applying for software engineering jobs that still can't answer those questions.


If you have a CS degree and cannot answer this type of question in your language of choice, you simply aren't ready for even a junior position in my opinion.

This type of coding exercise can potentially answer more questions about the candidate in two minutes than 30 minutes of softball questions about the candidate's past experiences.

I think that people who disagree simply haven't done much interviewing or haven't worked on a team with someone who couldn't do much more than copy/paste code from SO.


> I think that people who disagree simply haven't done much interviewing

Absolutely. Last time I went through trying to hire people was about a year ago. Easily 90% of the applicants we saw were completely unqualified. You have to have a way to weed them out.


I don't ask multiple questions like FizzBuzz, but I do ask for FizzBuzz. (I will explain the modulo operator if necessary, because it doesn't come up that often in web development and people may forget about it until prompted.) Everything else about FizzBuzz (loop over a range, use a conditional, define a function, compare, etc.) is so basic you would think you wouldn't need to test it - but then you run into a person with 10 years of experience who can't do it.

It's a (sadly) useful screen. Even more sad when you realize how popular and widespread that particular question is.


The question is, "why wouldn't you?" If the person is competent they will dismiss it in seconds and you can move onto something more interesting.


But this is Director level. You are wasting your time and their time. Far more important things to be factoring in for a director level.


Not being able to answer even a simple coding question with for-loops is a really bad sign, even if the question is "beneath" the candidate's level.

I'd expect any technical candidate to be able to do at least a fizzbuzz-type question.


Now take into account stress, lack of preparation, an environment the person is not used to, syntax patterns that are unusual to them, biases against them, their way of talking, their appearance, etc., and you get people who are good at your kind of interviews, in your biased view. You can only hope they are at least average at their job.


The most beautiful and elegant part of coding is the logic, not how to use a for loop. Any question that can be answered with Google should be forbidden in an interview test. Show him a method and ask how he can improve its performance. Ask an opinion-based question on OO design.

If you're hiring a house builder, you would not ask him what a brick looks like, right?


Yeah, but if you have to get through 100 house-builder interviews and half of them don't know what a brick is, it saves a lot of time, no?


If you're hiring a house builder, you would not ask him what a brick looks like, right?

The problem with this is that a home builder/contractor will have a long list of references, and possibly examples of her work available for examination. Many engineers search for jobs while still employed, so they generally don't include as references co-workers and current managers. Further, if your employer doesn't allow you to open source your work, then you need to do open side projects to have any sort of real resume prospective employers can examine (and this is problematic since your day job may already take more than 40 hours of your time).

So, no, I don't need to ask a contractor if he knows what a brick looks like, but I do need to look at his references, look him up on Angie's List, and post to local message boards about his work. And, of course, I'm not an expert on home building, so it would be unreasonable to ask him questions about carpentry or framing.


I see where you're going, and I generally agree with you, but I think that every programmer should at least be able to FizzBuzz, just as every architect or master home builder should be able to answer the question "which one of these is a brick?"


I think there is no point in assessing anything that doesn't take years to learn. And it's fairly easy for any team to come up with some fundamentals that a candidate should know. There are more important qualities than knowledge, though, as Google-funded research suggests - like empathy.


If the interview distinguishes between people you want to hire and people you don't it's a pretty good interview, surely?


It's fair to discuss in the abstract, but seriously: in the OP, the interview didn't fail him for an ordinary site-maintenance position because he wasn't capable; it failed him because the recruiter was incompetent.

Of course, a sample of one (anecdata), which is most likely the minimum of the distribution, is always the worst way to judge a distribution, but this is still upsetting.


Agree with everything you said... but I find it not applicable to this particular candidate given his answers.

It's possible that he got frustrated, became condescending towards the recruiter, and the recruiter decided to screen him out.

There are plenty of companies who turn down candidates that are false negatives for various reasons. Author should probably not take that personally and just apply again.


> completely failed even the most basic system / coding questions.

But could they at least tell you why quick sort was the best sorting algorithm?


Apart from the fact that it isn't always the best sorting algorithm, its pathological worst case is in fact as bad as bubble sort's worst case (O(n^2), with a higher base operational cost - at least bubble sort's worst case is somewhat cache-friendly). This happens in three cases: 1) all the elements are sorted in descending order, 2) all the elements are sorted in ascending order, or 3) a special case of 1) and 2) combined, where all the elements are equal.
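The quadratic behaviour is easy to demonstrate: a naive quicksort that always picks the first element as pivot performs n(n-1)/2 partition comparisons on already-sorted input (a sketch only - a production quicksort would choose pivots more carefully):

```python
def quicksort(xs, counter):
    """Naive quicksort with the first element as pivot; counter[0]
    tallies one comparison per element partitioned at each level."""
    if len(xs) <= 1:
        return list(xs)
    pivot, rest = xs[0], xs[1:]
    counter[0] += len(rest)
    left = quicksort([x for x in rest if x < pivot], counter)
    right = quicksort([x for x in rest if x >= pivot], counter)
    return left + [pivot] + right

c = [0]
quicksort(list(range(100)), c)
print(c[0])  # 4950 == 100 * 99 / 2: every level re-partitions everything
```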


Because its quick, right?


It also sorts.


Also: It has a cool name. It's much better than 'slowsort'


Agreed.


First, it is definitely standard process to tell him (if they didn't, that's a definite failure). Again, remember you only have one side of the story here.

I like to try to gather facts before assuming things. IE Ready, aim, fire, not fire, ready, aim.

Admittedly more difficult in this case (and certainly, i have no access to it)

Second i'm going to point out a few things:

Experience may translate into wisdom; it may not. Plenty of companies promote people just because they last long enough. So 20 years of management experience may translate into a high-level manager - or it may not!

I hold a bunch of patents too on compilers and other things, it's not indicative of much in terms of skill, because almost anything is patentable.

Lastly, SRE is not an ordinary site maintenance position by any means. I'm not even sure where to begin to correct that. I guess i'd start here: https://landing.google.com/sre/interview/ben-treynor.html

Does this mean this person is under/overqualified/exactly right? I literally have no idea. I just don't think it's as obvious one way or the other.

"Well, that sounds like a dumb recruitment process."

Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and even 3 sentences i wrote on hacker news, seems ... silly.

If you want to do it, okay.

But everyone in this entire thread seems to be making snap judgements without a lot of critical thinking. That makes me believe a lot of people here have a ton of pre-existing biases they are projecting onto this in one direction or the other (and you are, of course, welcome to claim i fall into this category too!)

I almost didn't jump into this discussion because it seems so polarized and rash compared to a lot of others.

I think i'm just going to leave it alone because it's not clear to me the discussion is going to get any more reasonable.


> SRE is not an ordinary site maintenance position by any means

Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process - things I would rather have my high level employees looking up rather than relying on a possibly faulty memory.

> Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and 3 sentences i wrote on hacker news, seems ... silly.

This kind of opinion is not formed in a vacuum. It's formed of the dozens of posts that appear every year about how someone who seems qualified is turned down for spurious reasons like "being unable to reverse a binary tree on a whiteboard". It's what makes this particular post so believable - it fits the stereotype. Even your own developers who post here say "yeah, that's more accurate than inaccurate." Perhaps it wouldn't hurt to "undercover boss" your way through the interview process...

Speaking for myself, and only myself... I turn down all Google recruiters because I know I would not pass Google's interview process. Not because I don't have the skills, but because I don't have a college degree. Because I don't see the return on investment for studying for the next 6 weeks just to pass the interview process, especially when I won't even know if I'm getting a job I'll enjoy.

> I think i'm just going to leave it alone because it's not clear to me the discussion is going to get any more reasonable.

How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?


"Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process - things I would rather have my high level employees looking up rather than relying on a possibly faulty memory. "

This is one reason why i find it super-strange. It's not a set of "high level employee" questions. It's a standard SRE pre-screening.

"How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?"

My view of unreasonable is not about whether there is a problem or not. It's not about the consensus. I don't actually have an opinion myself on the hiring process. If the people i work with on recruiting raise problems, i try to solve them. I have not had trouble recruiting in general. So i haven't formed a strong opinion, even after 11 years. If folks want to decide the process is horrible, okay. If folks want to decide it's great, that's also okay.

But it's unreasonable because it's both super-quick reaction without time to settle and think, and not aimed at anything other than trying to reinforce one view or the other.

Nobody is actually listening to each other, they are just trying to force whatever their view is, good or bad, on others.

So to answer you directly, i don't think pointing out a problem is unreasonable, but that's not my complaint. My complaint is that the actual discussion is not a discussion, but mostly people just arguing on the internet. IE You shouldn't take me saying "unreasonable" as a proxy for "me saying i think their viewpoint is wrong". I just think the mechanism of discussion here is unlikely to yield fruitful results.


> It's a standard SRE pre-screening.

To clarify, I was speaking of your standard SRE hires, whose position you referred to as "not maintenance drones".


I suspect for someone who has failed - rightfully or not - a recruitment exam in this manner, it may in fact be the only cathartic mechanism.


> Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process

I'm not sure what "nitty gritty details" you're talking about here.

As much as some people here think it's impressive knowledge[1] to be able to give the size of an ethernet MAC address without Googling it, that's something that anyone with experience in computer networking ought to know. Not at all because it's useful knowledge, but simply because if you actually spend time looking at network traffic dumps or ARP tables or DHCP configuration or SLAAC assignments you'll be seeing MAC addresses so often that it just becomes obvious. Just like knowing that an IPv4 address is 4 bytes and an IPv6 address 16 bytes. Or that a TCP connection starts with a 3-way SYN/SYN-ACK/ACK handshake.

And the same thing applies to the other questions that look like meaningless details: knowing what an inode is and what syscall returns inode data for a path is something that someone with system-level C programming experience should know. stat(2) is far from being something obscure. Knowing what signal is sent by the kill(1) command is maybe slightly more on the trivia side IMO, but it's still a very well known fact.
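Both of those facts are cheap to check from Python, whose `os.stat` and `signal` modules are thin wrappers over the same interfaces (SIGTERM is signal 15 on Linux; some signal numbers vary by platform):

```python
import os
import signal

# os.stat wraps the stat(2) syscall; st_ino is the file's inode number.
info = os.stat("/")
print("inode of /:", info.st_ino)

# kill(1) sends SIGTERM by default.
print("SIGTERM is signal", int(signal.SIGTERM))
```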

A candidate is most likely not expected to know the answer to all of these questions. But failing in all of the categories is IMO a fairly strong red flag for someone interviewing for SRE, where in general people are usually expected to be comfortable with at least one of {networking, system administration, Linux internals}. In fact, this domain specific knowledge is the biggest differentiator between "standard" SWE and SRE-SWE, even though the lines get blurrier and blurrier.

This also indirectly answers this:

> things I would rather have my high level employees looking up rather than relying on a possibly faulty memory

You would have to be out of touch with the field for quite a while to forget such basic things. Which is likely something that you want to test for in such interviews. To go with a metaphor: if you claim to be a fluent English speaker on your resume, you can't be excused of "faulty memory" if you forget how to conjugate "to be" in the present tense. It's not something you forget easily, and if you did forget you most likely can't say you're fluent anymore.

Disclaimer: I was an SRE at Google for 2.5 years, but I'm not familiar with the early phases of the recruiting process.

[1] https://news.ycombinator.com/item?id=12701486


So, as someone who went through the process and got through it (so is less inclined to hold a grudge):

> Then why ask about the nitty gritty details required by maintenance personnel as part of the screening process - things I would rather have my high level employees looking up rather than relying on a possibly faulty memory.

AIUI you can get easily 5 or more of the pre-screen questions wrong and still proceed to the next stage, depending on your experience and how wrong you are. The point here is not that you know each and every one of those things, but to show that you are, in general, knowledgeable enough to spend Engineer hours on.

And your judgement of these questions is seriously impaired by the fact that they are written down wrong. I assume that the author of this post has written down a rough transcript from memory, and as such it's colored by their own (mis)understanding of the question and whatever faded from memory in the meantime. The questions he wrote down are, at the very least, not verbatim the ones from the checklist given to recruiters (and there is a strong emphasis on reading them out verbatim, so I consider it relatively unlikely that the recruiter didn't do that).

> It's formed of the dozens of posts that appear every year about how someone who seems qualified is turned down for spurious reasons like "being unable to reverse a binary tree on a whiteboard". It's what makes this particular post so believable - it fits the stereotype.

Exactly. You are reading "dozens of posts every year" from disgruntled interviewees who got rejected and are pissed. On the flip side, a quick internet search will tell you that Google gets on the order of millions of applications each year, meaning you don't hear from >99.99% of applicants.

There is also the widely advertised fact that the Google hiring process accepts a high false-negative rate if that also means a very low false-positive rate, so it is to be expected that a good percentage of qualified applicants still get rejected. It is thus also to be expected that you hear from some of them. Meanwhile, again, you are not hearing from the thousands of qualified applicants that do get accepted each year. Because an "I interviewed at Google. It was pleasant, everyone was really nice and they got me a good offer" blog post won't draw a crowd on hacker news, even if it was written.

> How about the responses from your own employees which are pointing out that they see the problem too. Are they being unreasonable?

Let's not ignore the responses from Employees that don't think there is a problem.

From reading this post, I'd say a likely reason for the rejection is that this person wasn't being particularly pleasant. Frankly, he comes off as kind of an arrogant prick. And, as a general rule, engineers at Google, just like everyone else, don't particularly like having unpleasant people on their team. I also believe this post has gotten enough upvotes that someone will look into the situation to see what actually went wrong here.


> they are written down wrong.

Please, feel free to correct the record, then, with the correct screening questions. The proverbial cat is out of the bag, and has gone tearing down the street towards everyone trying to make a buck by "training" hopeful young graduates on how to make it through the Google interview process.

> Because an "I interviewed at Google. It was pleasant, everyone was really nice and they got me a good offer" blog post won't draw a crowd on hacker news

No, it won't. Because it's the tech equivalent of a lottery winner saying they think the lottery system is a fair and equitable way to distribute money.

> Let's not ignore the responses from Employees that don't think there is a problem

Same problem. If you're in, you passed the Google employment lottery, so it's much more interesting (and should be more meaningful to management) when insiders agree that the hiring process has problems.

Now then, of course, so long as directors find that they have plenty of applicants to back fill attrition and grow, they have no reason to think the hiring process is broken; so long as Google is happy hiring not necessarily the best people for the job, but the ones lucky enough to dodge more false negative flags than everyone else. Better to be lucky than good.

All that said, yeah, Google's hiring process works for Google. Coming here, to a conversation started by a crappy screening experience, and expecting respect for a process with so many false negatives is a bit optimistic, though.


> Please, feel free to correct the record, then, with the correct screening questions.

No can do. I actually like my job. And I also like my coworkers and don't want to make their life any harder.

> No, it won't. Because it's the tech equivalent of a lottery winner saying they think the lottery system is a fair and equitable way to distribute money.

The same goes for an "I interviewed at Google. It was pleasant, everyone was really nice, but sadly I didn't get accepted" post.

The fact remains, that you don't read from >99.99% of people. My interview process was very pleasant. I had a bunch of nice conversations about programming and computers with friendly and humorous people.

> Same problem. If you're in, you passed the Google employment lottery, so it's much more interesting (and should be more meaningful to management) when insiders agree that the hiring process has problems.

There are a lot of insiders. With a lot of opinions.

> so long as Google is happy hiring not necessarily the best people for the job, but the ones lucky enough to dodge more false negative flags than everyone else.

Well, the thinking here isn't really "we want strictly the best". That would be a hopeless idea from the get-go. The thinking is "there is a hiring bar that we want people to pass and we want to hire exclusively from above that. We don't care about the sampling of that, as long as we get that". What they end up with is a pretty broad sample of that population. Some (like me probably, tbh) just barely pass the bar, some are the very top. Some other top-people got unfortunately rejected, some other barely passing people too.

So yes. There is indeed no ambition to actually get just the top 100K engineers in the world.

> All that said, yeah, Google's hiring process works for Google. Coming here, to a conversation started by a crappy screening experience, and expecting respect for a process with so many false negatives is a bit optimistic, though.

Well, mostly I (and DannyBee) are just pointing out obvious flaws in the discussion here. Like the obvious self-selection bias and selective reporting. And the also obvious fact that this particular post was written while angry and only represents one side of the story; and that not even accurately.

Secondarily, in these long-winded comment threads on reddit/hackernews/twitter, people seem to usually not even be aware of the goals of the hiring process, and think "look, here, three prominent false negatives" is an actual argument that the process is flawed.


>Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview,

It's not just this guy. There have been others: https://twitter.com/mxcl/status/608682016205344768

There's another measure I use to judge the quality of their hiring process: the output. Namely, the track record of products Google has developed in house in the last 10 years.

I've also heard a few stories about friends applying for a position and being shunted by the hiring process into the hiring funnel for other (plainly unsuitable) positions. When I hear a very specific criticism from two separate places it's hard to stay skeptical.


Yep, those engineers they took on in the last ten years must suck, they've only managed to develop technologies that grew Google's annual revenue from 10 billion dollars in 2006 to 75 billion in 2015. That's the kind of track record that has to make you question the hiring process, right?


You seem to be confusing "I have a smug twitter-sized sound-bite response" for "I have a worthwhile counter-argument".

It's a common failing these days, but you should probably look into getting it fixed.

That said, yes, Google's hiring process is questionable. The Web is full of horror stories from obviously-qualified people who Google passed on, often very early in the process when no engineer had talked to them, and this suggests Google's success is not sustainable so long as that continues. They'll be able to hire fresh CS grads out of Stanford forever with this process, but the experienced/unconventional people they flunk out on the early screens are not going to come to them, and when their current crop of experienced/unconventional engineers retire or take jobs elsewhere, Google's finally going to have to fix this problem and stop pretending that it's better to pass on a thousand highly-qualified candidates than to give one unqualified candidate an on-site. That, or tumble back down into mediocrity.

(which, to be fair, is already mostly the case; Google is largely a mediocre company, with only a couple of externally-visible bright spots of talent or innovation clustered in a couple of particular teams, and otherwise Google runs on inertia and the hope that the 0.1% of interesting stuff they come up with will keep the 99.9% of mediocrity afloat)


I can expand beyond 140 characters if you like. The OP claimed that in the past ten years, as a result of their hiring practices, Google's product output quality has noticeably declined, presumably as compared to the search product on which their name was made, and gmail, which they launched in 2004. And it's easy and fashionable to knock Google because maps is not as good as you remember it used to be, or because they shut down reader, or because plus didn't manage to unseat facebook.

Well, in 2006 Google was a 10 billion dollar search and ad company with a fledgeling email business without a revenue model, who had just bought youtube. In 2008 they shipped a mobile phone operating system. That's now a thirty billion dollar business which has been built up through talent within google. They undermined Microsoft's office monopoly with an online office suite (okay, some acquisitions underpinning that). They have a credible seat at the top table in the cloud market. And they continued to develop their core ad platform to drive more revenue growth.

I've got no particular reason to stand up for Google, they're quite big enough to look after themselves, but the idea that their product flops in the last decade outweigh those product successes, and can be held up as evidence that there is something deeply rotten in their hiring model, seems to be cherrypicking to me. 70% mobile OS share, 70% search share, and 50% of global online ad revenue... that's a pretty good kind of mediocrity.


It's still the case that other than search and ads, most of Google's biggest hits were acquired rather than the result of in-house initiatives (even Google Analytics, which is probably one of their more heavily-relied-on products, was acquired). Google doesn't hire people who will create stuff like Android; they hire people who can pass their interview process, and get new product and service lines mostly through acquiring teams of people who probably can't pass their interviews.

It's also the case that Google is acquiring a reputation for bad interview/hiring processes, and for hiring people who have a Ph.D. in CS and putting them to work on CRUD web apps that any random coding-bootcamp grad could build, since there's just not enough interesting in-house work to keep all those top talents occupied.


Google internally-initiated successful products that come to mind: Cloud (2nd or 3rd in market, lots of revenue and growth), Play Store (also lots of revenue and growth), TPU chip, SDN, Photos, Chrome, ChromeOS.

Google (vs Alphabet) often acquires companies that have a seed of a useful product. Android for example was apparently not in a usable state when it was acquired. 99% of the creative work is making the thing actually work, not in having the prototype.

To say Google's own engineers didn't create Android because they didn't commit the very first line of code is doing them a disservice.


>I can expand beyond 140 characters if you like. The OP claimed that in the past ten years, as a result of their hiring practices, Google's product output quality has noticeably declined, presumably as compared to the search product on which their name was made, and gmail, which they launched in 2004. And it's easy and fashionable to knock Google because maps is not as good as you remember it used to be, or because they shut down reader, or because plus didn't manage to unseat facebook.

I don't necessarily blame them for plus (facebook was clearly a marketing success, not a technology success), but maps' decline isn't anybody else's fault. It has declined in quality and that is plainly an engineering failure not a product failure.

>Well, in 2006 Google was a 10 billion dollar search and ad company with a fledgeling email business without a revenue model, who had just bought youtube. In 2008 they shipped a mobile phone operating system. That's now a thirty billion dollar business which has been built up through talent within google. They undermined Microsoft's office monopoly with an online office suite (okay, some acquisitions underpinning that).

Well, yes. Acquisitions underpinned all of that success.

>I've got no particular reason to stand up for Google, they're quite big enough to look after themselves, but the idea that their product flops in the last decade outweigh those product successes, and can be held up as evidence that there is something deeply rotten in their hiring model, seems to be cherrypicking to me. 70% mobile OS share, 70% search share, and 50% of global online ad revenue... that's a pretty good kind of mediocrity.

All predicated upon outside purchases or the original self-reinforcing search monopoly developed before 2004.

What's worse is that they've often used their search monopoly to try to break into other markets (flights, shopping, etc. - plenty of stuff like this got preferential SERPs treatment) and failed because what they released was crap. That is, they failed even with a huge home ground advantage - the kind of monopoly advantage that let Microsoft make IE6 (IE6!) the industry standard for years and got them slapped by the DoJ couldn't even be put to good use by Google.

I'm not denying that they have some good engineers but the idea that they're the creme de la creme of the industry with the best hiring process is way way off base.


There are a lot of assumptions being made here. Sometimes companies grow despite poor hiring decisions. I think you need a finer-grained view than just revenue to really tell whether you're doing a good job or not. Lots of terrible decisions have been justified by this "the revenue went up so we must be doing a good job" line of reasoning.


And Comcast has some of the best customer service and engineering because they don't seem to be losing any customers.

Right?


> There's another measure I use to measure the quality of their hiring process. The output. Namely the track record of products Google has developed in house in the last 10 years.

That's a poor metric to evaluate the rampant complaints about a high false negative rate. I don't think that many people are disputing that the people who do get hired are qualified most of the time.


When the in house engineers come out with products like Wave and Glass while things like Maps and Android are purchased you have to wonder.


Psst: the Rasmussen brothers were behind both Maps and Wave.

https://en.wikipedia.org/wiki/Lars_Rasmussen_(software_devel...


I think you're neglecting the continuous improvement of successful projects, which take quite a bit of engineering effort.

Was it software quality that killed Wave and Glass, or was it more of the market not wanting either of those things? (To digress, it seems like both of those products came too early. Do you think that wearable computers will _never_ exist? And Slack seems to be the Wave-like thing that the market wanted.)


Funny you should mention that. I was just using maps and thinking "this is worse than it used to be".

From what I've heard from insiders, the adwords code base is an enormous mess. Not surprising for a product that old perhaps, but this points to their engineering practices being about as mediocre as the industry average.

I don't honestly know why people want slack. It seems to just be in vogue - one of those weird network effect things. It doesn't seem to have anything to do with their feature-set or engineering quality because it's not noticeably better than, say, hipchat.

>To digress, it seems like both of those products came too early. Do you think that wearable computers will _never_ exist?

They already exist.


Slack is in no way like Wave. Now you're just overreaching with your comparisons. Wave's flaw was showing you what the other person was typing as they were typing it. You try to separate quality from functionality and blame the market for not wanting Wave's functionality, but those are not mutually exclusive. Wave's quality was egregious.


Not sure what you're saying here. Wave was great technically; the market fit just wasn't there.


Why is it a poor metric? Isn't the point of hiring employees to ideally build and launch successful products?

I think Google is pretty good at hiring "qualified" engineers who are very good at maintaining and scaling existing systems, but the process definitely selects against entrepreneurial product-focused engineers. Maybe Google thinks that's fine though: they can always pick them up through an acquisition later, albeit at 100x the price.


Google has never made it that clear what position I was interviewing for (and definitely not what team/role) when I interviewed with them. This was sort of pitched as a selling point, since after being hired you'd float around and find the niche eventually?


When was this? This was the case when i started (~2006), but it definitely changed and is not the case anymore.


Probably late 2000s when I was last on site. Google bugs me every year (most recently a week or two ago), but I don't usually push on the process.


Interesting. I could look up the date it changed, but it definitely changed because folks didn't like the old way :P.

Now, instead, they generally don't recruit (google is too large to not have exceptions) without some specific hiring managers and headcount in mind.

They will tell you what those groups are and what they do. So for example, the person i interviewed last week was targeted at two teams. I actually specifically asked if he knew what he was being interviewed for, because i like to get some idea what the candidate thinks whatever job they are interviewing for means, and he was able to tell me the two groups and knew what they did.


I interviewed at Google in March 2014 and was given an offer. I wasn't interviewing for a specific team. After the in-person interviews my recruiter set me up with 2 different team managers to talk to about potentially joining their team. I wasn't interested in either team, and my recruiter said "That's ok, we'll find a place for you," and a few days later found a new manager for me to chat with. I joined their team.

I did know I was interviewing for a general SWE role, but not anything more than that, and from all appearances the team was completely up in the air until after my interviews.

I don't know how much has changed since 2014. I also didn't get any of these pre-screen testing questions from a non-engineer. Is that normal practice for all interviews now?


FWIW, I got told what I was going to work on on my first day, by my new manager, when they picked me up for lunch. Before that, I didn't even know the PA. From what I can tell, that is standard practice for SREs, as SRE is very understaffed, so there is a lot of arguments and back-and-forth around where people are most needed.


>> as SRE is very understaffed

In all likelihood due to the flaws in the process. I know quite a few people I highly respect, who IMHO are better than the people I know who work at Google, who flunked the process.


> Judging an entire recruitment process based on one side of a story from a person who's clearly upset about an interview, and even 3 sentences i wrote on hacker news, seems ... silly.

How about the dozens of other seemingly qualified people who have complained about the google process?


"How about the dozens of other seemingly qualified people who have complained about the google process?"

And what's the other side of that? IE the literally tens of thousands to hundreds of thousands who haven't?

Again, i'm not saying there is no problem, i'm just saying this is probably not a great mechanism to evaluate whether there is a problem or not.

If you want actual usable data, this wouldn't be the way to get it, good or bad.


>First, it is definitely standard process to tell him (if they didn't, that's a definite failure). Again, remember you only have one side of the story here.

"Standard process" is what actually happens in the real world. Alas, standard process is to not tell him.

>But everyone in this entire thread seems to be making snap judgements without a lot of critical thinking. That makes me believe a lot of people here have a ton of pre-existing biases they are projecting onto this in one direction or the other (and you are, of course, welcome to claim i fall into this category too!)

Your story is also just one side of the story - actually, you weren't even involved so it's neither side. Still, you spend all your effort on saying why for example this guy's patents mean nothing and he's likely incompetent. I'd call that snap judgement, lack of critical thinking, and biased conjecture.


> "Standard process" is what actually happens in the real world. Alas, standard process is to not tell him.

Inferring what's standard from a sample size of 1 (which is ~0.0001%) is very questionable.

> Still, you spend all your effort on saying why for example this guy's patents mean nothing and he's likely incompetent.

That is not at all what they were saying. They were saying that patents aren't conclusive evidence of competency.


No, the policy/process DannyBee references is fiction. What's standard is what happens in reality. I'm clearly not talking about statistics.

For your second point, DannyBee focuses his efforts on discrediting this seemingly exceptionally qualified candidate, never yielding an inch from his position that Google is exceptional and can make no mistakes.


> What's standard is what happens in reality.

Inferring what is "reality" from a sample size of ~0.0001% is clearly ridiculous. By that logic, it would be "standard" to be born a conjoined twin. Actually, it would be 10x as likely as what "standard" is.

> I'm clearly not talking about statistics.

You might benefit from doing so, though. It might help you realize what nonsense you are saying.

> DannyBee focuses his efforts on discrediting this seemingly exceptionally qualified candidate

No, this is factually incorrect. Repeating something factually incorrect doesn't make it more correct.

> never yielding an inch from his position that Google is exceptional and can make no mistakes.

You either can't or won't read. They very clearly acknowledged the possibility of a mistake several times in each post they made.


It seems to me the parent's answer only reflects the general attitude at Google: they don't question anything they do, they don't do "customer support", and they don't display humility.

Yes, I'm not expecting the conversation to have been exactly that, but it shows problems regardless.


Google questions a lot of what it does. It's made up of lots of engineers and others who are on this site and care deeply about the fields we are in. We are always questioning the decisions made and try to use data as best we can to back up those decisions.

As for customer support, it depends on the product you are talking about. Your free gmail account or $5 purchases through the play store: don't expect a lot of support here (but there is some). If you are using Google Cloud, Apps, AdWords, or other products where you pay, you can expect to get amazing support (this will change with your spend level). For example: on the cloud side, you can pay for support contracts that get you lots of 1-on-1 time with Google support staff to help you use the services[0]. Or with the new Pixel, there is on-phone support[1].

[0] https://cloud.google.com/support/

[1] https://madeby.google.com/phone/support/


I am aware of these support channels, but there are a lot of stories of paying customers getting stonewalled. Not to mention cases where non-paying customers or content producers get simply kicked out without recourse - though sometimes Google (and others) are right to act in a certain way.


I used to have Project Fi; their customer support was quick and helpful, even for complicated things like when the porting out of my number got stuck (not their fault, it was the other carrier).


Listen, you have been hired by the greatest software company on the planet, you survived a ridiculous recruitment process with multiple pointless whiteboard interviews, CLEARLY you are special. How dare those unworthy peons slander the name of your company? They aren't qualified; you are. "Customer support"? You are not being paid >200K to sit 8 hours in a chat telling people to turn it off and on again.


This to me looks like an initial phone screening interview. It's not actually a "technical" interview (there is no code to write and the person that interviews you is a technical recruiter and not an actual engineer). As far as I know (I might be wrong so take this with a grain of salt) your first screening interview is usually used to decide in which direction you want to proceed (for example if you want to be hired as a SWE-SRE or SE-SRE position). It's not far fetched to think that they were just applying some standard questions without having an actual clear position in mind yet.

I also agree with the grandparent, I'd be very sceptical about this transcript being 100% accurate.


This.

I passed several rounds of interviews at Google over a number of years (phone screening, phone interview, on-site). This is definitely a phone screening, where the recruiter expects "standard" answers to "standard" questions. Remember that interviews are somewhat of a game. Trying to be smart at this stage is the wrong move.


> This is definitely a phone screening, where the recruiter expects "standard" answers to "standard" questions.

I went through a Google phone screen once. (For full disclosure, I've interviewed on-site twice and failed that both times.)

One problem posed on the phone screen involved finding the last 1 in an infinite array consisting of a finite number of 1s followed by an infinite number of 0s. I described the search strategy "check index 0/1/2, then progressively square the index until a 0 is found, then use binary search to find the first 0". The screener objected to that strategy on the grounds that successive squaring "grew too fast" and successively doubling the index would be faster overall.
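For the record, the strategy described might look like the following sketch (my own illustration, not the screener's expected answer; `a` stands in for the infinite array as a probe function, and at least one leading 1 is assumed so an answer exists):

```python
def find_last_one(a):
    # a(i) probes the "infinite" array: some 1s, then all 0s.
    # Assumes a(0) == 1.
    # Phase 1: probe indices 1, 2, 4, 16, 256, ... (repeated squaring)
    # until a 0 is seen, bracketing the boundary in [lo, hi].
    lo, hi = 0, 1
    while a(hi) == 1:
        lo, hi = hi, max(hi * hi, hi + 1)  # square; max() gets past hi == 1
    # Phase 2: binary search. Invariant: a(lo) == 1, a(hi) == 0.
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if a(mid) == 1:
            lo = mid
        else:
            hi = mid
    return lo
```

If the last 1 is at index n, squaring overshoots in O(log log n) probes but leaves a bracket of width up to about n² for the binary search (roughly 2 log n more probes), while doubling takes about log n probes to bracket plus log n to search: both are Θ(log n) overall, which is why the two strategies come out almost exactly equivalent.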

Once the call concluded, I looked into it and determined that those two strategies are almost exactly equivalent. This didn't leave me impressed with the phone screen process.

Then again, I apparently passed the screen despite making that "mistake". Still, I think the least courtesy you can extend to interviewees is to not correct them when they're right and you're wrong. :/


> ordinary site maintenance position

Seems you don't know much what Google's SRE job is about.


They really don't. I have been called by one of their headhunters recently who explicitly refused to tell me what sort of position I would be interviewing for. He told me that:

"That is not how we work. We will evaluate your abilities and then, if you pass, offer you a position on a team that we deem best fitting your skills."

Needless to say I have thanked him for his time and declined. I am not going to fly to another country to be grilled with stupid coding interviews only to be offered an entry level job on a team I am not interested in.

Another such thing was an invitation from Amazon's HR for an "accelerated testing session" where I was expected to go for a full day of coding tests (together with many others) and then they would pick who they invite for real interviews later, where you may learn what sort of position they might offer you. Again, no idea what position/job you are interviewing for, and an entire vacation day wasted - for their convenience. A system clearly targeting 20-somethings straight out of school. No, thanks.

The questions from the original article are familiar - but these are often external staffing agencies doing these pre-screens today. Google used to do it in-house with technically very competent HR staffers (I did a few phone calls with someone in their California HQ back in 2002ish), but now if I get contacted by them every now and then it is always an external headhunter.

The staffing agencies' employees tend to be very technically incompetent. Basically, they often have no idea whatsoever about the technical requirements of the position they are trying to fill. They only match keywords in the CVs in their database (often LinkedIn profiles, etc.) against the keywords in the job description, then they spam everyone that matches with an excited mail about having a "perfect match job". The matches are usually on completely generic stuff like "C++" or "Python" that everyone has on their CV, so in most cases the "dream job" is anything but - in a field the person knows nothing about or is not interested in.

I have been literally hounded for weeks by a headhunter once for a position that I had zero qualification for (Windows/.NET stuff - I was mostly a Unix guy back then). It finally turned out that she wanted me only because I spoke/understood the Czech language. And she fully expected me to move to a "sweatshop" that company had in the Czech Republic, trying to do a job I knew nothing about and getting paid less money than I had as a teaching assistant at a university at the time. Some people are just nuts.

The phone screens are the same story - the headhunter has a script provided by their client with a bunch of keywords they are looking for in the answers. They are basically playing bingo with the candidate's answers, ticking off the "correct" keywords. Don't expect them to actually understand what they are asking. They can't - this week they are recruiting a Google engineer and next week they would be trying to fill a civil engineering position and a week later perhaps a chemistry lab technician.

I believe this is exactly what happened here. I have been in a similar situation before myself (not with Google). The hiring managers are complaining about how hard it is to hire talent, but why are they then wasting everyone's time with incompetent HR agencies, pointless phone screens that filter out even good candidates, and stupid coding tests? Ask for references (I will be happy to provide them), ask to see some code at the interview, check my public code (Github for ex), hire for a trial period. But give me a break with this ridiculous testing/screening nonsense. Nobody else except software engineers seems to have to put up with this type of crap.


Both sound like IT Hunger Games ;-)


This blog is exactly what an SRE interview is like.

I breezed through these kinds of questions with the recruiter since I'm younger and have a fairly fresh CS background.

Then, my first SRE staff interviewer primarily asked how I would build a data center on the moon. I work on the FreeBSD kernel and TCP full time. I know what BDP, window sizing, head of line blocking, etc. are way beyond what a typical SRE would, and how communication latency would cause major issues. That confused the questioner. I can't think of anything else I'd have said wrong; my background is systems engineering and I know more about power distribution, HVAC, and data center design than I care to. The lady was skeptical of my answers and it felt really humiliating, even though I would rate myself more knowledgeable than my questioner, because of the candidate/interviewer positioning and failure.

The next man, on another day, asked me a bunch of math trivia like estimating the angle between the hands of a clock and making order-of-magnitude guesses of how many small items like marbles would fill a room. I told him I was no longer interested in working for Google and he was really startled because "he didn't get to ask me systems questions yet".. well, good luck with that.
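For what it's worth, the clock-hands trivia reduces to a couple of lines of arithmetic (a sketch of the standard approach, nothing Google-specific): the minute hand moves 6 degrees per minute, the hour hand 30 degrees per hour plus 0.5 degrees per minute.

```python
def clock_angle(hour, minute):
    # Smaller angle in degrees between the hour and minute hands.
    minute_angle = 6.0 * minute                     # 360 / 60 per minute
    hour_angle = 30.0 * (hour % 12) + 0.5 * minute  # 360 / 12 per hour, plus drift
    diff = abs(hour_angle - minute_angle)
    return min(diff, 360.0 - diff)
```

So at 3:30 the hour hand sits at 105 degrees, the minute hand at 180, giving a 75 degree angle; whether asking that tells you anything about a systems candidate is another matter.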

Everyone was really sad at that point, including the recruiter. Nobody from Google has contacted me again, which is a relief. I found the entire process gross.


> even though I would rate myself more knowledgeable than my questioner

Don't get me wrong, I'm guessing you know your stuff, but you also strike me as someone who would likely have failed on culture fit down the line. Interviewers are often more sensitive to attitude than they ever are to aptitude, and for good reason: your HVAC knowledge may be irrelevant once you discover the custom designs in use behind closed doors, and a bad attitude toward learning where you're weak can be a much more fatal problem for a new hire.


At the same time, it's a useful filter for the candidate.

Last year when I was job hunting I kept getting fizzbuzz-style phone screens, even from companies who'd specifically contacted me because they knew who I was and what my skills/experience were, because they have to be sure to filter out those unqualified core committers of software they use on a daily basis.

Anyway, I got asked the "write a palindrome checker" question multiple times in those screens. I guess more companies than I thought have a Department of Palindrome Quality Assurance these days. But after about the fifth time, I just started going overboard on the question to make a clear point to the interviewers. I got a pretty good patter down where I could write out the code while describing all the random quirks phone screeners have never heard of: I'd start lecturing about combining characters, right-to-left directional shifts, the tradeoffs of considering solely Unicode code points versus graphemes, using the character database to identify categories of characters to ignore when considering whether a string is palindromic, etc.
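A rough sketch of what that overboard answer looks like in practice (my own illustration using only the standard library; a genuinely grapheme-aware version would need a segmentation library such as `regex` or ICU, since this one compares code points):

```python
import unicodedata

def is_palindrome(s):
    # Normalize so precomposed and combining forms compare equal.
    s = unicodedata.normalize("NFC", s)
    # Drop punctuation (P), separators (Z), marks (M), and
    # control/format (C) characters before comparing.
    kept = "".join(c for c in s
                   if unicodedata.category(c)[0] not in "PZMC")
    # Case-insensitive comparison over code points (not graphemes).
    kept = kept.casefold()
    return kept == kept[::-1]
```

Even this short version surfaces the quirks in question: without the NFC step, "e" plus a combining acute accent and the precomposed "é" would compare unequal, and reversing by code point would tear combining sequences apart if marks weren't dropped first.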

Interviewers who took that poorly did not get my further cooperation. Interviewers who took it well (by being positive/polite about it, or even admitting that yes, this kind of phone screen is a waste of everyone's time when you already know you're interviewing someone who can code) got to talk a bit more. But I ended up accepting an offer from a place that actually worked to make their interview process better than this, and which continues to evolve it all the time.


In the time since that Google interview, I've moved into management and have built a very high performance team recognized as such by peers and the executive team.

For internal hires, I convinced people to come work for me that I had immense respect for by using casual conversation and pitching the idea and vision for a new operating systems team.

I've found this is also a classy way to hire external people. I've since hired people off freebsd-jobs@ mailing list and twitter by being upfront about the good and bad of working at this company. No trick questions, just a conversation about what we like to work on. This was easy because I had an idea of what they have accomplished by their commit logs.

Most recently I hired two women, master's students, for summer internships. This was very different because I had no idea what the candidates had done as coursework or projects beyond a simple resume. I again used casual conversation, no trick questions. I posed some real-world situations, passively seeing if they understood concepts like deadlock, manual memory management, and indirection, and had a very good working CS/OS vocabulary. This eliminated most of the other candidates, and it was pretty clear who had slogged through their OS and networking classes without passion. I let each person tell me about projects they worked on which really excited them. One had done a Linux USB driver on her own time, among other interesting things. The other had implemented a scheduler and file system on a teaching OS as part of her coursework. Both worked out phenomenally and both have patches in the FreeBSD.org source tree from the 2-month internship experience. I am very proud of this, and of my team for mentoring them so successfully.

The people I hired were often confused ("That's it?") at the end of the phone or in-person interview. They thought they had done something wrong because they are so used to being sweated for the sake of being sweated.

I am now convinced this is the only ethical way to build teams and hire -- start with some seasoned vets then grow new talent while refining and reinforcing shared values.

I don't really see what the stereotypical SV tech interview accomplishes. Blind leading the blind. Leadership is piss poor in this industry.


Wow.. I would like to work for you.


That may well be the case. The reason I quit the interview is because I evaluate a potential employer while they are evaluating me.. I really love when someone asks "do you have any questions for me?" and take full advantage to try and get candid knowledge of a team and more importantly leadership. I determined Google would not be a good fit for me.


Disclaimer: I also work for Google, opinions are my own, etc etc.

> "i always take "transcripts" of interviews (or anything else) with a grain of salt"

I mean sure, a single instance of this might be overblown, exaggerated, or false in some way.

But there is an avalanche of reports like this, to the point where it's become widespread industry insider knowledge.

I enjoy working here, but the interviewing practices are such that I actively warn friends applying/being referred to temper their expectations of a repeatable/reliable process.

Most colleagues I've spoken to about this, including myself, have strong doubts we would have made the cut if we interviewed again - even though all are strong engineers with great perf records.

At what point do we start taking reports like these seriously? We don't have to accept every detail of the reporting as gospel, but there's clearly something here.


The problem with Google's interview methods is that they all select for a very specific type of programmer: heavily math oriented, deep knowledge of obscure Computer Science theory, but not one test on knowledge of languages, architecture, design, or actual real-world problem solving. I walked into an interview with one guy and he literally did not even say hello: he just jumped straight into some problem I had to solve on the whiteboard.

The problem with that approach is you end up with a very homogenous team of really smart, logical people, but without the balance of more creative, empathic types. Ideally, a well-functioning team will have both, and will have people from many different backgrounds and educations, because that's when you get true collaboration and innovation: by mixing unrelated disciplines.


(Standard disclaimer, speaking for myself)

1. Your interviewer didn't give you a good interview or follow guidelines. In interview training they tell you the first thing you must do to start an interview is to ask if the candidate would like to get some water / use the restroom, then break the ice before starting any questions (applicable also during phone screens).

2. Proper interviews actually are supposed to lean heavily toward a real-world problem-solving approach rather than arcane knowledge. For example, when I interview I look for rational decisions at every turn (for instance: considering boundary cases, or adding a new example that helps you visualize the solution in a way that gives information gain rather than being something random). My questions are not math oriented, nor do they require deep knowledge of obscure theory. Based on what questions my coworkers ask, I know at least for my team this is not a correct characterization.

What we do test for: understanding of fundamental data structures and algorithms, ability to thrive in uncertainty (ask clarifying questions! state your assumptions!), ability to break a problem down and solve it from first principles.

Good interview questions are required to have multiple solutions.

And then you have the generalization at the end about creativity and diversity; in my limited experience we seem to get pretty decent diversity, and even if there is some homogeneity (we need more women and minorities), it's certainly not the kind you described. No, it's not a bunch of mathy theory wizards writing code at Google; it's way more diverse than that. Not perfect, but not awful like you're describing.


> "The problem with Google's interview methods is that they all select for a very specific type of programmer: heavily math oriented, deep knowledge of obscure Computer Science theory"

I'd be marginally okay with it if the interviews actually selected for this sort of engineer! I've seen multiple people who fit this description to a T who flunked the process, hard.

If the goal here is "pick the hyper-mathy, deep-CS types out of the crowd" I'd argue the process isn't even very good at that.


Having a high degree of false negatives doesn't mean the positive signal isn't reliable.


Off-topic pet peeve but why is "OK" now apparently spelled "okay" these days? (especially in bandwidth-limited situations such as SMS or IM). OK is not short for "Okay", OK?


"These days"? It's been spelled that way for nearly 100 years.

> Spelled out as okeh, 1919, by Woodrow Wilson, on assumption that it represented Choctaw okeh "it is so" (a theory which lacks historical documentation); this was ousted quickly by okay after the appearance of that form in 1929.

http://www.etymonline.com/index.php?search=okay


I think "okay" looks better than "OK".


But it is... wrong. OK originated as an abbreviation, so why spell out the pronunciation of the letters? Makes no sense to me.


Because the connection to a 180-year-old fad for misspelling "all correct" is so obscure and non-obvious that it was lost long ago? Do you think we should be capitalizing LASER, too?


It has always been an alternative spelling.


Not so; as you can see in my other comment's link, we can cite OK about 90 years before we can cite "okay", and more reliably than that we can cite the alternative spelling "okeh" to 1919, which establishes pretty well that "okay" was not standard then.


By talking about this, you've wasted all the bandwidth your two bytes would have saved in a year.

As far as I understand, this has been common vernacular since before my lifetime. I'm not usually one to welcome evolution of the base language, but this one predates our lives; we need to let it be.


Google's interview process, and interview processes modeled on it, do not select for "math-oriented CS-conscious" engineers.

These processes select for recent CS graduates from a handful of universities where Google expends recruiting effort, and anyone not from that background mostly only gets in by blind luck or by knowing someone already in Google who can navigate them through getting hired there.


> but not one test on knowledge of languages, architecture, design, or actual real-world problem solving

Maybe they fail to do so but I do believe the goal is to test real world problem solving. However, I think they stray from specific language or domain knowledge because they want you to be able to work in different roles, since you don't have to interview again to switch teams.

From what I've read, the idea is to hire people who are smart enough to learn any specific domain knowledge necessary, because the expectation is that engineers might have to tackle problems they wouldn't have seen elsewhere. I don't really know whether that's true anymore, as my impression is that Google now just has a bunch of overqualified people.

> The problem with that approach is you end up with a very homogenous team of really smart, logical people, but without the balance of more creative, empathic types. Ideally, a well-functioning team will have both, and will have people from many different backgrounds and educations, because that's when you get true collaboration and innovation: by mixing unrelated disciplines.

Can't disagree with you there, but it's a weird assumption that people who are logical are not creative or empathetic. I do think that they hire for "Googliness", whatever that means, which may lead to a monoculture though.

In any case, I guess you can call me a Google fanboy. I don't agree with everything they do but I feel like bashing Google's (or most other company's) interview process is the cool thing to do here, but most people don't seem to have tried to understand why it is the way it is, and thus don't offer any true alternatives that meet the same goals nor do they reject the goals in the first place.


This explains why Google is incapable of attacking any product that requires an understanding of humanity to be successful rather than just raw data. (e.g., google+, youtube comments, etc)


These are just a lack of good product management, nothing to do with engineers. I do suspect Google's PM culture is... lacking.


YouTube is pretty successful (even if they acquired it).


It tells more of the interviewer than the interviewee. They don't ask the questions you mentioned because they don't know or they don't know how to best judge the answers. They asked the questions they know well, that show you what the breadth of their knowledge is. It's like the saying A-players hire A-players while B-players have a hard time judging A-players.


B-players know an A-player when they see them, they don't hire them because they feel threatened.


I got pinged by Google about a year ago (after being narrowly rejected 9 years ago) asking if I'd be interested in re-applying. I said, "Sure, why not?".

I was immediately asked which department I wanted to join and why. I said, "Err, not sure, how about SRE?". To be told, "Oh, well that's not my area, let me ask them."

Shortly after that I got a curt message saying "Thank you for applying to Google. We have no vacancies that would suit you right now, thanks for applying, goodbye."

Somewhat bemused by the whole process (you contacted me, dude!), I went about my day.


LOL. That's happened to me before too. And I've been on the other side, where I reach out to people to ask if they'd be interested in a role on my team.

However, I immediately tell them the role that I have, to avoid the whole, "We have no vacancies that would suit you right now" answer. Seems like their recruiters should have done that up front, to save you and themselves some time.


(Same disclaimer)

If you interview frequently, at least for SWE, this is certainly not how we go about things. ghire guidelines for SWE don't allow for questions like this, or behavior like this.

Is it possible this was an SRE interview? I guess, but it really sounds ungoogly and these questions sound like they don't give great signal. I'd be ashamed if this is how we hire SREs.

Is there really an "avalanche" of reports like this? Most negative reports I hear have to do with our SWE questions which tend to be difficult.


You should be ashamed then, because these are definitely the questions used on SRE phone screens.


The questions are fine for SRE. The problem is the behaviour and the expectations of the interviewer.


The people doing these interviews are non-technical people who read off of a cheatsheet. The cheatsheet covers alternative answers, but a situation like the OP describes can never end well.


If that's true, making non-technical people conduct technical interviews is also a pretty big failure.


Thanks, I am realizing now that it was SRE. All I can say is I'm definitely a fan of how we interview SWEs and I'm sorta bummed this is how SRE interviews go. TIL.


I got these questions when being interviewed for SWE and SRE. My answers suggested to the interviewer that I should go down the SRE route. I passed in the end, because I felt the culture regarded everything as a technical problem (which has worked out for them so far) and Google was a company where you did things their way, and don't rock the boat.


Are SWE and SRE interview the same? I thought they were different enough job descriptions that it would require different interview questions.


They are different, but a lot of the skills are similar. SREs need to problem-solve, and while they may need more domain-specific knowledge, I'm not sure a facts quiz administered by someone non-technical is the best way to test for that.


I got rejected just this month and I can certify that there was no crazy bullshit in the process. I mean, I feel like you made a mistake rejecting me, but I also can imagine a valid process behind the scenes which would reject me based on my "ok but not great" performance. I do hear anecdotes from people I trust which sound crappy (being asked very specific technical questions on subjects that candidate doesn't have experience on and not being flexible about it, being rude etc).


Google turned me down a couple times before I got hired. The first two were definite mistakes (false negatives) where I should have been hired. Just treat it like a process to be optimized. You can reapply every 18 months.


> At what point do we start taking reports like these seriously

My guess is when the number of applications per position actually drops far enough that the false negative rate starts to hurt.

Until then, an interview process optimized for avoiding false positives at all costs will persist. Totally makes sense for a company worth hundreds of billions though, can you imagine if they had a few more bad hires sneak in? Oh my god, it would destroy everything.


In my (albeit anecdotal) experience, incompetence is the norm rather than an outlier in BigCo SV land.

Frankly, if you are more qualified for a position, chances are you will be rejected because your interviewers will fear for their own job security.

I've always found that type of logic strange, though. Wouldn't you want someone who was better than you currently are on your team? Wouldn't you be able to learn from them?


That only applies if you want to learn. If you just want to coast and be "the best" at something at your company, you don't look for people better than you. At best you look for people who are better at the tech than you, but who are passive or easily browbeaten, so you can claim their work as your own.


That depends on if the interview process actually does protect from bad hires.


Google acquires new companies all the time, and those employees aren't going through these crazy interview processes.


Actually, acquisitions normally trigger full interviews; see "Chaos Monkeys" for a description of how this happened at Twitter/Facebook.


Is this a US thing?

I went through three acquisitions and never had any interview, besides the set of meetings for each of us to decide to go along with the acquisition or get a severance package.


I think it all depends on the requirements of the company doing the acquisition. I've gone through several myself and did not have to re-interview; however, I have heard of employees claiming Google required them to re-interview, which has got to be nerve-wracking. Considering Google's heavy focus on avoiding false positives, a lot of good people don't pass.


I had similar questions for an SRE position at Google a few years back, in fact, so this does not surprise me.

I eventually refused the position without going on-site just because of how ridiculous the questions/replies were (and frankly, because I had another good offer elsewhere, but it did contribute greatly).

While my experience wasn't as bad by a long stretch, I can see how this is plausible. In particular, I immediately figured out that the recruiter wasn't very technically inclined, had a "here's a list of correct responses" spreadsheet to help him, and had very little time to waste.

Because I took that into account, I was always accommodating instead of confrontational, which got me more interviews that were better, with real engineers (yay!), though not great either. Had I been confrontational, pointing out mistakes and misunderstandings, I'm sure it would have gone pretty badly.


I was asked pretty much the same questions for an SRE position at Google. Note that I only found the recruiter phone screen to be this kind of 'pop quiz'. The engineering interviews were more detailed discussions with engineers.


About 8 years ago I had the same questions.


Mine was 2 months ago. I'm surprised how stale these questions are.


I can confirm that a recruiter contacted me and asked pretty much these exact same questions when trying to recruit me for an SRE job.

My recruiter was reading these questions off of a sheet of paper, but when we had discrepancies in our answers and she would say something like "It says here an inode holds metadata", and I would respond with something like "oh, metadata and attributes are the same thing", she would say "oh, well you are correct then!"

I made it to the first phone interview but that's where the path ended. I was bummed for a little but then remembered I prefer small business anyways :)


On one hand you say that these are "bog-standard" SRE questions, and on the other you say it's "super strange".

What exactly is super strange? That a non-technical recruiter asked the questions? If that's not the strange part, then surely it's believable that the recruiter would not recognize some of the subtleties involved?

That said, if this guy is the creator of GWAN, then it's entirely possible that his personality rubbed someone the wrong way and he was nixed for "personality reasons" in the only way they could.


It's super strange that the recruiter would have no understanding that alternate answers are possible, and would end the call abruptly claiming the candidate didn't know their fundamentals.

I've done this kind of phone screen for an entry level position at Google, and while the recruiter wasn't an engineer, they did have some basic knowledge of the concepts involved, and were able to prompt me with follow-up questions if I missed something or got a question half right. The questions themselves are not strange, it's the alleged attitude of the recruiter.


"What exactly is super strange? "

Sure. For starters:

1. This guy apparently did not know he was interviewing for an SRE position.

2. The recruiter was looking for very very specific answers and immediately rejected any others.

3. There was no other discussion of anything, at all.


This is exactly why I think that transcript is a bogus, one-sided take from someone who's dejected and hurt by the fact that they weren't chosen. It reeks of the smell of someone that thinks they were smarter than the interviewer.


the questions asked don't strike me as strange, but the corrected answers/explanations by the recruiter are very strange indeed.


This was very similar to my first experience interviewing for an SRE role at Google. After about 20 minutes I got tired of arguing with a clearly non-technical recruiter and politely excused myself.

My second interview well, that was a whole different bucket of problems.


Yeah. I'll never interview with Google again. I've got friends who work in (mostly nontechnical) roles there who had very different experiences, but my interview was such a hot mess of disorganization, cluelessness and arrogance that I ended it early and told them I had no interest in the role or the company.

Two years on, I think I made the right call.


Yeah, I had a somewhat similar experience. My first technical interviewer was 15 minutes late (so it was now a 30-minute interview). Then, after being asked the typical slew of questions (what is a hash table, etc.), I was asked to implement a basic data structure (a Set). That was easy enough, but my interviewer wouldn't let me finish writing up my implementation and instead insisted I focus on optimizing a particular custom method he asked me to implement. I protested (premature optimization, etc.) but ultimately went along with it. I finished optimizing but didn't get a chance to finish my Set by the time the interview was over.

I got an email a week later saying thanks but no thanks with zero explanation. I had gotten everything right, what went wrong? So I had some of my Google friends track down the interviewer and ask. Apparently I didn't continue forward because I didn't finish my Set implementation...

I've had Google contact me on occasion since then. I have not re-applied / re-interviewed with them. Their interview process is already bad enough.


I was interviewed by Google for an SRE management position and I got asked 6 of the OP's questions.


These are not director questions. I wonder if this is a case where a team didn't want to hire:

A team gets pressure to hire, but they don't want to.

A team has a great internal candidate but can't push it through without going through external candidates. (Expecting any director couldn't answer these -- which they shouldn't).

A team can get 2 for 1. Usually an H1B situation, which has the extra benefit of chaining the 2 candidates to the company. Former H1B employees love this option.

A team has a 'friend' in mind.

I honestly think this isn't a question of a dumb recruiter; it's more likely a way to just push something through. The recruiter was probably taken to lunch with a high five. This is very normal. I wouldn't freak out that they have a 'dumb process'; you need to read between the lines here. The saddest part: this pawn gets a "PASSED" on his record at Google, but it was just internal politics.


> team can get 2 for 1. Usually an H1B situation, which has the extra benefit of chaining the 2 candidates to the company. Former H1B employees love this option.

Are you saying google pays their H1B employees half of what they do others?


I would 100% back you up in my mind had it not been for the "Why Quicksort is the best sorting method?" question.

I hope you'll agree that there is no way a correct answer would ever validate this question.


I'm another Google employee. I really don't think that's an accurate transcription. There's a standard SRE question which is related, but different. I won't give the exact question, but you could try searching Glassdoor.

If "Why Quicksort is the best sorting method?" really was the question, then the recruiter must have asked it from memory and misconstrued it. It's certainly not a standard Google SRE interview question.


Having only seen the candidate's paraphrase of that conversation (and never having worked for or interviewed with Google), I would STILL be inclined to give that candidate a thumbs down.

To resolve difficult technical discussions, it's important to be able to restate the other side's arguments in the light most favorable to THEM, while the candidate was entirely focused on paraphrasing the interviewer's argument in the least sympathetic way. Would you want to work with a person like that?


So what you want to see from the author is a Mao-era self-criticism stating why Google were 100% right in rejecting him.


I think they mean that if you realize you're talking to a non-techie then you should make an effort to use simple words. Your TCP hex opcode knowledge does not impress someone who doesn't know what hex or TCP is. Figuring that they must be looking for an answer along the lines of "SYN/ACK" is a skill as well. A people skill.


This is the SRE prescreen. At least it's the one I was asked in 2007, almost verbatim. Possibly too verbatim.


Also, his answer on #9 is wrong, or at least <EDIT> his explanation of the conversation is terribly confusing </edit>. With 10000 numbers, it's only efficient to create a lookup table indexed by 8-bit chunks, not 16-bit chunks: a 16-bit table has 65536 entries, which is more setup work than the 10000 inputs justify.

Based on his LinkedIn profile, I don't think anyone at Google would have thought of him as a "director of engineering". Being an "R&D director" at some unknown company at 24 is entirely un-comparable to a director at Google, and since then he's worked at his own very small company. He was probably a candidate for Senior SRE.


Whose answer is wrong? Because no one suggested a 16-bit lookup table.

His answer was to look at 64 bits at a time and do a Kernighan-style count [0]. The "correct" answer was an 8-bit lookup table. Which is right is going to be highly dependent on the data and the architecture you are using.

[0] http://stackoverflow.com/questions/12380478/bits-counting-al...


You're correct, I misread what he said the recruiter's answer was.


Do you recall the question? Site's down. I recall thinking the right answer was to use POPCNT but maybe I'm misremembering the question.


The question was "how do you do bit-counting of a bunch of numbers". The two canonical answers are "lookup table" and "using bit shifts and multiplying by magic numbers".

The fact that there's a machine instruction for it does make it a bad question.


Honestly, I would have said popcnt as well. A lookup table or bit shifts when I can have the CPU count the 1s? I guess I'd need to benchmark it to be sure. Either way, I can't say it's a good question.


Popcnt isn't particularly well optimized in most micro-architectural implementations.


Looks that way with a quick test. But it looks like there may be a better way with SSSE3 PSHUFB: http://wm.ite.pl/articles/sse-popcount.html


Is it? It looks like on most recent Intel CPUs, it's 3-cycle latency, 1-cycle throughput on a 64-bit register. An 8-bit LUT solution is going to process fewer than 16 bits per cycle on any recent Intel/AMD CPU (maximum of two load ports).


Hmm, much better than I remember. I guess this goes a long way to explain why this wasn't always seen in practice: http://danluu.com/assembly-intrinsics/


Just prepend "cache://" to the URL.


Woah, I've never seen this before -- thanks for sharing!


Just FWIW, about 5 years ago someone also contacted me for a Google interview. The questions were very similar, so I'm not surprised. I don't know what's going on behind the scenes, but after a few minutes I didn't take the call seriously any more, and in the end, after 30 minutes, it felt more like a way-too-long prank call.


I gather you didn't run this by your PR folks:

> Particularly, when one side presents something that makes the other side look like a blithering idiot, the likelihood it's 100% accurate is, historically, "not great".

I get that you are happy at Google, that you want to defend your employer. But implying the guy's a liar or a fool does not help. If anything, it makes me more likely to believe that Google has something to be touchy about here.


Well, I mean, he's right: any time a story seems outrageous and unbelievable, it's often because it has been embellished at least somewhat. That doesn't mean it's completely false either, though.


"Often"? Would you be so kind as to show me your data on that? Maybe it's just this election season, but I seem to be hearing about quite a lot of outrageous things that are perfectly true.

Even if you're correct and he's merely saying something generically true about almost any concerning story, him saying it in the context of the post reads to me as a veiled accusation.

For example, suppose you posted an open-source project of yours here. If I were to comment, "Open-source projects are often half-finished, buggy messes," how would that seem? It is factually true; randomly looking at GitHub projects is enough to show that. But in context, it's an unkind thing to say because encourages people to look at your project as one of them.


It's an interesting recruitment setup where overqualified candidates are rejected.


You can have situations where you are prepared to recruit someone with potential rather than the finished article. In those situations the overqualified candidates may not compare well with what you consider the potential of the slightly under-qualified candidate, and may not have some of their other attributes. It's obviously a risk, but it happens more than people think.


Still, an overqualified candidate is a superset of a qualified/unqualified one; he or she should pass the test.

Failing to distinguish an overqualified candidate from an under-qualified one is a failure on the recruiter's side, not the candidate's.

The recruiter is of course allowed to say "I'm sorry, but you are well overqualified for this position". In this case he was falsely recognised as somebody under-qualified.


I realize this is a long shot, but seeing as you're actually in a pretty high position, could you try having the recruiter screen changed to be multiple choice?

It's simpler when having a non technical person asking the questions and I imagine it would lower false negatives. Frankly I don't care personally but it seems like most people would like this better.


IE maybe he applied to a position labeled director of engineering, but they decided to interview him for a different level/job instead.

A "different" job, several grades lower in responsibility (and pay). Without in any way prefacing him, beforehand.

Is that the way things usually work in the Google hiring process?


I agree with DannyBee. The author of this post seems to suffer from a bit of arrogance and inflated sense of his own abilities. It's likely he tweaked the story a bit to make himself feel better. I've seen many high level directors do this since at that stage in their careers, they can't possibly imagine themselves to not get an offer from a company.


Your comment combined with this one...

https://news.ycombinator.com/item?id=12701650

...creates an alternative interpretation in my mind that's not as bad: the questions were a filter attempt done wrong, in that it didn't account for stronger candidates giving better answers, with no way for the interviewer to confirm them. On top of that, a simple data-entry error by the HR person (or whoever forwarded his name) might have put him in the wrong interview category. That's two small problems vs. a huge one implied here.

Although the damage appears huge if they're filtering out candidates with his track record with the pre-screens.


5 years ago, after applying for a Unix systems engineer position at Google, I had a phone interview with more or less the same questions (more, probably, as this interview was interrupted), so these questions are definitely asked by Google recruiters. But I would be surprised if such interviews took place for such a high-profile position; I would assume things are done differently in such cases and not via the "usual" way.


> These are bog standard SWE-SRE questions (particularly, SRE) at some companies, so my guess is he was really being evaluated for a normal SWE-SRE position.

This makes the most sense to me, why would a director of engineering be responsible for getting Google back online if it went down when there are SREs.


Yup, those are SRE questions, but the fact that Google didn't interview him for the position he applied for makes them out to be even bigger idiots than I had them pegged as for using SRE questions on a director role. Regardless, just having such a stupid process exposed reflects badly on Google. In my own experience, Google isn't even able to call at the scheduled time, so while they're not the worst interviewers ever, they're pretty close and very far down the ladder. Put another way, I doubt they could convince many to even interview without their extremely hefty compensation packages.


From what I've heard, Google intentionally screws with timing, who you will be speaking to, and other factors in order to try to understand how you deal with changes in circumstances.


This is blatantly false.

Source: I work for Google. Our daily schedule is packed with meetings, and we try to be as on time as possible. Interviews (which are something everybody should be doing) work exactly the same; we don't try to screw people over with bad timing just to "test" them. Sometimes people miss interviews and somebody else has to show up. This is unfortunately a problem and it shouldn't happen, but sometimes it does (accidents and unforeseen things happen). It's not done on purpose.


I know you don't set up the system, so I'm not blaming you here. But this is the problem:

> Our daily schedule is packed with meetings and we try to be as on time as possible

If Google's goal were to respect the candidate's time, interviewers wouldn't have daily schedules packed with meetings. The less slack in a system, the worse the failures are. That employees try to be "as on time as possible" is a sign that everybody understands the scheduling is unrealistic.

This isn't unusual, by the way. Most hiring processes don't optimize for candidate experience, or even for good hiring decisions. Indeed, if you take a POSIWID view of typical hiring processes, the point is to make interviewers feel powerful and to select for people willing to put up with inefficiency and suspicious power dynamics.


I remember reading an article that this is how they interview for product managers? Not for engineers, but product managers are people who maybe need to deal with more craziness.


I admit I am not familiar with the interviewing process for the non-engineering sides of the org, but I'd still find this very unlikely and weird. We are very very strict in our meeting and timing policies as our meeting rooms are usually packed with people all the time so we really need to be in and out at the given time. It's counter-intuitive that somebody would delay a meeting on purpose...


I interviewed for product manager at Google. It was tough, but no craziness involved: flew in, had a day of interviews, flew back.


If that's true (and I would love to know if it is), then Google should know that is a big no-no in a lot of cultures, and there are much better ways to test for it. Some still hold the broken word to be an indication of people you don't want to deal with, ever.


A counterexample can't prove that that never happens, but I interviewed with Google this year and every interaction on the phone or in the in-person interviews began within 3 minutes of the agreed-upon time. So I have a hard time believing this is a general policy.


This is not the case (I work for Goog).


You unintentionally screw with the timing? (I had the same happen)


Nobody intentionally screws with the timing like was speculated. It's possible individual meetings might be a few minutes late I guess, can't comment on that.


And you are on a hiring committee?


The hiring committee meets after all the interviews are conducted, so isn't really relevant here.


Yeah, I've heard this too. If it's true, it just shows what assholes they are when it comes to respecting people's time. Which I think is the original point of the article which comes in loud and clear: Google will waste your time; don't interview there.


When I was starting out, doing well in the Google interview process was a nice confidence booster.


Unfortunately they will just ignore the complaints and move on to the next resume in the pile.


This reminds me of the story of one of the WhatsApp founders getting rejected by Facebook. [1]

Hope this doesn't turn out that costly for Google. But I'd be happy if it does, if it is the way the interview was really conducted.

[1] https://twitter.com/brianacton/status/3109544383


Wow. If this is actually what happens, even a little bit, then Google has a huge, huge fucking problem, and it needs to be fixed.

I've never bothered to apply to Google. But if this happened to me, I'd just walk away. You don't know me, but you don't want that :-)


I think in this case, it's more like there are "independent" recruiters with their fixed Q&A sheets sitting around, somewhere, and fishing around for possible candidates to make it to a second level.

