
Debunking the Google Interview Myth - BarkMore
http://www.technologywoman.com/2010/05/17/debunking-the-google-interview-myth/
======
bnoordhuis
From anecdotal but first-hand experience (went through a couple of interview
rounds), Google interviewers indeed don't ask the questions on that list. But
the questions they do ask are hard and often academic - as in, you have likely
never run into them in a real-world scenario before and you likely never will.

At first I was like 'wow, Google must be populated with demi-gods'. Then I
spoke to some Googlers off the record and it turns out that those interview
questions have little to do with the reality within Google.

Not that Google isn't a company lousy with smart people. They are.

And it's not that Google doesn't work on hard problems. They do.

It's just that the interviews are an extremely efficient dud filter, probing
you about stuff that even at Google you won't be working on more than 5% of
the time (if at all).

~~~
moultano
_probing you about stuff that even at Google you won't be working on more
than 5% of the time (if at all)._

In any highly skilled profession, someone with very little of your skill
could do your job 95% of the time. The valuable part is that you can also be
counted on for the other 5%.

Most of my time at Google hasn't involved any substantial theoretical work.
One time, though, I did have to come up with an algorithm for computing
connected components that could run in reasonable parallel time in MapReduce.
The resulting algorithm ran in log n parallel time, passing n log n messages
over its lifetime. (This is probably in the literature somewhere, considering
that a coworker and I had to solve it for unrelated reasons and independently
came up with the same thing.)
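The algorithm itself isn't given; as a rough illustration of the setting - not moultano's actual solution - here's a naive min-label-propagation version phrased as repeated map/reduce-style rounds. (This simple variant converges in O(diameter) rounds, not the O(log n) he describes; all names are my own.)

```python
def connected_components(edges, nodes):
    """Label each node with the smallest node id in its component.

    Each round: a "map" step where every edge sends each endpoint's
    current label to the other endpoint, then a "reduce" step where
    every node keeps the minimum label it has seen. Repeat until no
    label changes.
    """
    label = {v: v for v in nodes}
    changed = True
    while changed:
        changed = False
        # map: emit labels across every edge, in both directions
        msgs = {}
        for u, v in edges:
            msgs.setdefault(u, []).append(label[v])
            msgs.setdefault(v, []).append(label[u])
        # reduce: each node adopts the minimum incoming label
        for v, incoming in msgs.items():
            m = min(min(incoming), label[v])
            if m < label[v]:
                label[v] = m
                changed = True
    return label
```

Isolated nodes simply keep their own label; nodes in the same component all end up with that component's smallest id.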

I'd like to expect my coworkers to be able to do that sort of work in a domain
that they're familiar with, because it takes certain problems and moves them
across the boundary from impossible to possible. Working at Google scale means
that you'll inevitably have to solve some problems that have never been solved
before, at least until a lot of universities find it worthwhile to build a
thousand node cluster for research purposes.

~~~
wallflower
I'd like to believe this story (I haven't found an authoritative source yet)
that I once heard from a professional speaker about the FedEx main processing
facility in Memphis.

One day it completely stopped dead - usually it is controlled mayhem with the
packages and machines running - the silence was deafening - thousands of
dollars lost every minute. They brought in the best expert they could find. He
investigated and went to a single box in the plant, opened a door, and turned
one bolt with a wrench and everything restarted. The plant came back to life.
He sent them a bill for $10,000. When FedEx protested the bill - "You just
turned a single bolt! Anyone could have done that" - he itemized it and sent
it back. They paid him.

    
    
      Turning a bolt                $    1
      Knowing which bolt to turn    $9,999

~~~
gyardley
Unfortunately, this is likely just a modified version of an (almost certainly
also fictional) anecdote about Picasso. I've heard this one from several
different places:

A woman walks up to Picasso, later in life, as he sits at a cafe table. "Could
you draw me a sketch?" she asks, thinking she'll make a quick buck. "I can pay
you for it."

Picasso shrugs, says "sure", and quickly scratches out a little something on a
napkin. "That'll be $10,000," he says.

"$10,000! But it only took you a few seconds!"

"Just the drawing of it took a few seconds," Picasso replies. "Learning how
took my entire life."

~~~
util
That sounds like a paraphrase of a quote from Whistler:
[http://en.wikipedia.org/wiki/James_Abbott_McNeill_Whistler#R...](http://en.wikipedia.org/wiki/James_Abbott_McNeill_Whistler#Ruskin_trial)

------
endtime
For a first round phone interview for an engineering internship at Google, I
was asked to solve a reasonably challenging question (I don't want to give it
away, but it was algorithmic in nature and involved no code). When I got it
right, the interviewer asked me to _write an inductive proof of my solution_
and email it to him.

I like doing stuff like that so I wasn't upset in the least, but I can
understand why many people would find that scary and I certainly think
Google's interview process is more extreme than others'. I've interviewed
successfully with Goldman Sachs, Amazon, and Microsoft (among others) so
that's to whom I'm comparing them.

------
wooster

      That whole “Google cares about GPA even for people 
      years out of college” thing?  I supposed I can’t 
      speak for every hiring committee, but I never 
      remember my hiring committee discussing the GPA 
      of a professional candidate.  For that matter, 
      we were never even given a candidate’s GPA unless 
      he/she elected to put it on their resume.
    

This doesn't jibe with what I know of the process. Tom Galloway's response in
the comments, however, does:

    
    
      GPA/SAT/GRE: Perhaps not in Engineering so much, but 
      a *lot* of non-Eng HR and HCs did do cuts based on what 
      school someone attended and/or their GPA and test scores. 
      And I know for a fact that even Eng applicants who had 
      been out of school for over a decade were asked for such 
      on a regular basis. There were a number of us, mostly 
      older types, who semi-actively campaigned against Google 
      asking for such from folk who’d been out of school for a 
      while as we felt it 1) was useless relative to their 
      actual job performance and 2) it was embarrassing *to* 
      Google for us to ask for such irrelevant info and gave 
      up a bad impression/rep.

~~~
gcb
Sales used to have a list of approved universities. It wasn't even a ranking,
mostly a filter.

------
draz
I got to the onsite, day-long interview stage at Google (about a year ago).
It's been a while, but I do remember they asked some VERY real-world
questions, ranging from list-related traversal questions, to "how would you
implement xyz," to some running-time questions, etc. The questions were
challenging, but do-able, I suppose. My biggest issue was not so much with
the questions, but rather with 1 or 2 of the interviewers who felt it was
more of a pissing competition, always trying to undermine the answer. The
rest of the people were just _phenomenal_ -- super professional and a
pleasure to talk with. In retrospect, it was the younger interviewers who
were "problematic." I suppose it has to do with insecurity (I later found out
that they hadn't been at Google for long, nor were they such hotshots).
Lastly, I believe that none of the interviewers asked about any particular
coursework or GPA, but the internal Google recruiter asked for either a
transcript or a flat-out GPA to send to the committee (which reads the
interviewers' comments and decides whether or not to continue further with
the interviewee).

------
dkarl
_Coding on the spot might seem surprising to those outside of the software
industry, but it’s standard practice._

Sounds like wishful thinking to me. At least _I_ wish it were true. I've been
involved in a lot of interviews at my company, and I'm the only one who ever
asks anyone to write code.

~~~
Periodic
I was asked to write very little code for my current position. One of the
questions involved being presented with some code and figuring out what it
did. I didn't realize how apropos that question was at the time.

I just finished a series of 10 one-hour interviews for contractors, for an
enterprise Java web application contract lasting about 6 months. Only 2-3 of
them could write correct Java on a whiteboard for simple questions. We had
three who weren't even close to valid Java syntax. Code quality seemed
directly related to algorithm quality: those with better syntax also knew
best how to solve the algorithms, and those with terrible syntax were way off
on the algorithms.

Up until we asked these candidates to write code on the board, a few of them
sounded like great hires. They could rattle on about design decisions and
enterprise web application stacks, about configuration and organization. But
when it came to the white board at least one of them just stood there and left
it mostly blank.

Some of the positions here have had interviews where the candidate was
expected to write code, but at most one interviewer actually asked to see
any. And even then the managers don't seem too worried when I tell them the
person can't code, because everyone else likes them so much.

~~~
blacksmythe
Did you have a chance to rate the performance of the candidates that were
hired that had trouble writing code at a whiteboard in the interview? Were any
of them productive in a different setting?

I am more inclined to give people a chance to work at a terminal (with
internet access) instead of a whiteboard.

~~~
ghshephard
It's rare that someone who can write excellent code on a whiteboard can't do
so at a terminal. The goal of an interview, in my experience, isn't usually
to find a way to let the candidate shine, but to find the best person for the
position.

If the candidate draws a blank writing code on a whiteboard, that indicates
poor whiteboard skills - a pretty important skill to have in a team
environment.

------
cdibona
So this is a good post from Gayle, but I would add that the times the
committees I've been on cared to see grades were when a candidate was on the
bubble between hire and no-hire and was within 2 or 3 years out of school.

So grades matter, but not forever...

------
jamesaguilar
Yeah, the Business Insider questions were certainly not real. I actually got
several questions about them from family members and friends after that
article ran. The reality is at once much more pedestrian and more interesting
(if, like me, you consider algorithm or system design questions more
interesting than brain teasers).

~~~
andylei
I agree with the general idea of the article, but I actually got some of
those exact Business Insider questions during my interviews with Google.
Obviously, some of them are probably fake (like the manhole one), but there
are a couple that are definitely real.

edit: typo

~~~
Gayle
@andylei Yes, some do happen to be real. The point is that if several are
definitely fake, why believe the rest?

For example, take this one: "A man pushed his car to a hotel and lost his
fortune. What happened?"

Come on, no one was asked that. And even if they were, the hiring committee
would take one look at that and throw out the feedback.

As for some of the other questions, like "How much would you charge to wash
all the windows in Seattle?": (1) This question seems fishy just because it's
about Seattle. Yes, Google has a Seattle office (where I worked), but it has
very few PMs. It's far more likely that this question was asked at Microsoft.
(2) This is actually a very standard consulting question. It's really a
totally fair problem-solving question, as the accuracy of your final answer
doesn't matter.

The Business Insider article is designed to be link bait - reprinting fake
questions and building off the "OMG IT'S GOOGLE" stuff. The legitimate
questions on this list are actually totally normal for tech and consulting
companies.

------
jerdfelt
In my experience interviewing at Google, the article is accurate. Granted this
was a handful of years ago and things may be different now, but I wasn't asked
brain teasers and I was asked to code on the white board.

Where my experience does differ slightly from the article is their use of
academic records. They didn't care about my GPA, mostly because I dropped out
of college very quickly and don't have a GPA to care about, but they did care
that I didn't have a college degree, and used that to justify the offer I was
given.

------
taylorbuley
Found this incredibly interesting: "...let’s look at the very widely
circulated “15 Google Interview Questions that will make you feel stupid”
list. You want to believe these are real questions, given that Business
Insider feels like such a reputable source. Except that they didn’t get this
list from a direct source. They borrowed their questions from some blogger (I
won’t link back here) who was posting fake questions. Now, I don’t know that
said blogger was intentionally lying – he probably borrowed them from someone
else. Whatever the original source is, these questions are fake. Fake fake
fake."

~~~
redthrowaway
I'm not at all surprised that bloggers, then media, recycle garbage stories
that get pageviews. "Bullshit laundering", as reddit recently coined it.

------
joshklein
Google's online application asks for your SAT scores. That always struck me as
odd.

~~~
magicalist
where is that? I see GPA but no SAT request, even if you apply as a current
student/new grad:

<http://www.google.com/jobs/application/>

~~~
Periodic
I just went through the application process (declined by committee) over the
last two months and the application did have a space for GPA, but I was never
asked about SAT or GRE scores as far as I can remember.

------
silverlake
I recently got rejected by Google. Most interview questions were straight out
of an algorithms book. Like write an interval tree, a hashtable, some dynamic
programming question, etc. I probably failed because I misunderstood a trivial
question as something far more complicated. As for the interviewers, half were
quite nice but the others seemed bored and distracted. I won't try again.
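Of the exercises listed, "write a hashtable" is about as standard as they come. As a hypothetical sketch of what such an answer might look like (a minimal separate-chaining table, not what the interviewer actually asked for):

```python
class HashTable:
    """Toy hash table with separate chaining: each bucket is a
    list of (key, value) pairs; collisions just extend the chain."""

    def __init__(self, nbuckets=16):
        self.buckets = [[] for _ in range(nbuckets)]

    def _bucket(self, key):
        # pick a bucket by hashing the key modulo the bucket count
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:              # overwrite an existing key
                bucket[i] = (key, value)
                return
        bucket.append((key, value))   # otherwise append a new pair

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default
```

An interview follow-up would typically probe resizing and load factor, which this sketch deliberately omits.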

------
ajg1977
I wonder about the technical knowledge of this person who's writing articles
about hiring technical people.

"Explain the significance of 'dead beef'" is a perfectly valid, if somewhat
unimaginative, technical question.

~~~
InclinedPlane
No, it's not. It's a trivia question. Nobody uses such techniques any more
because looking at raw memory dumps is not the right way to debug these days.

~~~
strlen
Using a "magic marker" like this doesn't imply looking at a raw memory dump
(which, incidentally, is still needed in several areas Google is involved in,
e.g., kernel hacking, embedded devices). For example, I used the technique
myself when debugging an issue where our processes morphed into un-killable
zombies. I was dealing with crash dumps, walking through the process list in
the kernel (it's a doubly linked list) in crash/gdb (looking at a kernel
memory dump obtained via netdump from a production server that manifested
this condition) so I could find a process that had acquired a specific lock,
to test whether a theory I had was true. I wasn't (and still am not) at
Google, and I don't claim any expertise in actual kernel development: it was
a bug a customer was experiencing and it needed to be fixed.

Lastly (and I am completely going out on a limb here, without any concrete
information), this could also be used to gauge team/culture fit at Google.
It's a good way to see whether you'd fit a team that spends a huge amount of
time finding and fixing low-level bugs: it's fun to read about the
infrastructure Google has built, but the actual process of building it may be
very frustrating to some. Not being a fit for such a group doesn't imply
rejection; it could simply mean you're brought back for a different round
with another set of people.

~~~
InclinedPlane
If you want conformity that's fine. What would you think if your company
instituted a very specific dress code? Requiring specific trivial knowledge is
just as restrictive and constraining.

If you want to find engineers with talent, then you need to do a lot more
legwork in plumbing their knowledge, experience, and skill than merely
ticking off a checklist of shibboleths.

P.S. Personally I love hacker jargon and lore - I like knowing about scratch
monkeys, core memory, the usenet cabal, and even 0xDEADBEEF - but I wouldn't
take ignorance of those things to indicate ignorance of engineering
fundamentals or a lack of passion for software development.

~~~
strlen
> If you want conformity that's fine. What would you think if your company
> instituted a very specific dress code? Requiring specific trivial knowledge
> is just as restrictive and constraining.

It's not about hacker folklore, it's about a debugging technique.

No one is implying that this question is asked of every candidate - that's
why these lists are useless. It's also not the way I'd ask a question about
memory markers (I'd ask how candidates go about debugging such a problem and
how the tools they use, e.g., gdb and valgrind, work), but it's a fair
question to ask _those claiming experience with or interest in low-level
development_, much as it's fair to ask candidates who claim experience with
data mining what the kernel trick is (...though it would be misleading to
ask a Linux kernel hacker that question, as "kernel" would mean something
else to him, much as dead beef would mean something else to a data mining
guru).

~~~
InclinedPlane
Keep in mind, I'm responding specifically to the idea that _"[explaining] the
significance of 'dead beef'"_ is a valid interview question. Have you done a
survey of memory-debugging experts to determine how common knowledge of "dead
beef" is?

It's not a knowledge question, it's a trivia question. A shibboleth. It is
neither a necessary nor sufficient pre-condition for knowledge of memory
debugging techniques and it probably has about as much correlation with them
as asking whether they have read The Lord of the Rings.

~~~
strlen
> It is neither a necessary nor sufficient pre-condition for knowledge of
> memory debugging techniques and it probably has about as much correlation
> with them as asking whether they have read The Lord of the Rings.

It's a hexadecimal number - that should be quite obvious. It's a notable and
easy-to-notice one. Others may have used a different one (I used 0x12341234
for the task where others have used 0xdeadbeef), but it should be obvious
what you'd use it for.

Is it one of the better questions? No. Is it merely a shibboleth or a piece of
trivia? When asked to low-level hackers, no.

~~~
InclinedPlane
When shorn of hexadecimal connotation as in the original question what value
is there to it? If I ask someone verbally "what is the significance of dead
beef" should I put any value in their either having already learned about
0xDEADBEEF or in their ability to appreciate that it could be a 32-bit word
aligned hex value?

If someone were to change the question to "what is the significance of bad
food?" would it be considered equally valuable?

~~~
strlen
O is not a hexadecimal digit - and even if it were "BAD F00D" it still
wouldn't be a full 32-bit word - and it's also not used by existing libraries
(libgmalloc).

Cafe babe or dada dada would work just as well as dead beef.

In short, either you are familiar with magic numbers or you're not. If you're
not, you're certainly not qualified to work on anything really low-level and
may not be qualified to do C/C++ development in userland either (without
strong evidence to the contrary).
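To make the technique under debate concrete - a sketch in Python rather than the C it would really live in, with invented helper names - the idea is simply that freed memory gets filled with a conspicuous pattern like 0xDEADBEEF, so use-after-free shows up as instantly recognizable garbage in a dump:

```python
import struct

DEAD_BEEF = 0xDEADBEEF  # conspicuous 32-bit fill pattern

def poison(buf):
    """Fill a bytearray with repeated 0xDEADBEEF words, the way an
    allocator might poison memory on free. (Illustrative only.)"""
    word = struct.pack(">I", DEAD_BEEF)   # big-endian: de ad be ef
    for i in range(0, len(buf) - len(buf) % 4, 4):
        buf[i:i + 4] = word

def looks_poisoned(buf):
    """The check a person eyeballing a dump performs: is the region
    still entirely the free-pattern, or has someone scribbled on it?"""
    word = struct.pack(">I", DEAD_BEEF)
    return all(buf[i:i + 4] == word
               for i in range(0, len(buf) - len(buf) % 4, 4))
```

The whole point of choosing 0xDEADBEEF over, say, zero is that it is pronounceable, memorable, and essentially never occurs in legitimate data, so it leaps out of a hex dump.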

------
highwind81
Two questions pop to mind from when I was interviewed by Google:

1. Given n computers, each storing some numbers: find the median of all the
numbers distributed among those computers. The catch is that no single
machine can store all the numbers. (Hope this is clear.)

2. Given a text file of logs, write a threaded program that scans the logs
and produces some statistics. (My answer was to create n threads that each
read every nth line.)

None of those brain teasers were asked of me.
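One standard approach to the first question - assuming integer values, and not necessarily the answer the interviewer wanted - is to binary-search the value range, so that only small aggregates (counts, min, max) ever leave a machine:

```python
def distributed_median(machines):
    """Median of integers spread across machines, where no machine
    ever ships its data: each round, every machine reports only a
    count. O(log(value range)) rounds, O(1) data per machine per round.
    Returns the lower median when the total count is even."""
    total = sum(len(m) for m in machines)
    k = (total - 1) // 2                      # 0-based rank of the median
    lo = min(min(m) for m in machines)        # each machine knows its min/max
    hi = max(max(m) for m in machines)
    while lo < hi:
        mid = (lo + hi) // 2
        # each machine counts how many of its values are <= mid
        count = sum(sum(1 for v in m if v <= mid) for m in machines)
        if count > k:        # at least k+1 values are <= mid: median <= mid
            hi = mid
        else:                # fewer than k+1 values are <= mid: median > mid
            lo = mid + 1
    return lo
```

The search converges on the smallest value whose rank reaches the median position, which is always an actual data value.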

------
sabj
You can do useful mental exercises for Google interviews, or other
interviews, by working through "puzzle questions" and similar problems. But
in a world of limited time, this might not be the best use of your prep time.

A good source for the history of these questions and interview style, as well
as a bevy of sample questions, can be found in "How Would You Move Mt. Fuji?"
[http://www.amazon.com/Would-Move-Mount-Microsofts-Puzzle/dp/...](http://www.amazon.com/Would-Move-Mount-Microsofts-Puzzle/dp/0316919160)

~~~
ianl
The best exercises are to look for ACM Programming Contest Questions.

Here are some programming contest questions from 2005 from an Atlantic Canada
programming contest <http://projects.cs.dal.ca/apics/contest05/>

------
meowzero
This article makes Google's interview process seem more bearable. I always
found the Google interview "horror stories" a deterrent. Perhaps that's a
good thing for Google, since it already filters out certain people. But I
always avoided even considering applying to Google because I didn't want to
waste a whole day getting grilled on brain teasers and academic CS concepts
I learned 10 years ago.

------
dspeyer
Google interviewees are asked to sign an NDA regarding the interview
questions, mostly so that no one can look smarter than they are by studying
those specific questions. Some candidates do forget it and leak information,
but since bullshitters sign nothing, the bs quotient is high.

~~~
Nitramp
I actually wasn't asked to sign an NDA (in Switzerland). Maybe someone just
forgot that, or they realize that creating a legally binding NDA is quite hard
in Europe.

------
ottbot
I've heard these types of brain-teaser questions - not coding or algorithm
questions - are still common in the finance sector when hiring "quant"
developers.

Anyone know the extent of this?

------
j_baker
I don't doubt that hiring committees don't look at GPA. It's usually HR who
does that sort of thing.

------
yanilkr
Spending time to build something useful > (far greater than) spending time
preparing for something so complex that only a few people can crack it.

~~~
moultano
Exploration vs. exploitation tradeoff: <http://en.wikipedia.org/wiki/Multi-armed_bandit>

------
gcb
What about all the plugging for that interview site, which suspiciously only
has Google account login? Did nobody notice that?

Also, Glassdoor has better content.

~~~
brown9-2
Are you referring to careercup.com? The author of this article is the owner of
that site.

