Hacker News
Google says tricky job interview brainteasers were ‘a waste of time’ (venturebeat.com)
81 points by grzkap on June 21, 2013 | hide | past | favorite | 22 comments

The software industry is odd. Things change so fast, and new things come out so often, but at the same time bad ideas masquerading as universally applicable best practices get adopted and linger forever. Google and Microsoft favoring this style of interview has given it incredible weight. I fully expect some interviewers will be giving this kind of interview in 10-20 years, still thinking they are doing the right/safe thing.

I think the problem with the software industry is inertia. Someone well respected says everything is an object (or use dynamic typing everywhere, or functional programming, or node.js) and the majority will blindly follow without ever thinking about more suitable alternatives.

> I think the problem with the software industry is inertia.

What you say here and what you say below actually disagree with each other. Inertia would be developers never changing their tooling and never following the next shiny technology.

If anything, I think a lot of programmers could do with a bit more inertia :)

> Someone well respected says everything is an object (or use dynamic typing everywhere, or functional programming, or node.js) and the majority will blindly follow without ever thinking about more suitable alternatives.

I completely agree with this statement. Our industry is always quick to follow the leader :)

If you two think so, I'd be a fool to think differently!

I think people don't know how to think and evaluate. I tend to break it into goals/strategies/tactics, though there are other useful breakdowns. I.e., start by asking yourself "what is my goal?", and it is not to ask graph algorithm questions, to use Django, or whatever. Once you have your goals, you can with hard work come up with strategies. From there flow tactics. Then this gets communicated and everyone latches onto the strategies and tactics, never asking if the goals have changed, or are inappropriate for your use. Google wants to hire a bunch of young people who know graph theory and are willing to do anything to work there (they have graph theory problems, after all). Asking grad-level CS questions is a way to get those people. So every startup follows suit even though they need something completely different, and Google persists even when they need something different.

OO solves some things well, and is an appropriate strategy for some situations. It does not follow that you should use OO for everything. But, non-thinkers learn 'OO good', or 'CS questions good', and just run with it, never thinking about why those things are good, and in what context.

Agile is everywhere these days. My company has adopted it wholesale, even though everything we do is extremely schedule oriented (due dates that are unchangeable), and involves heavy coordination between many moving parts. But want to use a Gantt chart or a frigging calendar organizer? Get mocked, censured, and shot down for not being a team player. Agile, Jira, and all that has become the goal, and you are not allowed to think about whether it makes sense or not in a given situation. Decisions are judged on whether they are "agile", not on whether they are the best way to accomplish something. So we fumble about with a 'priority' list that makes no sense (everything must be done, there is no priority) and that completely obscures the very real relationships (scheduling and otherwise) between all the items. Utter, complete madness.

Umm, are you sure you replied to the right comment and story? I've read what you posted twice now and I can't see how it has any bearing on what I wrote ;)

Very true. Being the best in an industry can give you the illusion you're the best at everything. Even in areas where you have no basis for being the best.

Once Microsoft, now Google, by virtue of their high salaries, prestige and market leadership are virtually self selecting in the candidates they attract. Of course there are plenty of poor candidates, but both companies essentially hire based on brand, either that of the candidate's past employers or alma mater.

I'm glad this useless technique is in the recycle bin of history. Now if only they started making people write code in interviews then we'd be getting somewhere (maybe they are? are they?)

> new things come out so often

New names for old things come out often. New things are pretty rare.

New concepts, yeah, you're right. But we are endlessly barraged with new tools, frameworks, libraries, etc., all of which have their own details to learn; that's the "new" I had in mind.

Discussed previously at https://news.ycombinator.com/item?id=5911017 (submitted 23 hours ago, 265 comments)

I second this magical de-duping technology called "corin_".

If there's a better way to keep dups from the front page, someone please tell me...

I hope they will soon see the difference between Software Engineer and Algorist (TopCoder(tm) sportsmen)

The whole Google (and silicon valley) interview process is a head scratcher to me.

So, I'm 46 years old, kind of old compared to their usual hire. Their recruiter (they contacted me) suggested some books, and about a month's study time to be prepared for the test. Based on what I've googled around the web (which, as others have pointed out, might be mythological), the test questions all seem to come directly from the exercise pages of Knuth.

What's my beef? Not Knuth; it's a great work.

Okay, first of all, why do I have to study a month? Isn't that remarkably indicative that this stuff is not used day to day in programming? I don't spend my days programming red-black trees. Not because I'm dull witted, but because 1) there are great libraries available already, and 2) I work in a different problem domain. I do a lot of vision processing, math, physics, and so on. I live with textbooks cracked open on my desk, and several google panes open. As I write this I have Theoretical Mechanics and Pearl's Heuristics on my desk. I'm inventing new ways to filter very noisy data, and so on. Lots of AI, heuristics, etc.

Bellman-Ford algorithm? I know of it. Haven't touched it since grad school. If I need it, or need to select the right shortest-paths algorithm, the first thing I'll do is go to Boost or LEDA. It's more important that your code be right than fast. Only then, if it's not fast enough, will I roll my own, at which point I'll pull Cormen off the shelf and code it up, and the Boost code will serve as my unit test code. It's not a big deal to code an algorithm. And I'm not putting it down; it's fascinating and interesting to talk about worst-case performance, how to distribute it, and all that. But it's not the work I do. Want me to think originally about an algorithm (coding Prim's is not original, after all)? Fine, ask me about the algorithms I'm working on, and I'll go on for hours. I'll break your brain, and you'll have a good measure of my algorithms work.
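To put some weight behind "it's not a big deal to code an algorithm": once you've pulled Cormen off the shelf, Bellman-Ford is roughly a dozen lines. Here's a rough Python sketch (the edge-list representation and function name are my own choices for illustration; Boost's version is still what I'd trust in production):

```python
def bellman_ford(num_vertices, edges, source):
    """Single-source shortest paths with negative edge weights allowed.

    edges: list of (u, v, weight) tuples; vertices are 0..num_vertices-1.
    Returns a list of distances from source (inf if unreachable).
    """
    INF = float("inf")
    dist = [INF] * num_vertices
    dist[source] = 0
    # Relax every edge |V| - 1 times; shortest paths have at most
    # |V| - 1 edges, so this is guaranteed to converge.
    for _ in range(num_vertices - 1):
        for u, v, w in edges:
            if dist[u] != INF and dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    # One more pass: any further improvement means a negative-weight cycle.
    for u, v, w in edges:
        if dist[u] != INF and dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist
```

Which is exactly the point: reproducing this on a whiteboard measures familiarity, not the ability to pick the right algorithm for a real problem.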

So, this is the real crux. I'm supposed to spend a month preparing for this and they won't tell me what job I'm interviewing for. I have to go through possibly more than one phone screen, then go in for at least one round of interviews, have them spend several weeks mulling whether I am worthy, and then, only then, will they try to match me to a position. No match? "Thank you for your time, try again in six months." A match? Expect another round of interviews. All of that against the backdrop of a very high false negative rate - you can be perfect for them, and still have a very high probability of being rejected. The recruiter kept emphasizing the very large acceptance rate they have. Of course they do. Who but someone passionate about working at Google would subject themselves to all that? I don't care about working at Google, I care about working on interesting problems. And they have them, but they have a lot of stuff that I have no interest in doing, either.

I have a crazy idea. I know I'm a very good programmer because I have been thrown against very hard problems, solo usually, and solved them. I've invented new products and taken them to market. My VP reads my writings (I keep lab notebooks) and scours them for IP and patent ideas. (more back patting goes here...). You get the idea. So, I have the crazy idea that the best predictor of on the job performance is ... wait for it.... on the job performance. It's crazy, but it just might work!

Seriously. Give somebody a programming assignment to make sure they can code a for loop. Preferably have it be something you can talk about - how to make the API robust, how you would distribute or scale it, how to make it exception safe, how to report errors, and so on. Give them a chunk of mediocre code and have them give it a code review. Give them a design problem. Probe their knowledge about a subject they claim to know and live. Follow up with references and make sure that what they said they did in their resume was what _they_ did, and not what their teammates did. And, did you enjoy talking with them? There's your interview, and by and large it has worked for me. Don't bother with routing algorithms unless that is the job (I do recognize Google does some routing!).

I've talked to so many high achieving graduates of places like Stanford that couldn't code a for loop, that couldn't make reasonable engineering decisions about how to solve a problem, but could blithely go on with "Oh, I'd do a Hackman-Stein reduction, run that through a Bidderfinch Tickler, invert the Fliberty relations...". Great theoreticians, probably terrible about keeping a start up thriving.

I harp on this because people just aren't that good at whiteboard solutions. I recall a chem professor - he would make some statements, ask a question, and there would be silence. Then he would berate our intelligence. Except the question was something that took a Nobel prize laureate to solve originally. No one can solve this stuff in a few minutes - you are selecting for actors. "Oh, let me see if I can think this through...." and you pretend like you are inventing Prim's algorithm, or binary tree delete, or whatever. All these questions were once original publications in the literature. They are HARD, made easy for us by repetition and familiarity. Any fool can tell you the structure of DNA - discovering that structure was a "bit" harder. You will not be testing my ability to discover DNA by asking me to 'derive' the structure on your whiteboard.

I guess that was too long and rant-y, but I really wonder what people are trying to measure in their interviews, because by and large I walk away feeling that they have no idea about my skills and limits. I get job offers when they should run away, and no offer when I would completely change their business. All based on interviews that seem entirely disconnected from trying to discover how I perform on the job. You want to be my surgeon? Fine, first do a textual interpretation of this passage from Shakespeare...

So long as Google has a large pool of applicants they can afford to interview that way, I guess. I wonder what they give up? I do work at the level of their self-driving car, and refuse to even phone interview with them. But, that's fair enough, because with my attitude I'm probably not a good fit. They are looking for cogs - do some ad banner work, move over and gmail it for a while, and so on. Yawn. Everyone knows their hashing and tree balancing. No one has been vetted for being pragmatic, for running a project, getting a product out the door (they have so much money they can afford to meander and not ship), turning out high quality code and designs, documenting their work, creating original ideas, and so on.

So, perhaps their interview process is fine for them. But if you are a startup surviving on fumes, don't emulate a company with billions in the coffers that can afford to meander and fail to ship products for months. Patting yourself on the back for smarts is fun (I did it above!), but it doesn't ship products. Interview for achievement. Us "old guys" be smart too, even if we did forget our Floyd-Warshall, and thus can't pretend to derive it for you, from scratch, on a whiteboard.

This reminds me of a time when I didn't do well on an interview that involved data structures. I usually review my old data structures and algorithms book prior to an interview, but I didn't this time because I was very busy.

A few months later, I needed to do a lot of tree traversal for an algorithm. I studied up on it and started to feel embarrassed about the questions I had missed. Now that I'd reviewed tree traversal, these questions seemed less esoteric and more fundamental.

I mentioned this to a coworker, saying that I could see how an employer that would need someone with knowledge of trees and traversal algorithms wouldn't have wanted to hire me. His response was insightful. "Actually," he said, "what you showed is that as soon as it came up on a real project, you were able to do the research and implement the algorithms."
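And the coworker's point holds up because, once re-loaded, this stuff really is tiny. An in-order traversal of a binary search tree, for instance, is a handful of lines (a toy Python sketch; the Node class here is made up purely for illustration):

```python
class Node:
    """Minimal binary tree node for the sketch."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    """Yield values left-root-right; for a BST this is sorted order."""
    if node is not None:
        yield from in_order(node.left)
        yield node.value
        yield from in_order(node.right)

# A small BST:   2
#               / \
#              1   3
root = Node(2, Node(1), Node(3))
assert list(in_order(root)) == [1, 2, 3]
```

The gap between "can't recall it cold in an interview" and "can research and ship it on a real project" is a month at most, and usually an afternoon.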

The problem is, how does the employer know you're one of the people who can actually do this? I guess it comes down to confidence. Google accepts that you probably don't know certain algorithms like the back of your hand, but they want to be sure that you can get up to speed with them in a month or so. So they tell you that you should spend a month reading and front loading algorithms and data structures. Then you interview, and they get to see how good you are at this when it is fresh in your mind.

Anyone can claim they'd be able to learn, I guess this is how they force you to prove it.

BTW, I'm only writing this to explain why they might want to do it, not why it is a good process. I completely understand how it feels from your point of view - I've been there too. Unless I have a particular problem I need to solve, there isn't really any reason for me to reload certain data structures and algorithms into my head. An interview could be a reason, but if I don't get the job, then I just spent the month studying for nothing.

It really is nothing, because this is stuff I've already learned. To me, this would be like having to re-learn difficult integration-by-parts techniques from calculus. I've done it more than once, I know it's there, but I don't keep it all front-loaded.

I'm at the point where I'm not sure I'd do this again. The final straw was an 8 hour take-home assignment where the company didn't get back to me for a month and then had the recruiter call me to say they'd decided not to pursue this anymore.

Google should do what it wants, of course, and if people desperately want to work there, they can get away with this sort of interview process. A lot of people (including me) just aren't willing to put in a full month of study just to get an interview. If I was certain to get the job, and the study is part of necessary preparation for the type of work I'd be doing, then sure... (but then why is this completely uncompensated?)

My guess is that google is fully aware that they will lose some people who would have been hires, and that they are ok with this tradeoff (i.e., they are willing to tolerate a high incidence of false negatives to avoid the harm of a false positive). I respect that, though it does rankle when they also complain to congress about the shortage of good programmers when this shortage is almost built in to their hiring process.

Yes, they are vocal about their false negatives, and they are happy with it. As long as the resumes keep flooding in, and they are hiring who they need, why do they care about rejecting qualified people, or wasting their time?

OTOH, I care about them wasting my time.

"The problem is, how does the employer know you're one of the people who can actually do this" - I addressed this, I guess unclearly. Ask me. Ask me to describe my algorithms. You'll know very quickly if I am misrepresenting myself or not. You'll also see if I get passionate about my work. If I have 37 ideas that I haven't had a chance to develop yet (I hypothesize that any good brain develops more good ideas than it has time to pursue). Did the code ship? That is not ironclad proof, as I spent 4 years at a company and at the end the project was cancelled because of a political shift in Washington (this was defense work), but it can be a useful measure. I know people that spend forever 'tweaking' algorithms, but never producing anything that generates value for the company. Those people are largely useless unless you have a research branch, but would (and do) pass a Google interview with flying colors.

I've been told that they do not care about your resume, and that you will not be questioned on it. OTOH, expect red-black tree questions, scaling questions, and so on. But, they selected me. I have not worked in that area of CS. Ask me what I have been doing for the last 25 years, give me some kind of test to make sure I did what I said I did, and you'll quickly know if I can help you out or not. If I can code probabilistic graphical models and make them work in real products (which is what I'm doing at the moment, more or less), I can re-learn graph theory if I need to. On the other hand, if I can learn book answers to Dijkstra's algorithm and code shared_ptr on a whiteboard with no errors, I've proved very little about my productivity and creativity (the latter is something I was asked to do by a Chicago HF trading company, and I 'failed' because I didn't do it the 'boost' way, but spent time talking about all the tradeoffs one could make in such a design and made different choices than Boost).

And that squares with the conclusion of the article - talk to the person about what they did, how they did it, how they dealt with problems. Trouble is, I see no evidence of that in Google's actual interview process. You better be 20 and fresh out of grad school, or willing to grind through a lot of CS theory that you haven't had to do for a long time.

I know this is google, and people there are smart, but there is a chance that you're overestimating the interviewer's ability to know quickly if you have these skills. If they're very unfamiliar with the problem space, they may lack confidence in their ability to interview you. On the other hand, if they know the answers in advance (or at least understand the problem very well), they'll be better able to gauge your performance. So they go with red-black trees.

Not your skill or ability, as you pointed out, just your performance on some narrowly chosen questions from an area that the interviewer understands well in advance.

I've thought about false negatives a lot, but you've raised a good point about the high risk of false positives in this sort of interview culture as well. I will grant that it isn't easy to study for and pass a data structures grilling that can last all day (or longer), even if you have studied for a month. Anyone who can do this is at least capable of writing and understanding fairly complicated code.

But you are absolutely right that this may lead to "false positives" as well... they may simply be hiring people who are great at studying for and passing algorithms quizzes. This certainly doesn't mean they'll be successful. In fact... well, you've essentially pointed this out, but what does it say about a motivated and creative person that he or she will drop everything in order to study for something with only passing relevance to the projects he or she would like to be pursuing?

Well, desperation, I suppose, could be one of the motivating factors. But how many top programmers are desperate? Many would be interested in working for google, but aren't desperate enough to put aside the interesting things they'd like to be working on in order to spend a month preparing for an exam on red-black trees.

Maybe this article suggests google may be coming around to realizing that they are doing some harm to their hiring efforts with this approach to technical interviews?

Google never, ever used brainteasers in interviews (unless you count coding exercises - which you shouldn't).

More interesting was the fact that they have given up screening on academic credentials.

"On the hiring side, we found that brainteasers are a complete waste of time" -Laszlo Bock, Google's senior vice president of people operations

How did they arrive at this conclusion, having never, ever used brainteasers? Why did they allow this perception to persist for so long (and apparently continue to play along even now)? Why do the websites like Glassdoor, in which Google interviewees share the questions they were asked, contain hundreds of brainteaser and guesstimate questions?

From an ex-member of Google's hiring committee in 2010:

Brain teasers? Banned. (Of course, everyone has a different definition of a brain teaser.) If an interviewer were to ask a candidate a brain teaser, despite the policy, the hiring committee would likely disregard this interviewer’s feedback and send a note back telling the interviewer not to ask such silly questions. ... Now, let’s look at the very widely circulated “15 Google Interview Questions that will make you feel stupid” list. You want to believe these are real questions, given that Business Insider feels like such a reputable source. Except that they didn’t get this list from a direct source. They borrowed their questions from some blogger (I won’t link back here) who was posting fake questions. ... How can you tell that they’re fake? Because one of them is “Why are manhole covers round?” This is an infamous Microsoft interview question that has since been so very, very banned at both companies. I find it very hard to believe that a Google interviewer asked such a question.


In my experience (10+ Google interviews) there's been a lot of brain-exploding algorithmic questions but nothing that would be called a traditional brain teaser.

Try and find a first-hand, non-anonymous account of someone being asked brain teasers at a Google interview.

Not exactly. Google has never advocated these kinds of questions, but they also allowed engineers to ask whatever they wanted (within certain parameters).

I know of one guy who did get a "cannibals and nuns crossing a river" question, and others who got some complicated thing about how much to bet on the outcome of drawing black or white stones from a sack.

But the majority of a Google interview has always been (to my knowledge) more practical.

> More interesting was the fact they have given up screening on academic credentials.

I agree. I've always thought a three-tier screen, based on how quickly you can solve problems, the quality of your solution, and personality (ability to get along with co-workers), would more accurately reveal the best candidates.


I'm proud of them for reflecting inward.

The HN mods should once again change the headline to point at the nytimes article as the original source, just like the last time this was brought up, despite the fact that for many this was the interesting part. This is clearly blog spam and link bait.

Not changing it would be hypocritical at best.
