Yes, that happens quite often. The really good interviewers know that and understand that their job is to help the candidate present their skills. That means asking additional questions, rephrasing the question, patience, empathy and, above all, taming premature judgements.
It's really hard, but it can and should be done if the interview is to mean anything.
If you just take their initial response and move on to the next question, you fail as an interviewer. You've got to go deeper... that's also the way that you can quickly spot a bullshitter or someone making a proverbial mountain out of a molehill project.
I think a lot of candidates think they're immediately screwed if they don't have a compelling answer to every question. It's not seen that way on the other side of the table, at least not when I'm there. Often the interviewer is just trying to find a foothold he can use to really start digging into something. Good interviewers throw out a bunch of questions that are designed to get you talking until you start talking about something they can sink their teeth into. Bad interviewers have no idea what to do and throw out stupid, random questions that sound like job interview questions to them, accept your answer, and then do it again for the entirety of the 30-60 minutes. At the end of your interview, they'll make the same decision they would've made if they'd only seen your picture and been told they had to decide from that (unless you managed to offend the interviewer during your responses ;) ).
Any nervousness I had as a candidate was permanently resolved after I started interviewing people and seeing how others reacted to interviews. Most interviewers are not disqualifying you for all the stupid crap you imagine. If you're good and they pass on you, they're either that incompetent, in which case you don't want to work for them anyway, or they have many good candidates and the choice then comes down to unknowable differences in perception, like perceived likeability or other superficialities.
In that situation, if someone asked "tell me what special activities you've done in the past month", I might point to that one time where I washed the dishes, whereas my wife, not seeing dish washing as anything impressive, might say she's done nothing special. Even though I only washed the dishes once, I come off looking better there since I have an example I can point out.
Simply saying "I did my job every day" isn't good enough. However, not doing your job except on one occasion makes that one occasion seem special.
What actually happened was you just sat there browsing reddit until the deadline had almost arrived, then worked like crazy trying to get the site finished. Normally your team wouldn't work contracts like that (which means normally your team would do their job properly). And it was a challenge and a test of your skills, because normally you don't do jack squat at work. In fact, the reason you're looking for a new job is because if you don't quit your current employer will fire you.
Now you have a great story to tell an interviewer about how you completely fail to do your job properly. But by dressing up the words a little bit and using some creative euphemisms, you can make it sound like you went above and beyond. And it's not really a lie, because you did go above and beyond... eventually. You just didn't have to do that if you had worked properly. All you have to do is leave out the fact that you're a massive slacker.
So you get the job because you told an awesome story of how devoted you are, and all you had to do was leave out one single fact (that the deadline was only tight because you procrastinated). Meanwhile your really awesome coworker tells every single fact 100% true and you get the job instead of him, because your story sounds better.
If the interviewer follows up with all the team members who were actually there it wouldn't work, but I kind of doubt this happens a lot.
For example, one could make the argument that they favour people with good situational recall, because they're better able to think about occasions where they've done something that applies given the criteria specified. Or that it favours people who have depression, as they tend to be more self-aware and self-critical, so have a larger bank of answers readily available.
Unless one is talking about something from many years prior, these situations are still in memory. The question isn't meant as an approach to rehash a particular solution; it's meant to lay the groundwork to explore decisions, options presented, etc.
The worst thing you can do is unstructured interviews. Unfortunately, that's exactly what almost every company does: they create ad-hoc committees of workers and have them each design an interview, often on the fly, for each candidate.
Everything you do to add structure and remove degrees of freedom from interviewers will improve outcomes. Every company that hires more than one tech worker in a year should be working on this problem, and hopefully documenting their results.
Our structured interview process differed from Google in that we (a) completely standardized interviews and (b) designed the questions in those interviews to minimize the need for free-form follow-up questions ("Tell me about a time..." and then digging into the answer to mine conclusions). We tried to flip the script: instead of having the interviewer ask followups of the candidate, we put candidates in a position to ask lots of questions of the interviewer, within a framework that generated lists of "facts" that we could record and compare to other candidates.
Example: "I am going to describe a trading system in broad strokes. I know lots of things about how it works, and I want you to ask me enough questions so that you can roughly diagram it on a whiteboard, capturing its components." Then, later, "I want to have a conversation where we rank these components in order of sensitivity to latency". The interviewer captures the answers, but the back-and-forth is dominated and mostly directed by the candidate.
The problem I see with the Google strategy is that it combines interviewer-directed free-form questions with an open-ended and somewhat fuzzy question that really evaluates less the candidate's ability to do the work and more their presence of mind during the hostile, stressful interview process.
Things I positively responded to in this article:
* The notion that we come to snap judgements about candidates within minutes of the interview and are then hostage to confirmation bias for the rest of the interview
* The effectiveness of combinations of evaluation techniques; for us, a combination of work-sample testing and scripted interviews were extremely effective.
I've been out of work for 11 months. I've been on a large number of interviews, and it seems like after that many failed interviews, the problem is probably with me.
The most frustrating thing is the lack of consistent information about why I didn't get hired. Some people said I had a good theoretical grasp but couldn't code. Others said I could code really well but didn't have the grasp of fundamentals they wanted. Most stay silent.
I've grown to hate and resent myself.
All that to say; I'd love to conduct a few mock interviews with you. My contact info is in my profile.
Do it. Contact me. Seriously.
Edit: this offer extends to anyone that wants help. I am a front-end engineer so that means you'll get more mileage out of me for JS, CSS, and HTML; but I am totally willing to help with the subjective side to interviewing as well.
(And you also, papercruncher)
I interviewed more people than that per year while I worked at Google, and I'm doing even more now that I'm at a smaller company.
The funny thing is that you may not even need a job once you do something great on your own :)
If you are short of money, you can always get a simple part time job to survive.
1) It let me set the tone and path for much of the interview, since so many places do unstructured interviews.
2) It was a concrete demonstration of my skills.
3) Java and mobile are both hot technologies, and the game also made effective use of other common stuff like OpenGL, multithreading, networking, database storage (via SQLite), etc.
4) Spending a few minutes playing a game sets a relaxed tone for the entire interview, which makes things easier for everyone involved.
Structured interviews really are a brilliant strategy. Whenever I interview, I do my best to subtly direct the interview in a way that exposes my strengths and leads the conversation into areas that I am most comfortable with (and trust me, whipping out a concrete example full of technologies that you're absolutely comfortable with helps). A structured interview, to an extent, would allow the employer to retain better control of the interview (whether or not they realize it), which is probably to their benefit. For instance, I teach Java, C#/VB, and some other modern languages at a local college after work, do tons of C / embedded / network / etc. stuff at work, and do digital electronics as a hobby ... so if you let me push the interview in those technical directions, I'm at an advantage.
I am always surprised when I meet mobile developers who have never created a mobile app for themselves.
This is a good idea.
I have no idea what people would want, what I could make that people would want, or what open problems exist that I could work on. For every idea I have, someone else already has a better solution.
I keep trying and put it on GitHub, but it's mostly for me; no one looks at it.
The Julia people are complaining that their language is great but it is not being picked up because their standard libraries are missing a ton of functionality.
Start knocking out some standard library functionality.
Sure, you may not be interested in Julia. But there are plenty of other projects out there that are understaffed. IPython Notebook needs developers. Octave needs somebody to write a good front end. It goes on and on.
The advantage is that you don't have to be super creative (compare Julia's libraries to Python's, and start implementing something that isn't yet in Julia), nor predict the future (meaning, you can make a brand new X, but if the world goes to Y, your work will never get picked up). What you write will be used by (at least) thousands, and you will have to write production-quality code to get your pull requests accepted. I'd be impressed with anyone who did that, even if I had no interest in or need for the project they contributed to.
Whatever you made will probably be something pretty sweet because you chose something close to your heart, not something that was simply 'unsolved' or 'needed doing'.
Employers will love that you chose to challenge yourself to build new skills as a hobby, and because you chose something that you are passionate about, you'll probably also have a good amount of enthusiasm while you talk about it, which is equally important.
The previous advice has been good.
If you opened up more about your specific troubles I think more on HN might help diagnose.
Maybe put some stuff on github, who cares how awful it is or if no one will use it .... The following statement is not globally true but I think it is probably true in your case --- you won't be able to succeed until you fail some more.
If there were a "hoboon" GitHub we could look at when responding to your comments, the advice might be more specific.
Show us your code.
Hate yourself -- hmm, not good.
Everyone struggles and it's very hard. Looking in the mirror and figuring out what you can do better, not being satisfied with what you've achieved -- I'm all for that. It's how we grow. But you cannot allow your assessment of yourself on a single dimension to become the entire assessment of your worth. I'm certainly not saying we're all special flowers, and our profession is hugely important. But it isn't all we are, and those skills aren't static anyway.
Also: If you want feedback, you can do things to improve your chances of getting it. Be incredibly positive in your responses -- "thanks for the opportunity!" People are obviously reluctant to talk with someone they've rejected, so signalling that you won't make this hard for them can help. If you have guesses at the problem, or suspicions you'd like to rule out, suggest them. Statements like "I think I need to build up Z" or "I worry that I come across as Y" might give you useful yes / no answers, if they are set up properly.
Think also about informational interviews as a means for seeking out fit. If you're just having coffee with someone, about what they look for and how people succeed, they are much more likely to be brutally honest than they are after rejecting you. If I'm having coffee, I can politely tell you why your resume or body of work look off, because I think that's helping you.
Example: the interviewer asks me a question. I start brainstorming out loud because they coached me to talk through my ideas. Of course some of my ideas are not going to work. The interviewer will shoot the first wrong idea down immediately and peg me as an idiot for the rest of the interview, even if my second idea is the perfect solution.
We right away knew if they were persistent, skilled, problem solvers, and if they were meticulous.
I may not be a scientician, but I'm pretty sure there's something in the Big Book of Science about controlling for variables...
We were also rigorously testing to see if they could get acclimated to the environment. They would have to do this routinely since there were so many new lab techniques to be learned.
(I don't know who's downvoting you but I upvoted.)
I'm curious about the phone screens - how do you screen out the candidates that are grossly unqualified so that they don't progress to the interview stage?
In other words, how do you screen out candidates like those Joel describes here: http://www.joelonsoftware.com/articles/ThePhoneScreen.html
We virtually never selected out candidates based on phone screens. We had a work sample process that kicked in after phone screens, and that cost us almost nothing to run. So we never had an incentive to prevent a candidate from going through that process.
Unlike phone screens, the work sample results were strongly predictive. You could bomb phone screens, ace work samples, and end up coming in for in-person interview.
(We did "nudge" candidates we felt wouldn't do well on the work sample tests, solely out of concern for not wasting their time, but anyone who wanted to proceed was able to).
Phone screens are a waste of time.
Spolsky did a really good job of documenting the best practices within the framework of unstructured interviews as they were practiced 10 years ago. The problem isn't Spolsky's tactical suggestions; it's that the strategy they're part of is being discredited.
If my first contact with a company is them asking me to put in hours of my time, when it costs them "almost nothing to run" (your words) then I'd be inclined to pass.
The company I work for has a work sample exercise, and we intentionally place it after a phone screen and first interview, because we feel it is (and appears) more fair to the candidates.
From our point of view, we'd love to put it up front, as it is the best source of information we get, but if it caused us to lose good candidates before we even started (and we believe it would) that would be a show-stopper.
We make it really clear that there will be work samples before people even apply. I think the real question would be "how many qualified applicants don't bother applying because of the work sample," which is a question we can't answer. Given the paucity of unemployed qualified infosec folks, we're comfortable with the tradeoff.
Thomas describes the Matasano process as costing “almost nothing,” but that includes running a couple exploit training websites and sending an 852-page book to applicants. Which I’m not inclined to go through immediately, because I need money right now.
Do I have to do the microcorruption.com and the cryptopals.com and The Web Application Hackers Handbook before even trying the technical screens and challenges? Or should I try seeing if my existing web application security best practices and rusty MIPS assembly experience are enough to get to where I have enough breathing room to do these exercises?
Undefined. Personally, I applied twice, ended up coming to work here the second time through.
> Do I have to do the microcorruption.com and the cryptopals.com and The Web Application Hackers Handbook before even trying the technical screens and challenges?
Nope. But they might help, and you may want to anyhow; they're fun :)
We phone screen anyone who has a resume that's vaguely relevant. If you're applying for a software engineering role and have never done any development, then we'll reject straight away, but that almost never happens.
We simply don't have a high enough profile to receive a flood of applications that we need to filter before we phone screen.
And for a software engineering role, evaluating the work test takes as much time as the phone screen (sometimes more). We're not simply checking whether you can produce the right output for FizzBuzz (checking that could be automated); we're looking at your choice of algorithm, design trade-offs, unit testing approach, etc. -- things that require a human to be involved.
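As an aside, the "could be automated" part really is trivial. A minimal sketch of such an output checker (function names are illustrative, not any company's actual harness):

```python
def fizzbuzz(n):
    """Reference FizzBuzz output for 1..n."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

def check(candidate_output, n=100):
    """Automated correctness check: compare against the reference."""
    return candidate_output == fizzbuzz(n)
```

This is exactly why output-checking alone tells you so little: any submission that matches the reference passes, whether it's clean, tested code or a hard-coded string.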
Different companies and different roles will have different time investments.
A CTF style task can be evaluated more simply (at least on first pass) - did you capture the flag.
And if you are looking for a few amazing candidates from amongst a mass of poor ones, then you can afford to optimise your process so that you test every candidate at as low a cost as possible, even if that causes some candidates to self-select out.
Of course I think everybody would agree that if such a person progressed to the interview stage, there was a failure somewhere.
Google interview questions are leaked so often that standardization is virtually impossible...
There's a simple method of doing this that doesn't require hard AGI to generate the questions, either: just standardize on a list of the microskills the job requires. Then, get the interviewer to create a small question to test each microskill. Keep previous interviewers' questions together with the microskill in the list as examples for the next interviewer.
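A hypothetical sketch of that structure, with past interviewers' questions kept alongside each microskill as examples for the next interviewer (all names here are made up):

```python
from dataclasses import dataclass, field

@dataclass
class Microskill:
    """One skill the job requires, plus a bank of past questions for it."""
    name: str
    past_questions: list = field(default_factory=list)

def add_question(skills, skill_name, question):
    """Record a new interviewer's question under its microskill,
    creating the microskill entry if it doesn't exist yet."""
    for s in skills:
        if s.name == skill_name:
            s.past_questions.append(question)
            return
    skills.append(Microskill(skill_name, [question]))
```

The point of keeping the question bank per skill is that each new interviewer writes a fresh question (so leaks matter less) while staying calibrated to the same standardized skill list.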
Your question isn't very good if you are selecting for people that know your question vs are smart/skilled.
Doesn't this just make the system easier to game? It's like SAT prep. SAT scores correlate with the amount of time/money preparation for it, not your actual knowledge.
I disagree that a whiteboard session can be considered a work-sample test. What's wrong with "complete steps 1, 2 & 3 and I'll be back in 30 minutes"?
For engineering we actually found that giving a 2-3 hour, single-problem test has been a better predictor, plus 3-4 30-minute face-to-face sessions (usually 2 on Skype prior to the onsite test, then 2 follow-ups). Alone time gives the candidate room to digest and solve the problem on their own, which is more "real world" and shows what they can do independently. Some questions are intentionally vague to test the ability to make decisions with little information. There is no right answer, but we will ask why you made the decisions you did. This frees up our staff's time tremendously and weeds out those who may need a lot of hand-holding early on.
* Eliminate live audiences
* Let people work in the environment they'll be able to choose on the job
* Ideally, let people work in their own comfortable environs, even if they won't be able to do that on the job
* If you're worried about cheating, build that assessment into your in-person followup interview
Originally he'd simply asked if I could be there whilst he did the test so that he could talk aloud and bounce ideas off of me as it was timed and he gets nervous in tests and was afraid he'd not think clearly.
He froze. Totally.
I took over and did the test for him, and he aced it and was offered the job at the top salary band.
He is a perfectly good engineer and the company were very satisfied with their hire, but he never did that test. The test in its entirety was completed by me.
I cannot imagine this is such a rare thing with remote technical tests.
(Matasano is also not a company where it's easy to duck attention and coast; the tempo is 2-3 week engagements that wrap up with metrics that everyone cares deeply about).
The conclusion I draw is that cheating just isn't as big an issue as people think it is.
I still agree it's the best approach.
That actually sounds awesome. You could reduce your interview evaluation overhead by nearly 50%!
Anecdotally, I was prompted "Zip code required" on a multinational company's talent portal. I tried to fix and submit for at least 60 seconds before I finally had to modify the CSS to show the input so I could put in a value, since the zip input was set to display:none. I figured at least there would be less competition for the role.
You didn't get a reply in the end, right?
Even more irritating is that recruiters don't appear to know the difference between C#, C, C++ and Obj-C.
i.e. You are using interviews in order to avoid paid in-house training.
The latter is a candidate I try to avoid like the plague.
Do you let your interviewees reach out (possibly via IM) to your interviewers with specific questions?
Edit: also, for clarification, we discuss the problem ahead of time to make sure they understand it and can ask any questions. For my group it's always a basic full-stack problem: here's a mockup and a basic install of Visual Studio and MSSQL; show me you can CRUD some data and present a decent UI. Then we'll talk about how you did it.
Disclosure: I worked there for four years and interviewed a lot of people.
I decided to take another offer. I'm sure google's offer would have been great, but the timeline was taking too long, and I wanted to start at the other place, so I didn't wait for it (2 weeks at that point).
The bottom line is that, unless you request everything to be expedited from the beginning, it can take a few months to finish the entire process with Google. Even requesting a schedule change to be made would take about half a week for me. At least that was my experience -- I can't speak for others.
That is not too bad for such a large company really. Personally I have always worked for small software companies where the hiring process frequently consists of one interview, a bit of a programming test and an offer all concluded in 72 hours (frequently less). Compared to that a few months seems slow but I once had a friend get hired as a trader at a huge international financial organization and that took a glacial 6 months.
What's more, there was the expectation that you would more or less just walk out on whatever job you happened to have accepted in the interim, get on a plane, and fly to wherever they wanted you as soon as you finally received an offer. Looking back on it, I suspect that agreeing to that was itself part of their selection criteria.
Lean heavily on brand name and the preconceived prestige of a google position, give the same stock interviews that most companies give that are heavily tilted towards false-negatives rather than false-positives, and then pay well.
Speaking as somebody whose SO works at Google, I think it's fair to say that Google does not hire the best people, but hires a lot of people who think they are. CLs being delayed by a week or longer because of nitpicks regarding comment capitalization/grammar/wording, "why didn't you follow the exact idiom of this totally unrelated library/code component that I wrote", or just pure spite is commonplace.
Also, memes have almost completely subverted the English language at Google. Really, the window into Google that I have is hardly showing me "Best of the Best" material.
I'd like to think I'm a pretty good developer and have made a positive impact everywhere I've worked. Yet I'd never even consider applying to Google, because I know exactly how that process would go down.
1) Most likely result: I never hear back at all, because I didn't attend Stanford/MIT. (Never mind that I got accepted but didn't attend due to better financial aid elsewhere.)
2) If they do decide to interview me, I won't hear about it for months.
3) I'll have to drill on data structures and efficiency of sorting algorithms I've literally never implemented outside of an interview or academic setting.
4) Even having drilled on those arbitrary questions, I'll likely fail the interview because after a multi-day gauntlet of questions a random engineer didn't like the particular assumptions I put into my Fermi model of golf balls.
5) Of course, if they ever bother to let me know I failed, it won't be for months.
(Note: this is all based on actual Google hiring experiences.)
1) I went to community college, dropped out, then finished my degree at a state school 10 years later. Google didn't care.
2) I have no idea how long it took them to decide to interview me. Once they decided I got a call. People I've referred recently have received _very_ prompt responses, though years ago sometimes people would sit "in the system" for a while.
3) It is good to brush up on algorithms, like for most other SWE interviews. You do end up using this stuff regularly, though. Even if, say, you're "only" doing UIs, it's how you avoid accidental n^2 algorithms in the middle of your UI code that piss off your users. I don't get the attitude that algorithms are just academic. They matter. Besides, "use a hash table" is often the correct approach in interviews and non-academic code ;)
4) I have no idea what a Fermi model of golf balls is, but I've never asked or been asked such a thing. And "fail" is such a black-and-white term. Candidates are scored, and those with marginal scores the first time often get another chance later on.
5) Not my experience in general. Sometimes the interviewers lag on getting in their feedback for non-hires, which is really unfortunate, but the recruiters are pretty on top of things.
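To illustrate the accidental-n^2 point above with a sketch (illustrative code, not anything from a real codebase): deduplicating a stream of items with a list lookup inside the loop is quadratic, while the same logic with a set is linear.

```python
def dedupe_quadratic(items):
    """Keep first occurrence of each item; list membership test
    scans `seen` every iteration, so the whole pass is O(n^2)."""
    seen = []
    out = []
    for x in items:
        if x not in seen:   # O(len(seen)) scan each time
            seen.append(x)
            out.append(x)
    return out

def dedupe_linear(items):
    """Same behavior, but set membership is an O(1) average
    hash lookup, so the whole pass is O(n)."""
    seen = set()
    out = []
    for x in items:
        if x not in seen:   # O(1) average hash lookup
            seen.add(x)
            out.append(x)
    return out
```

Both produce identical results; the difference only shows at scale. On a 10,000-item UI refresh, the first version does tens of millions of comparisons where the second does ten thousand hash lookups, which is the difference between a visible stall and an instant update.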
You say your post is based on actual hiring experiences, but I bet that the negative experiences get much more attention than the positive ones. People don't generally write blog posts about how wonderful their interview was.
2) I agree that algorithms are important, to the extent that you know the importance of "use a hash table." I don't think being able to implement quicksort on a white board is important.
They want to see your thought process as you tackle an unfamiliar problem. Do you think about the input and the end-goals explicitly? Do you consider multiple different ways of solving it, and cover pros/cons? Do you consider where issues may occur, and attempt to mitigate them? That's what they want to see.
I guess I could go dig up my old data structures and algorithms book that I haven't touched in 15 years to refresh my memory, but I don't want to work at Google badly enough to bother with that.
Presumably you were being polite and trying to develop a rapport.
That means you didn't get along because the interviewer brought their own baggage or narrow mindedness into the interview.
This is very common and shows the interview process was incompetent.
That may sound like a strong word, but if you were qualified and they didn't hire you because of what this guy said, then they are, by definition, incompetent.
Maybe arbitrary is a nicer word to use, but it's the same thing.
More importantly, conditions like "unless you were being an ass to that interviewer" make it an entirely sensible comment, in my view. I remember strongly disliking how fellow interviewers had zero empathy. They generally have more power in that situation, and therefore more responsibility for reasonable outcomes.
I also agree with the commenter that interviewers can be very arbitrary, and that is not good.
I only took issue with "That means you didn't get along because the interviewer brought their own baggage or narrow mindedness into the interview. This is very common and shows the interview process was incompetent."
The commenter doesn't know what went on in that room, and doesn't know that this is what happened in this case.
Most of the hiring snafus at Google are because they use a large population of temp recruiters, whose contracts may not be renewed before the candidates they're sponsoring get offers. When that happens, the candidates are often left in limbo.
When there's a recession, all the temps are laid off, and you only get the permanent recruiters who now fear for their jobs. They are extra incentivized to a.) make sure the candidate gets through the process, so that they have a reason for their job to exist and b.) make sure the candidate has a good experience, so they get a commendation. And c.) they're all experienced, so you don't get the kind of gratuitous screw-ups you might with temps.
The hiring bar is higher during down cycles, but that can also work in your favor as well; your resume doesn't just show "Worked at Google", it shows "Started at Google when nobody was hiring", and you occasionally get comments like "So how long you been at Google? Oh, that means you started in...January 2009, wow, you must be smart to have gotten through the hiring process then."
The next down cycle: You'll know it when you see it
The press never writes articles on startups, except to report their demise.
Everybody is seemingly in hiring freeze at once. (But don't let that stop you! Many times, companies that are reported to be "in hiring freeze" are actually still hiring for the right candidate.)
VC firms start telling their portfolio companies to conserve cash, shutter lines of business, and lay off people.
Plans for new corporate headquarters are shelved indefinitely.
You start hearing "Well, at least I still have a job" from friends.
You start hearing "Fuck, I just got laid off. Can I crash at your place for a month while I find a cheaper apartment?" from friends.
Searching for new jobs starts to seem dauntingly frightening - what if they go under? (In the context of this thread, this is ridiculous - Google is not going under.)
It's worth considering the possibility that while you may be a great developer, you're not as good as you think you are with respect to the caliber of people that work at Google.
(Note -- I don't work at Google and didn't do spectacularly well on standardized tests; but after working with many algorithms whizzes over the years I've learned that I'm not nearly as good a programmer as I once thought).
(Also, please recall that literally the only time I personally applied to Google was my freshman year of college. This isn't a case of personal sour grapes.)
By the way, I also don't think their hiring process is fundamentally broken. Just pointing out that this is the reputation it's acquired.
Since you brought up the SAT, it's an absolutely perfect and effective system with zero flaws—which I did spectacularly well on.
I never heard this before. I thought that SATs and other standardized tests heavily correlate with background / race. Which to me, means it's not a good indicator of intelligence, but rather education.
It's a small anecdote, but my high school got a grant to do a pilot program to incorporate SAT test prep into the school program back in the 90s. IIRC, the average score went up 100 points vs. the PSAT. With the old version of the test, I went from the 1200s (80th percentile) to the 1400s (95th). Writing was an optional test then, and the test prep didn't cover it, but I was already familiar with the writing process from AP courses.
30 points IMO would represent prep focused on test strategy exclusively. For example, with tests like the SAT, answering questions wrong comes with a higher penalty than not answering.
If you drill on vocabulary, tune your writing to line up with the scoring methodology and are familiar with the structure of math problems asked, you're golden. But knowing those things doesn't grant you greater general intelligence.
As for your anecdotal evidence, it sounds pretty flawed to compare results on the PSAT to the SAT directly—I also got a much higher score on the SAT without doing any studying at all, probably because they're scored and weighted entirely differently. Moreover, if this comparison was done over a year (e.g. sophomore to junior year), the results are likely even more flawed—there's too much confounding development in that year to attribute the increase to SAT prep.
Anecdotally, I know I did much better than all of my friends who spent months studying for the SAT and drilling on vocabulary, math, and strategy. If the SAT is so game-able, they should have outscored me.
I was ecstatic and told my family and was planning on how I would tell my current employer.
The next day the recruiter called me and said they were rescinding it and wouldn't be offering me the job. He said he was not given a reason why.
Then, over the next ~4 years, they called me no less than 4 times for the same damn position. I had to remind the recruiter about the first interview experience, after which they said OK, then they didn't want to continue that process.
After the 4th time, I angrily said to the recruiter "You have called me for this position 4 times! Either give me the damn position or delete my damn phone number from your system!!"
They haven't called me since.
They tried to wear me down, to the point of contacting me multiple times a week, via every means of contact possible, for weeks on end, until I threatened to take them to court because they were harassing me after I asked them to stop contacting me.
Every single one of those contacts, by the way, was by a used-car-salesman type of "recruiter" who would blow smoke and not answer questions, etc.
There was no way I was going to ever work for google because of their lack of ethics... but this didn't matter to them. And their persistence despite being told that only confirmed the view.
Edit: Sorry my experience doesn't fit your desired view of Google. But if pointing out facts that don't fit someone's ideology gets me constantly downvoted here, what is the purpose of participating in this site? Is this only for circle jerks?
Edit: correspondence with the recruiter was no less timely than it was with other companies, and the only major difference I can think of is the host-matching process we went through.
By the way, since I'm getting the downvotes already anyways, I might as well say that you sound a little salty. I'm sure no one bases hiring decisions entirely on what university you went to.
Furthermore, all of the news articles we read love emphasizing the weird questions you might get once in a blue moon, the quirky perks you get for working there, etc. but in reality I doubt the interviews and work experience are much different from any other company (I asked my interviewers about this as well). Just some food for thought before you go accusing them of having "developed an overwhelming reputation" - a lot of this is the result of media hype, not their practices per se. Even though I agree that being employed at Google is probably overhyped, there's no need to be so antagonistic to the company for it.
My point is that even if they have improved (maybe they have!), their past reputation makes me reluctant to even bother seeing if they have.
> I'm sure no one bases hiring decisions entirely on what university you went to.
Former Google managers have explicitly told me that this filter is used for some positions at the resume screening level.
Sorry if I sounded salty. I have a great job and probably wouldn't work at Google even if they offered—just trying to frame the common perspective.
Questions were reasonable and the interviewer gave nudges / prompts when it was obvious you were stuck on something.
They really seem to take recruiting very seriously and it showed.
Sadly not AI detecting hats and then finding colours based on lighting of faces and rest of scene.
> 2) If they do decide to interview me, I won't hear about it for months.
My gut feeling is that with the rise of Facebook, Google has really changed its hiring practices. Now they don't want to lose good talent to FB; and dicking around for months is a sure-fire way to lose that talent, because FB sure doesn't.
>> 1) Most likely result: I never hear back at all, because I didn't attend Stanford/MIT. (Never mind that I got accepted but didn't attend due to better financial aid elsewhere.)
I didn't go to college at all, and I've been invited to interview there by engineering managers. If I didn't have absolute golden handcuffs at my current job, I might have considered it.
Furthermore, I know plenty of people who haven't gone to any college and work at Google. If you don't hear back, it has nothing to do with the college you went to. Even if a single interviewer had this bias against you, the larger hiring committee would review each of the interviews you had with different people.
To name a couple of examples - David Byttow, founder of Secret, was hired as a Software Engineer at Google without any degree. Michal Zalewski is in an engineering director role without a degree. Once you reach the interview, the sole reason for a hire/no-hire decision is how you did in the interview process. Literally nothing on your résumé will disqualify you at that point.
>> 2) If they do decide to interview me, I won't hear about it for months.
You won't hear about the interview date for months? I admit this can be frustrating for large companies, but no one I've spoken to told me it literally lasted months. The longest I've heard of was a month and a half, from first interview to hearing news.
How are you submitting your résumé? Are you just applying to the job and hoping for a recruiter to pick it up, or actually emailing a hiring manager at Google? If you apply without trying to get the attention of a decision maker it always takes longer because that's how the process works at large companies.
>> 3) I'll have to drill on data structures and efficiency of sorting algorithms I've literally never implemented outside of an interview or academic setting.
While algorithmic questions are asked, they do not make up the entirety of the technical interviews at Google. Generally speaking from what I have seen, questions are not "implement a linked list", questions are "implement this program" and you are free to choose which algorithm is best. However, I acknowledge that interviewers might not always follow this.
>> 4) Even having drilled on those arbitrary questions, I'll likely fail the interview because after a multi-day gauntlet of questions a random engineer didn't like the particular assumptions I put into my Fermi model of golf balls.
Fermi questions are actively discouraged by the Hiring Committee, and questions involving these will be thrown out when evaluating an interviewer's analysis of a candidate. As someone elsewhere in the thread mentioned, this would involve a strongly worded email from the Hiring Committee afterwards.
The first thing to understand about Google's hiring process is that you are interviewed onsite using five or so 1:1 sessions. The interviewer has to perfectly transcribe everything that happens and needs to be able to explain their decision to the Hiring Committee. The Hiring Committee will not consider answers from an interviewer like, "He couldn't tell me how many stoplights are in Los Angeles" or answers like "He didn't give a good enough answer about why he wants to work here."
I realize this seems like "my anecdata is better than your anecdata", but what you are saying is actively negative and contrary to a lot of the literature about Google's hiring practices. I'm not saying their hiring practices are perfect - far from it. But you seem to be ascribing malice to their methodology when that's really not the case.
The author of "Cracking the Coding Interview", who worked at Google, has publicly said that, at least in her hiring committees, your education and particularly the prestige of the school you went to was very important.
In all of the hiring committees I've participated in I can't remember a case where the candidate's school was a significant factor in the hiring decision. Even for new grads.
It's possible that it's given more weight in the pre-screening process, but once you get to interviews it just doesn't matter much at all. At least that's my experience as both an interviewer and a hiring committee member.
There are all sorts of reasons why they wouldn't take me, but there's no way that's one.
Want to work at El Goog? Cultivate as much weirdness as you can without becoming a criminal or jeopardizing your competence.
It seems to me that those types of questions are very different in terms of what they're testing for. Fermi problems in particular show how you might go about approaching a problem, and there may be many different correct answers or ways to approach it, and the goal isn't to get to the "right answer".
Brainteasers, in contrast, either test whether you can recall or figure out an extremely specific problem, or whether you're good at solving certain types of logic puzzles under pressure.
It may well be that both of these types of questions don't give useful information about a candidate, but they are vastly different overall.
Google banned them from its process ten years ago and most large companies did as well.
Disclaimer: recent Google interviewee.
Statistically, across multiple interviewers, Google does not allow these questions.
Condition 1: If you are actually as brilliant as Donald Knuth, and independently derive Floyd's cycle-detection algorithm, then obviously you must have cheated, because only Donald Knuth could have come up with that sort of thing in 20 minutes.
Condition 2: If, on the other hand, you aren't brilliant like Donald Knuth, you'll probably come with the naive solution using a visited data structure of some sort, in which case you're stupid because you can't come up with the optimal algorithm.
In either case, you bombed with that interviewer.
Condition 3: Cheat. Do the naive algorithm first, then have an "ah-ha" moment that magically gives you the optimal algorithm, because you actually knew it beforehand. I suspect, but can't prove, that some hires get in this way. During my time as a PhD researcher studying deception under a related NSA grant, I routinely found that a) people are horrible at lie detection, and b) people greatly overestimate their ability to detect lies.* The perfect way to game the system!
Condition 4: Inform the interviewer that you're aware of the cycle detection algorithm, and get another brain teaser that reduces you to Condition 1 or Condition 2 (and if less than ethical, Condition 3). Oops.
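For context, the two solutions being contrasted in Conditions 1 and 2 might look like the following sketch (Python, with a hypothetical minimal `Node` class standing in for whatever list representation the interviewer provides):

```python
class Node:
    """Minimal singly linked list node (illustrative only)."""
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle_naive(head):
    # Condition 2: remember every node visited in a set -- O(n) extra space.
    seen = set()
    node = head
    while node is not None:
        if node in seen:
            return True
        seen.add(node)
        node = node.next
    return False

def has_cycle_floyd(head):
    # Condition 1: Floyd's tortoise and hare -- O(1) extra space.
    # If a cycle exists, the fast pointer eventually laps the slow one.
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The naive version is what most candidates produce on the spot; the constant-space version is the one that looks suspiciously rehearsed, which is the whole point of Conditions 3 and 4.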
Ideally, you want interview questions where the candidate can start at Condition 2 and, without deus ex machina, eventually get to Condition 1, perhaps with the interviewer giving some hints along the way. Better still is to start with a problem that has a reasonable initial solution, and then slowly modify the problem specification for increasing complexity ("Now pretend this is an arbitrary graph instead of a tree, what would you have to change?").
Finally, Google maintains a list of banned questions that includes such brain teasers (technically, they maintain an entire question pool), but unfortunately interviewers don't seem to check it frequently enough, so brain teasers persist (even in 2015).
* If you're fascinated by lie detection, start with scholarly publications from Aldert Vrij, and work from there.
I am not debating the utility of Fermi questions, I can see how they might be useful and/or might be harmful during the interview process. My statements have been simply that my experience differed from what others have been saying, in that I definitely had that type of question during an interview with Google, so clearly they cannot be "against the rules" or anything like that.
That said, these types of questions are a bit like any standardized test (such as the SAT/ACT, etc.), which may or may not be strong indicators of cognitive ability/problem solving depending on who you ask. I think there is enough controversy over standardized testing to be able to at least say that solely relying on such methods, especially in a high-stress situation or even due to cultural differences, might come with some drawbacks and not be an accurate indicator for all candidates.
Lest I forget, Google is a business, and if such tools are what help Google find the candidates it wants, then so be it. It might also be an indicator to candidates about what kind of organization Google is. As a business, the organization will usually prioritize its desires/needs/benefits over those of the candidate - it's not a charity, and I get that. All I am saying is, it may just be that they are excluding certain diversity or individuals unnecessarily without realizing it. Perhaps that is the motivation behind the reported change in attitude towards such types of questions, I'm not sure.
It may or may not be that coding skills were relevant to the specific position. That said, in my experience, product management in software companies in particular is not so stovepiped that you need no coding experience at all. In fact, I think that some of the best product managers in such companies have coding experience, business experience, hardware/software/etc., and/or cross-disciplinary skill-sets. Perhaps such strong candidates don't fit the standard model, I'm not sure.
Brain teasers are not frowned upon as much for non-engineering interviews, but they are absolutely banned for engineers (source: former Google employee with 300+ interviews during my time there, and a former hiring committee member).
In my case, I was clear that I was unable to relocate, which left only a specific position available as a possibility which was nearby. The reason I thought that specific position had technical responsibilities was by the description of said responsibilities in the job description as posted. Also, I was told that I was contacted by Google in large part because of my technical background, but during the interview, that background was not discussed or explored.
Sorry I couldn't be of more help!
After the hiring committee has decided whether or not to hire you, your specific background will be matched with specific openings around Google. Naturally, some products require less technical expertise than others.
What I think ends up happening with those Fermi problems (or other open-ended problems) is that the interviewer knows one specific answer and ends up rejecting alternative answers, especially given the limited time in an interview and a candidate under pressure.
You'd have to capture the answer and study it later, possibly multiple times, to overcome any "Hm, this doesn't sound right, why didn't he think of x, y and z" feeling and understand that x, y and z are just your bias, not some kind of "correct" part of every answer. I've never seen that happen.
I really agree with it, and I hope more people are looking at it seriously instead of fixating on the fact that the word "Google" is in the title. Giving all the candidates the same questions and the same exact interview methodology is much more fair and empirical than simply having an interviewer wing it (which is virtually certain to bring in bias). Most interviewers I know think they are better than the average interviewer due to the illusory superiority cognitive bias. However, when it comes down to it, you cannot easily judge the difference between candidates if you ask one a completely different question than another. This goes against all the principles of psychometric testing, yet it is still ubiquitous because no one has bothered to empirically look at whether or not they're really interviewing in a rigorous way.
There is a serious issue in the industry right now where otherwise capable people fail interviews due to their appearance, manners of speaking or other harmless idiosyncrasies. It's because interviewers are very personally attached to their subjective methods, and they tend to really enjoy having personal ownership over the interviewing process instead of surrendering control to a standardized script. This trend looks like the software hiring equivalent of a professor grading papers without reading the name attached to the paper: if we can have several candidates answer the same exact questions and perform the same exact activities in an interview, it becomes much easier to determine who the real "best candidate" is when it comes time to compare their results.
If this really takes off, the only remaining problem as I see it is designing interviews that accurately correlate to the job activities.
Regarding the prediction of 'employee performance numbers' from 'interview scores': it is not surprising that the result is completely random. As far as I understand it, at big companies performance numbers don't show the actual contribution of an employee. At best they show "how this person was able to use the resources around them to achieve goals", mostly via the human resources of informal social networks. But usually performance numbers are just random, and at worst they can be negatively correlated with actual contributions.
Is it only me that sees this as an unsustainable goal that will likely lead to idealist driven results?
When interviewing, I tend to be impressed with folks who seem more well versed and smarter than me, but realistically I also feel there's a potential risk that those folks may not be sufficiently challenged in our organization.
Wouldn't the (unrelenting) drive to find people who are "better than you" lead to some of the problems in bias-matching and with first impressions dominating your perception?
Sounds a bit dogmatic and contrary to the entire focus of the article which espouses a more rigorous, measurable and evidence-based approach to hiring.
The reason it converges to some relative percentile (apparently just under the 90th) is because of the noise in the interview process. If people could reliably measure the quality of an applicant and consistently hired only people better than them, they'd only hire the very best (100th percentile) candidate, who probably wouldn't want to work there. But because peoples' judgments are off, they end up getting folks who are good, but probably not absolutely best - but they also avoid the bozos, because it's pretty unlikely that a bozo would appear better than half the people you know.
Also the assertion that "it's pretty unlikely that a bozo would appear better than half the people you know" sounds just full of assumptions that may not be valid.
Just thinking from a non-Google point of view, not everyone wants to work at new_untested_startup or boring_postIPO_company. It seems unrealistic to a) set hiring targets and b) keep raising the bar unless you have incredible cachet and a large pool of candidates to work with.
So what does the rest of the industry who don't have that luxury do?
I think Google is unconcerned with that option. They can get amazing candidates to do run-of-the-mill work by simply paying more and promising more than other companies. For them, keeping a great talent from joining a different company is potentially worth the increased compensation.
This is logically true, but it's hard to imagine how the human interviewers/hiring-committees would consciously consider that.
Reminds me of another adage:
>Never be the smartest person in the room.
Can't pretend I was happy with the process. Sort of got the sense that I was a replaceable commodity, which I'm sure I am. They didn't particularly care about me. I wasn't applying for a technical role, and I probably had the same qualifications as dozens of other candidates, so they really didn't care about what I thought about waiting for months at a time with radio silence.
I'm at a start-up company now, and very happy. I'm working on a Google-X style moonshot, and I know if I was at Google I would have no chance of working on one of their Google-X projects, because everyone at Google is trying to work on one of those.
I also get the sense, based on stories, that there is a lot of politics now in Google (as there must be in most big organizations), and so somebody with no political skills, like me, is better off in a start-up.
Getting a "reason" from a prospective employer isn't really helpful; it's usually not "oh if you knew what a skip list is then you would have gotten the job". It's more, "are you a better fit than the other candidates", and actually talking to them can help you gauge your strengths and weaknesses relative to them.
> After six weeks of this, 99 are rejected. They’re not told why. “If somebody just breaks up with you,” Bock says, “that’s not the time to hear: ‘And really, next time, send more flowers’… For the most part people actually aren’t excited to get that feedback, because they really wanted the job. They argue. They’re not in a place where they can learn.”
There are also different 'best' engineers for the role/company. A company may say they want the 'best' but they really want an engineer that is going to stick around and solve their boring problems in boring ways. Even the edgiest startups likely have mostly boring problems to solve.
I see it as sort of a moneyball situation. You're looking for value. You can spend a lot of money and time searching for that unicorn 10x engineer when you could have hired 2-3 3x engineers and overall spent a lot less time and money and got your product out the door months earlier.
Can anyone explain this? I don't understand the explanation that follows.
Like, from your post, I would say that Crew is an obscure reference for anyone living in the inner-city, not just minorities.
I'm a white male but I know that personally I had no clue what Crew was during my high school years. Anecdotally, I went to school in an area where the public school population majority was, well, what we consider the minority when discussing race relations.
Individually adding ~30 points might mean little, but when you look at large groups of people such small changes become important.
If you cannot eliminate any answers, then your average expected score gain will be the same as not answering.
If you can eliminate choices, then your average expected score gain will be proportional to how many answers you've eliminated.
If a group fails to realize this I would say the SAT was successful in measuring their cognitive abilities in this regard.
If there are 5 choices and you randomly guess, you have 4/5 chances to get -1/4 and 1/5 chance to get 1 point. That means if you had no idea which answer is correct, you will net 0 points over the long run, and this will be the same as leaving the question blank. 4/5 * -1/4 + 1/5 = 0.
If you are able to eliminate 1 choice, you will on average net 1/16 of a point per question by picking a random answer among the remaining four choices (1/4 * 1 - 3/4 * 1/4 = 1/16). If you are able to do that, then you deserve the extra points, because it took more knowledge for you to eliminate it.
Anyone that doesn't realize this deserves to have a lower score, and if you can't eliminate answers then you also deserve a lower score. Bringing in racial and gender biases into this is ridiculous.
Someone who has the ability to eliminate at least 1 choice understands the question better than someone who doesn't know what the question is asking at all, and is appropriately rewarded for it.
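To make the arithmetic above concrete, here's a minimal sketch (assuming the old five-choice SAT format with a 1/4-point penalty per wrong answer, as described in this thread; `expected_gain` is just an illustrative helper name):

```python
from fractions import Fraction

def expected_gain(choices_left, penalty=Fraction(1, 4)):
    """Expected score from guessing uniformly among choices_left remaining
    options: +1 for a correct answer, -penalty for a wrong one."""
    p_correct = Fraction(1, choices_left)
    return p_correct * 1 - (1 - p_correct) * penalty

print(expected_gain(5))  # blind guess: 0 -- same as leaving it blank
print(expected_gain(4))  # one choice eliminated: 1/16 per question
print(expected_gain(2))  # down to two choices: 3/8 per question
```

The penalty is calibrated so that blind guessing nets exactly zero; every choice you can eliminate tilts the expected value in your favor, which is the sense in which partial knowledge is "appropriately rewarded."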
So, it is an arbitrary choice.
You want a test to introduce MORE randomness just to please 1 group of people on a test that is supposed to measure your math ability including the ability to understand their simple guessing penalty?
You want everyone to think in the back of their mind that they got screwed by the SAT's random number generator or that some idiots hit the jackpot and get a much higher score than they should have?
In any case this specific rule happens to benefit white men more than other groups so clearly that's going to bother people. As to the math idea, they score the English section independently so having people do math as part of the English section seems counter intuitive.
PS: I am a white male that happened to crush the SAT, but I also accept the test was biased in my favor.
If you get rid of guessing completely, then there's no way to differentiate between people who have no clue what the question is and people who actually have a clue. Giving 3 points for a blank question rewards people who are clueless and punishes people that aren't.
If you really did crush the SAT I am rather confused how you fail to understand any of this.
PS: it just occurred to me that you seem to think your scoring scheme is actually mathematically fair. Someone who is able to eliminate 1 of 5 answers will average 1 point per question (4 points / 4 remaining choices). Someone who cannot eliminate any answers gets 3 points per question??? Where is the logic in that?
If you crushed the SAT they must have removed the probability questions these days lol.
"We find that when no penalty is assessed for a wrong answer, all test-takers answer every question. But, when there is a small penalty for wrong answers and the task is explicitly framed as an SAT, women answer significantly fewer questions than men."
Retaking the test: http://philvol.sanford.duke.edu/documents/SAN01-20.pdf
PS: Making an unbiased test is really hard; the SAT comes reasonably close, but it's not there.
Put another way, the discrimination is the fact, demonstrated by a disparity in prediction vs. outcome. The reasons why this happens are hypotheses.
Google interviews a lot of people in any given week. Not everything in this article applies to Engineering interviews; it's a broad overview. The way the interview process is designed needs to be interpreted in that context.
What I'm really excited about is patio11 and tptacek's Starfighter. I really want it to be the "Khan Academy" of interviewing -- best-in-class, various progressions, and really good suggestions on what puzzle to tackle next. Up until now, I've been directing people at the USA Computing Olympiad's training server, but its one-size-fits-all approach doesn't resonate well with people who don't have confidence in puzzle-solving (and give up on the first problem) or people who don't have the leisure of n years of high school/college to work through all the problems (e.g. women who are going through HackBright and similar accelerated learn-to-code programs).
Unfortunately, as developers, we're keyed in toward quantifying everything, even if it's to our own detriment.
I cut my google interview short when I discovered that their process was to interview me, then bin me, then pick who I'd be working with after I'd gone through a pretty arduous process. I told them that I was interviewing them as much as they were interviewing me, and I had no interest in 'getting the job', then being told after the fact who I'd be working with. The idea that I'd keep interviewing them without even knowing who I'd be working with just seemed absurd to me.
But if you already have been in industry a while, then you don't qualify for an internship and that's a bit broken.
All of these companies, including google, are following a silly, company-centric process. Putting junior engineers in there to make candidates jump thru hoops to get a job? Why are you even bringing in people you don't already know can do fizz buzz? Bring in people who couldn't have the resume they have without being decent programmers. Google is doing cattle calls? Seriously? That reflects badly on them.
You should have senior people review the resumes. They should be able to tell from the resume whether the candidate is a good fit or not. Seriously. I can. Bring them in, spend the interview time talking to them. Ask them about a project they are proud of or liked or was challenging and get them to explain something technical to you. That's all it takes.
Then spend a significant amount of time selling them on your company and why they should want to work there. They should be asking you as many questions as you're asking them!
I do like to ask a little brain teaser, but it's relatively quick. If you're making them write code, you've failed. I'm dead serious about this. I've hired a lot of people, never asked them to write code, then had them turn out to be great hires. Never hired someone who couldn't code.
I've seen people lie on resumes (actually got a resume from someone who claimed to be on a team I'd lead, but that he hadn't been on!) Should take very little time to figure out if they're lying on their resume or not.
Cultural fit is very important, but people apply that wrongly. They seem to think "I'm a nerdy white male who hates the new star wars trilogy, so they should too". Wrong. Cultural fit is about finding the guy who will show up to help you move without being asked simply because you mentioned you were moving and he's the kind of guy who jumps in and does shit like that. The kind of person who is brilliant but also able to communicate those brilliant ideas with others without it always being about drama. The kind of person who has enough backbone to improve the final product. The kind of woman who takes bugs and gets them cleared even though she could have reassigned them to someone more appropriate, simply because she knows that other person is overloaded.
You don't find that on a white board.
I had explained to the recruiter that I had spent the last several years programming almost exclusively in C on embedded systems. The call (~45 minutes) was spent doing a programming exercise involving string manipulation where the interviewer was essentially silent, while I dealt with the details of getting string manipulation in C correct (the actual problem was trivial from an algorithm standpoint, and could be banged out in Python or a similar language in ~10 minutes with access to a REPL to check the details). I wasn't about to try to remember the proper syntax for another language on the fly (especially since the interviewer didn't want me to use anything other than a shared text buffer during the interview), and didn't have the prototypes for various string-related functions memorized, so I'd bet I came across as incompetent in the interviewer's eyes.
Never mind the fact that they could have asked interesting questions about how I wrote a rather complex piece of an IP stack from scratch recently, and successfully deployed it to various customers.
I suppose if you have a big enough candidate pool and a reasonable compensation package, you'll find someone acceptable with this approach - but you'll spend a lot of interviewer-hours doing it. To be honest, I don't have anything against them, but I'm unlikely to accept another interview if their recruiters call again.
Just about everyone reading these articles will never work for google. I'll never work for google. Anyone qualified to work there doesn't need these write ups, and for everyone else, myself included, it won't matter.
"We found that brainteasers are a complete waste of time." "They don't predict anything. They serve primarily to make the interviewer feel smart."
They have a TON of applications, so they are more worried about filtering and discarding people than about reaching out to candidates, or even encouraging people to apply.
Their offer (prestige, great perks, great salary, etc...) is obvious to start with, so they don't bother going after you. You are the one who should prove yourself worthy and eager to work for them.
I guess that's similar to places like Ivy league Universities, etc...
That's ok. It obviously works for them. It's just that I think it's not the most common case for all companies.
Google does not have a lot of prestige, they have a long record of questionable at best ethics.
However, people who are just out of college are much less likely to be aware of this, and thus more likely to apply.
Which means google does reach out to higher skilled, higher experience people like me.
They pursued me more aggressively than any company has ever in my career.
By the way, if you feel you need to prove worthy and eager to work for a company, then your esteem of the company is out of place. You are likely going to end up taking a worse job or taking worse compensation because you aren't valuing yourself highly enough.
"All our technical hires, whether in engineering or product management, go through a work sample test of sorts, where they are asked to solve engineering problems during the interview."
is a work sample test.
All: please let's not argue about that ludicrous title, which I'm sure the author had nothing to do with. At first glance, this piece looks a lot more substantive than the usual posts about hiring. Let's discuss the strongest bits. (Come to think of it, that's the Principle of Charity applied to articles.)
Best is really dependent on many factors
(And it's pointless to hire "the best people" then set them on boring tasks, which seems to happen a lot at Google)