I get into the interview and the person on the other side seems to know very little of my background. He says he is a PM and starts with how much Google spends on storage for YouTube annually. Knowing it well, I walk through assumptions like the average YouTube video size, the number of formats based on screen resolution and video quality, etc., and give him the logic. He pauses and says: give me a dollar value. He doesn't want to understand the logic behind the calculations. Anyway, the next few questions are more of the same: code optimizations, etc. After 3 or so questions, we were done. No "do you have any questions for me?" No customer-related discussion. No talk of what I have done in the past and how I've been successful.
I feel like these kinds of interviews are not judging what the person brings to the table, but rather "do you know what I'm going to ask you?", and that's all that matters.
I always look for two things in any interview. First: are you smart and motivated? Nothing we do is rocket science, and if you are smart and motivated, you will succeed. Second: will I (and the rest of the team) get along with you? Teams need to work together, and people who lack interpersonal tact end up being very difficult to work with.
The interview was just a diversion.
He asked you "how much is Google's spend" and you finished your estimates without giving him a dollar value? Did you forget his question?
> He doesnt want to understand the logic behind the calculations.
From your description that sounds like a false assumption to me. It sounds like despite your estimations you didn't give him an answer to his actual question, and so he had to prompt you.
He didn't say he finished without ever giving an answer. He was simply explaining that he tried to walk through the calculation and the interviewer cut him off and asked for a number. It's absolutely stupid for the interviewer on such a question to be interested only in the number and not in the candidate's thought process for arriving at it. For a question like this, it's also entirely reasonable for the candidate to assume that the interviewer wanted his thought process, because from an objective standpoint, that's the only scenario in which this specific question isn't a complete waste of time.
>From your description that sounds like a false assumption to me. It sounds like despite your estimations you didn't give him an answer to his actual question, and so he had to prompt you.
This sounds like an assumption on your part. You might be right, but you could also easily be wrong.
It's a great interview question in the sense of a Fermi problem (I was once asked by a startup "how many window washers are there in [CITY]?"). The premises you choose and how you evaluate them, however, are so much more important in a Fermi problem than a correct answer. I remember being about 5x off but it was only because of a drastic underestimation on the time it takes to clean a window.
The point of a Fermi problem is to arrive at a reasonable estimate for a fairly unknown value by extrapolating from and connecting known values (by "known", I mean values with a narrower lower and upper bound).
I wouldn't say that a Fermi problem about how many window washers are in a city is a terrible question; it is more challenging than something in which you already have domain expertise, but that just makes you have to extrapolate further, which is the real point of the Fermi problem. In fact, pretty easy (and knowable) starting points are the population of the city, windows per individual, etc.
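For what it's worth, the window-washer version can be sketched in a few lines. Every constant below is a made-up guess (which is the whole point of the exercise); only the structure of the estimate matters:

```python
# Fermi estimate: window washers in a city. Every constant is a guess.
population = 2_000_000
windows_per_person = 10                # homes, offices, shops, pro-rated
fraction_professionally_cleaned = 0.2  # most residential windows aren't
cleans_per_window_per_year = 4
minutes_per_window = 10                # the premise the commenter above was ~5x off on
work_minutes_per_year = 8 * 60 * 250   # one full-time washer

demand_minutes = (population * windows_per_person
                  * fraction_professionally_cleaned
                  * cleans_per_window_per_year
                  * minutes_per_window)
washers = demand_minutes / work_minutes_per_year
print(f"~{washers:,.0f} window washers")  # → ~1,333 with these guesses
```

Swap any premise for a better one and the answer moves proportionally, which is exactly the discussion a good interviewer wants to have.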
Doesn't a Fermi problem without domain expertise display more critical thinking and reasoning style, while a Fermi problem with domain expertise is less of a Fermi problem, and more of a knowledge test?
A Fermi problem is both a test of knowledge and a test of reasoning skills. The types of estimates Enrico Fermi was known for were only possible because he had the domain knowledge for his reasoning to leverage. If you remove the domain knowledge from the problem you remove a significant amount of the signal from trying to concoct the estimate. It is a much easier problem if you can just make up numbers rather than infer accurate guesses from the domain.
You can also make estimates for compute, network, and storage costs based on the prices Google charged its Cloud customers for the same.
To say nothing of the economics of personnel costs, or the non-standard way Google builds its infrastructure.
Make a guess at the total cost of an hour of compute time and how long it might take to transcode the average video. Guess at how many videos are uploaded on a typical day. Guess at how much the typical SRE costs Google and how many SREs YouTube employs. Do the same for software engineers, or explicitly exclude R&D. Guess at networking, storage, etc. Then roll it all together as something like hours of video * (cost to transcode + cost to store + cost to upload + cost to play back * average viewers) + SRE cost + .... Bonus points if you can account for elasticity and peak load instead of just averages.
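That roll-up, as a sketch. Every constant here is a placeholder assumption, not a real Google figure; the shape of the model is the deliverable:

```python
# Back-of-envelope YouTube annual cost; every constant is a made-up placeholder.
hours_uploaded_per_day = 500 * 60 * 24   # assume ~500 hours uploaded per minute
transcode_cost_per_hour = 1.00           # $ of compute to transcode 1h (all formats)
storage_cost_per_hour_year = 0.10        # $ to store 1h of video for a year
delivery_cost_per_view_hour = 0.01       # $ of egress per viewer-hour
avg_view_hours_per_uploaded_hour = 100   # playback demand per uploaded hour
sre_count, swe_count = 300, 1000         # headcount guesses (or exclude R&D)
loaded_cost_per_head = 350_000           # $ fully-loaded annual cost per employee

annual_hours = hours_uploaded_per_day * 365
annual_cost = (
    annual_hours * (transcode_cost_per_hour + storage_cost_per_hour_year)
    + annual_hours * avg_view_hours_per_uploaded_hour * delivery_cost_per_view_hour
    + (sre_count + swe_count) * loaded_cost_per_head
)
print(f"~${annual_cost / 1e9:.1f}B per year")  # → ~$1.0B with these placeholders
```

The dollar figure is meaningless on its own; the value is that every term is visible and can be argued about, which is presumably what the candidate above was trying to do.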
The point is to show that you can think through the problem. If all you can say is "I don't know what your networking costs are", then you come across as useless.
The question is perfectly reasonable (and it sounds like the interviewee was providing a reasonable answer). The issue is the way the interviewer ran the interview, not with the particular question itself.
[EDITED to fix a typo.]
This behavior of yours is weird and obsessive. Maybe reevaluate what it is you are trying to accomplish here and in life.
If you simply try to elide all bias, then you would never be able to make a decision.
The objective of asking these leetcode style questions is to find candidates who are willing to put in the time to study. Success signals that this person is willing to commit to performing well at something that is reasonably challenging. The end result is that they are trying to hire worker bees. This makes sense as the bulk of work at any large company is largely mundane and relatively routine. I'm willing to bet that when a company wants to hire a two sigma candidate they don't go through all this nonsense, although at that point the candidate is already well known in the industry most likely.
In some ways, this is absolutely terrible for hiring. I worked in a part of Google where domain-specific knowledge was key, and it was next to impossible to hire people because nobody on our team could interview them "officially". So we'd pre-screen people with the knowledge we needed, and then pass them off to others to officially interview. They would almost invariably fail, because their non-domain-specific knowledge was not in the top 1% or whatever it takes to pass a generic interview. So we were left with the choice between hiring contractors, or training somebody internal who could leave at any time. We hired contractors.
EDIT: See the sibling comment regarding the homebrew author. This situation was exactly like that. Imagine wanting to hire somebody to make an internal package manager, and not being allowed to hire the author of one of the most popular package managers because his whiteboarding skills were lackluster, so he got a poor interview score from people that have no idea (or concern) what he does or what he's being hired to do.
When a company policy is so obviously broken in this way and there is no way to route around it, something is deeply wrong.
Here's how it should work. You explain the issue above to someone above you and they either have the authority to get round it or they pass it on to someone who does.
If a policy is failing in such a profound way and nobody can change it, this implies a level of organisational dysfunction that must be affecting multiple aspects of the company.
I don't think a way to route around it for an individual position or candidate exists.
It would be nice if interviews really were a “collaborative” experience to see “how you think” where finding the optimal answer “doesn’t matter”. But it 100% does.
The relevant interview was a "find the palindrome" question for which the desired answer was Rabin-Karp. The rejection was explicitly "your solution to the whiteboard algorithm question worked but wasn't optimal, so study your algorithms harder and try again in the future".
At the time, I thought it was pretty stupid that they tried to judge skill by requiring candidates to either have one random thing memorized or reinvent a publication-worthy algorithm in 30 minutes. But given the advice I got, it seems like the core motivation was "we want people who are willing to devote a bunch of free time solely to impressing us". And for that, asking impractical questions and demanding optimal answers works perfectly.
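For reference, assuming the usual longest-palindromic-substring phrasing of that question, the "worked but wasn't optimal" kind of answer might look like this expand-around-center sketch (O(n²) time, versus the Rabin-Karp-style approach the interviewer reportedly wanted):

```python
def longest_palindrome(s: str) -> str:
    """Longest palindromic substring by expanding around each center.
    O(n^2) time, O(1) extra space: correct and reasonable to produce
    on a whiteboard, just not the 'optimal' answer."""
    best = ""
    for i in range(len(s)):
        for lo, hi in ((i, i), (i, i + 1)):   # odd- and even-length centers
            while lo >= 0 and hi < len(s) and s[lo] == s[hi]:
                lo, hi = lo - 1, hi + 1
            if hi - lo - 1 > len(best):       # expansion overshoots by one each side
                best = s[lo + 1:hi]
    return best

print(longest_palindrome("babad"))  # → "bab"
```

Rejecting this as "not optimal" is exactly the complaint: the working solution demonstrates the reasoning, while the optimal one mostly demonstrates prior memorization.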
Everyone (regardless of background) has to sing for their supper, so to speak. Sure, having a good pedigree can make it easier to get scheduled for an interview, but that in no way means you're slated to have interviews with people who will handle you with kid gloves. On the contrary, the whole process is designed to stamp out any bias and cover a large amount of breadth. Interviewers have questions they've calibrated themselves on, and if they get picked for the loop they usually just ask those questions. This is true from the phone screen all the way through the on-site.
You may feel that established engineers cross-pollinating from other competitive companies don't have to jump through as many hoops but that's only because they've been through this rodeo enough to know how to prepare for it well. Many are hobbyist competitive programmers and people who build stuff on the side so the interview questions don't blindside them completely.
Do you really want to remove bias for experienced candidates with a track record of substantial and nontrivial contributions?
To put it bluntly, if you are such a candidate you are expected to know your algorithms cold and to be able to field domain-specific questions. So if you're a well-known compiler designer, you would still need to know how to wield algorithms/data structures for parts of the interview loop and then get into detail about compilers in the domain-specific parts of the loop.
That being said, it totally depends on the position you're interviewing for. If you're interviewing for Director/VP role in an engineering ladder, then some of the interview will need to be repurposed to getting signals about leadership and impact. That doesn't mean that you don't code at all though, it just means that the focus/weightage will be moved towards other dimensions (in addition to evaluating your system design and coding skills).
You're holding candidates to high standards, fine, that's great. Compiler designers even need some deep algorithms and datastructures knowledge; you can't ensure your hashes are thread-safe unless you know exactly what they are, how they handle collisions, and so on. All that's totally fair to ask as part of their domain knowledge.
But the criticism of this sort of interviewing (and Google in particular) overwhelmingly describes interviewers judging domain experts on whatever algorithms puzzle they give every candidate from every domain. Why do we care whether a compiler design expert remembers how to implement Ford–Fulkerson on a whiteboard? Is there any reason to think that's predictive of talent? Might it even be anti-predictive because it favors generalists?
Google (and FAANG, and everyone else in that league) hires great people. I don't doubt that. But it was also hiring great people back when it posed random brainteasers and emphasized GPA, and Google has publicly said those things turned out to be totally uninformative. If you filter down to a list with more highly-qualified candidates than you can hire, I think there's a tendency to come up with random extra hurdles to avoid resorting to a coinflip that might work just as well.
Not all current "experts" are necessarily hired into corresponding roles needing the same expertise. To put it more concretely with an example, it's entirely possible for you to be a payment system expert but have a recruiter reach out to you for a general SWE role (you're welcome to decline). The slate of roles that do require said expertise are limited. If there's 10 qualified candidates, they could all technically pass the hiring bar. But if there's only 7 positions that require that exact expertise, then what happens to the other 3? Well, they're broadcast to "matching" teams that have available headcount. This "match" could be approximate, but it underscores why getting a signal beyond just the domain expertise is important.
We could of course debate whether something "advanced" should be asked, but that would require us to agree on what constitutes "advanced". In general, it's difficult to pinpoint some threshold and unanimously agree that that's the level of difficulty of generalist SWE questions a domain expert should get asked. For what it's worth, there's additional intervention by committee(s) that look at this on a case-by-case basis to make sure that everything's on an even keel. Concretely, if you're an ML expert interviewing for a role that specifically demands ML expertise, and you were asked some advanced question about data structures (e.g., something needing the Hungarian algorithm) and you're upset that you bombed it, then don't be; the committee(s) would weigh that interview's score less if they expected something along the lines of maps/queues to be asked instead. These are all made-up examples, but I hope it conveys why there's a need to extract signal beyond just core expertise. Google doesn't just artificially create hurdles for sadistic pleasure ¯\_(ツ)_/¯
Most interviewers will ask the same question of multiple candidates, and will compare how the candidate is doing against other candidates on the same question. Most interviewers will provide hints.
I'm pretty sure that all candidates go through the same interview process.
There is really not much that is unique to Google's interviews. The only thing that is unique about the process is that your interview feedback is reviewed by a committee of peers.
Having just gone through 10 days of on-site interviews (including two at Google), there are some things that are peculiar to Google's process:
- No talk about software engineering,
- No talk about software design,
- No talk about project management, working on a team, culture of any kind, caring about customers etc etc...,
- No debugging,
- No using unfamiliar APIs, no reading documentation,
- No actually running any code.
In particular, Google was the only place I didn't actually test and run my code. It was the only place where I was given the option of doing all of my work on the whiteboard, too.
At least one time I had convinced my interviewer I had a working solution and then realised it didn't work and had to convince them of that...
I just interviewed at Facebook, and I was only given the option of doing all my work on a whiteboard. I hate that medium for writing code (or any dense text, really).
At Google I wasn't allowed to bring my laptop, but they let me "program" in a text editor with syntax highlighting and automatic indentation. Most of the interviewers hadn't seen that before though.
The others, ehh, practically every interview I've conducted has had some debugging when people encounter issues in the code they wrote. Sure you're not spelunking logs or stack traces, but reasoning about what causes an issue on a smaller scale is still useful signal.
I wouldn't even say that sounds unique, as I was at multiple companies 20+ years ago that did the same.
Can testify to this: interviewed via referral, was still asked some algo trivia, didn't do too well on that, but still got a decent offer, mainly because the referrer vouched for me based on their experience working with me.
It has also never been conclusively explained what “invert a binary tree” means (see the tweet a sibling comment linked to — that’s what the Homebrew guy claimed he wasn’t hired for not being able to do).
Well, no I didn't [write something worthy of Google]. I wrote a simple package manager. Anyone could write one. And in fact mine is pretty bad. It doesn't do dependency management properly. It doesn't handle edge-case behavior well. It isn't well tested. It's shit, frankly.
But he goes on:
On the other hand, my software was insanely successful. Why is that? Well the answer is not in the realm of computer science. I have always had a user-experience focus to my software. Homebrew cares about the user. When things go wrong with Homebrew it tries as hard as it can to tell you why, it searches GitHub for similar issues and points you to them. It cares about you.
1. https://www.quora.com/Whats-the-logic-behind-Google-rejectin... (Quora link, sorry)
We can apply the same logic to almost anything, because nothing is truly original and "difficult/impressive" are pretty broad metrics. Feel free to give us your definition.
Software engineering is about building functional systems for humans. Quite frankly, Google needs to hire more people who are able to deliver products that actually work. From the looks of it, hiring people who can solve "impressive" and "difficult" sorting-algo puzzles yields the sluggish Gmail mess we have at the moment.
Maybe Google needs to start hiring competent people instead of anyone who can recite the corporate gospel.
Just my 2c.
In my job I've dealt with dozens of Google engineers across a few teams... 95%+ of the time I don't get useful answers, and sometimes no answer at all. We have found many bugs in their systems, as well as broken API endpoints that did not work as advertised. We're dealing with one issue right now that no one at Google has been able to assist with in over 12 months. Of course, with all the turnover on some teams, the issues just keep getting passed to the next guy.
They are by far the worst of the FAANG-level engineering teams we deal with; competency is severely lacking.
It's a lot like syncing files - e.g. how Dropbox looked similarly deceptively easy to build but really wasn't.
IMHO the fact that the homebrew guy didn't get hired and that golang package management was a dumpster fire for years isn't coincidental. Both were part and parcel of a systemic bias that plagues Google's culture that they're blissfully unaware of.
But I don't think you can generalize to the whole company. For example, the Dart package manager is pretty nice.
Regardless, I don't think qualifications like this should bypass the interview. The author almost sounded a little entitled when describing the rejection. That said, I don't think his achievements got sufficient weight while evaluating him as a candidate.
See my comment regarding what these interviews are selecting for -- it might be a bad process, but it might come closer to the intended goals than other processes do.
Here's the thing - outside some special projects for which people are hired without going through the standard interview process, Google does not want those who can take shitty lemons and convert them into tasty lemonade. Google wants specific kinds of butts that fit into a specific kind of seat. It wants the most homogeneous, high-output monkeys that will do what they are told, such as spitting out code to invert a binary tree.
The traditional sit and have a chat interview is insanely biased towards people like the interviewer and doesn’t at all guarantee technical competence. The take home project feels like a big imposition to a lot of people, especially those that are currently employed, and raises concerns about cheating. The whiteboard interview except on a laptop has some advantages but has problems of comfort with the environment.
I think a whiteboard or laptop interview with a cleaned up realistic problem, not a thinly disguised implement Dijkstra’s, is the least bad option around.
There's way more to hiring decisions than raw engineering talent. Teamwork matters a lot too.
For what it's worth, all my Google interviewers were nice.
I haven't bothered with Google this time around - combine the risk of jerk interviewers with the glacial pace of their interview process and it's not really something I'm interested in doing. And that's not even considering any personal reservations regarding the Damore firing or the massive payout to Rubin.
As a commenter mentioned in a different thread, two-sigmas get acqui-hired. I remember reading about FB's acquisition of Instagram/WhatsApp; there was a requirement that they have some understanding of CS concepts.
If that were true then why not simulate those situations rather than riddles, google-able CS trivia, or whatever the interview flavor of the month is?
I'd actually argue that for many companies this post is true (i.e. that getting it "right" is less important than the journey) but I still won't forgive companies that design the most hostile interview questions possible, and are then surprised when interviewees complain.
You can read the types of questions Google asks here:
I'd never interview at Google or any other company that operates this way. If the very first interaction I'm going to have is off-topic trivia, we're done, the company failed MY interview. The questions Google asks are disrespectful and unprofessional.
But disrespectful and unprofessional interview questions ("why are manhole covers round?") have become the new normal in this field.
The entire experience lasted less than an hour - from the time I walked into the reception till I walked out. Best job interview ever!
This codebase was probably 0.0000001% as complex as Google's (I can only imagine). Still, there is no reason more companies can't follow this style, at least for one of the several "rounds" of interviews, instead of whiteboards, trivia, etc.
I've seen this accusation thrown around dozens of times. I'm convinced that it's not really an actual issue, and just another way for bad candidates to transfer blame elsewhere.
I had to ask one moron on a phone interview whether they were looking for free consulting, because they kept harping on a very specific project setup (Java/XML, etc.) that they were struggling with. It could of course have been solved in an hour or so by closely reading the product documentation instead of endless googling.
When you are the guy who has to meet payroll for your team, and some (insert appropriate NSFW description here) person/company deprives you of pay for the work they are taking advantage of to further their goals ... think how that makes you feel.
There are many adjectives. None of them are synonymous with hilarious.
This doesn't seem reasonable to me at all.
On the other hand, I can believe this happens with those "take-home assignments". Those tend to be a larger amount of work (a few hours to a few days).
BTW, at least in this case, I did get the offer as they promised, within 48 hours. So I don't think they were doing anything shady. My guess is that they simply got tired of normal style interviewing and found this to be an efficient way to hire.
I'll take that module that auto-completes code with Stack Overflow answers first, thank you very much.
At the end of the day, this is what sucked the optimism out of me at my previous company. We had lots of people trying to steal from us. Even after I closed it down, I had two people in particular ask me to help them design something, or give them detailed domain/design knowledge, for free.
Yes, it happens. Far too often. Makes people like me jaded.
One uni called me up to tell me they liked my bid, but they wanted me to teach another company how to do what we could do, so they could buy from the other company.
The second uni issued an RFP, required that they get to keep all the docs, including detailed design specs. After interviewing all the companies, mine included, they selected 2 finalists, us and someone else. They probed us hard for details. Wound up buying our design from the other guys.
The prop shop did a similar thing, though we won the RFP. But then the customer went silent while working on the PO. They then awarded it to our competitor, as long as they used our design.
The semiconductor firm had a problem they claimed they could solve internally (they couldn't), and wanted to see what we could do. We met all of their (aggressive) performance and capacity objectives. But they kept pumping us for "free work" with the promise of a very large contract later. Against my business partner's recommendation, I called them on it, indicating that we'd be happy to work on the project with them. But not for free. So they "fired" us, as in asked us to stop working on it. Now, 4 years later, they are still struggling with the task.
Sadly, this is far too common.
> they wanted me to teach another company how to do what we could do
Here one might triple (or so) their rate for a much shorter hourly design contract and leave implementation to someone else. The other two examples might have creative solutions as well.
I had a two-hour session with senior tech leads solely about solving a specific severe scalability issue in one of their database systems that they had not made much progress on. I showed them how I'd solve it using a technique unfamiliar to them. Next day, I get a "when do you want to start?" call -- they tried my proposed solution in a test environment and it had worked as advertised.
Two hours solving one real-world design problem in my area of expertise. Simple and on point, without any trace of culture fit, leet code, random whiteboarding puzzles, etc.
The sentence you're referring to was in the context of the FIELD which is why it said as much, and the specific question is a very well known stereotypical example:
> But disrespectful and unprofessional interview questions ("why are manhole covers round?") have become the new normal in this field.
If you browse my link you'll see that Google continues to ask off-topic questions, even if they aren't riddles or brainteasers.
Why don't they ask questions relating to the job? Why don't they do practice tasks that you'd be expected to do on the job? Those are the core issues.
I have read literally thousands of interviews at Google (both on hiring committees for 10+ years and in the group that reviews hiring committee decisions), and I just don't encounter them much.
I can't even remember the last time I read this type of question.
I suspect your definition may be different than mine.
I do see that new grads/folks without a ton of experience get asked more questions to test fundamentals (not riddles), and folks otherwise get asked to just solve problems.
I have also seen the occasional unscored warmup question that is more abstract/riddle-ish, for nervous people, but even that's pretty uncommon.
It's usually small talk about resume instead.
Interviewers are also deliberately pushed to ask different questions if the candidate is not doing well (though it's slow to change interviewing behavior on this). Better to have signal on more questions than to know only that they really did badly on a single thing.
That means they may change style/abstractness of question depending on how interviewee is doing.
I've seen principal engineers (where this was the top of their ladder) that literally couldn't tell me what a hash table is.
For domain experts, Google actually does target domain specific areas (in at least 1 interview) and weigh it against how good people are generally.
However, I suspect most people think they are specialized experts in things they are not.
The really great jobs I’ve had didn’t really require interviews at all, they knew who I was and knew what I’ve done and what I could do, and that was good enough.
When I hear of this sort of behaviour in interviews, I have to assume that the culture of the dept is such that x% of your job will entail fielding issues that you consider outside your 'job'. If 20% of the time you'll be talking about stuff that you consider tangential, and you don't react well to that, you may self-select out.
The rest are paywalled or something. So I'd implore you to be more specific about what kinds of questions you object to.
Since there is no incentive for a large number of engineers to cook up stories about their interview experience at Google, I am calling this smoke as having some fire behind it.
I think there is incentive for failed interviewees to cook up stories - or at least to tell something very one-sided - because none of us like to believe we failed at something and feel better if we persuade ourselves it was stacked against us.
An incentive to maximize the value of their job position and already acquired leetcoding skills?
And in fact there's no need for explicit animosity by the failed interviewee. Someone who misunderstands a question badly enough to call it a brainteaser when it isn't is also likely to have performed badly on it.
For my part, about every few months I hit something that would make a great interview question, and then have a wonderful afternoon coding it up... Ymmv.
This is an open internet forum where anyone can type anything they feel like. Of course, as far as you know, I might be an Uzbek 12-year-old, so that argument only goes so far...
I can imagine that Google interviews for other roles than Software Engineers can have any kind of wacky questions. But mostly these stories sound so much like the stories that used to go around about Microsoft interviewing, only with the company name switched, that I have to think it has to do with their inherent virality somehow.
Google's interview training is pretty specific about what is expected of an interviewer, and the interviewer has to write up fairly comprehensive reports about the questions that were asked, how the candidate answered them, what hints were given, etc. Most interviewers ask questions that they have asked multiple times before, so they will include things like, "The candidate (TC) took twice as long as expected to code a very basic warmup question that was designed to motivate a more in-depth distributed computing question; unfortunately, what a strong candidate could do in ten minutes, TC couldn't get working in C in 45 minutes." (Current best practice is to try to avoid revealing the gender of the candidate in the interview notes, hence the use of TC instead of he or she. The goal is to avoid triggering any unconscious bias on the part of the members of the hiring committee.)
Now, there are some screening questions that are asked by recruiters as part of an initial phone screen when they are hiring for SRE candidates. Those questions are multiple choice and are really basic questions designed to screen out old-school operators who are quite good at mounting tapes (for example) but don't have any understanding of what TCP might be, and who think an SRE job is no different from a traditional system administrator position. Those are not asked by an engineer, but by a recruiter; the goal is to avoid wasting everybody's time.
Most interview systems don't have a systematic, ongoing way for accounting for this especially given that interviewers are self reporting.
In my opinion, most interviewers are undertrained/underskilled at asking specific questions and at interviewing in general. They don't invest the resources to systematically improve but rather treat interviewing as a burden to be minimized.
There is still a lot of play in systems that allow for borderline smart-ass behavior. Do you think an interviewer will write up that they sneered at the candidate?
The interview process is a factory. The company is mostly concerned with reaching outputs. The candidate is just a happenstance casualty.
Of course, that would be ridiculously expensive, so at best some canned training is used instead. Still, it might make sense for higher-value teams.
Afterwards, the new interviewer has to do a non-trivial number of interviews before they are considered "calibrated". When an interviewer is uncalibrated, their score won't be given much weight, and the hiring committee can see how their interview reports and scores compare against more experienced interviewers. This also gives an opportunity for the hiring committee to send interview feedback (e.g., you're asking a banned question; the coding question is too simple, so it's not providing enough of a useful signal; don't ask "trick questions" which again don't provide much useful signal whether or not the candidate can answer it correctly, etc.)
So there are certainly ways in which interviewers do get suggestions for improvement. And it doesn't have to be _that_ expensive; it's just a matter of making sure you don't have more than one uncalibrated interviewer per panel.
It won't be possible to extract any useful information from such interviews. But I guess it's possible to convince employees to haze candidates this way. You should feel bad for doing this to people though.
Empirically this isn't the case. It would be much more interesting if you took the time to elaborate on what about this process is cultish, why it won't provide useful signal. Without that, it just comes across as a mean-spirited complaint.
For a design question, the interviewer will write up a sketch of the design, what tradeoffs were identified by the interviewee; what hints, if any, were needed, etc.
Then the interviewer will rate candidate on various technical dimensions (coding efficiency, design, etc.) and non-technical dimensions (communication, leadership, etc.) For each of these ratings the interviewer has to justify the rating, by pointing at examples from the interview notes.
Finally, the interviewer will be asked to score the candidate along a dimension of "strong hire" to "strong no-hire", and again the score must be justified with a paragraph. For the people on the hiring committee, the justification for the scores is often far more important than the actual rating given by the interviewer.
The hire/no-hire decision is not up to the interviewers; the hiring committee is composed of a different panel of engineers who review the interview reports from the interview panel; and the members of the interview panel write their interview reports without getting to see or hear from the other members of the interview panel.
The system is not perfect (what system is?) but as someone who has conducted tens of interviews as a Googler and at other tech companies, I can say it's one of the most rigorous and fairest systems I've seen so far. Interviewers are trained to try and get candidates to a 'win', trained to be aware of their unconscious biases, and the hiring committee process reduces the significance of any one vote. In general, the process is designed to select the best candidates and give all candidates a positive experience.
It's not perfect -- indeed, I'm sure I've given 'failing-grade' interviews as an interviewer here and there -- but it's one of the least-worst human systems that I've encountered, from both sides of the process.
Besides being wrong and pathetic, why would anybody want to do that? The goal of maybe 90% of Google engineers is to spend as little time as the company allows on interviewing and writing feedback, and instead spend time actually working on their projects.
Because when you try to simulate those situations in an interview that needs to be conducted in 45 - 60 minutes, the questions become the kind you hate.
I regularly interview candidates; and some find my questions weird. The point is that I'm trying to simulate everything in such a short time. Otherwise, each interview would take 16-32 hours, because we'd have to go through a massive onboarding process just to ask basic questions.
I hate this meme so much. I personally have seen situations in my own work several times in the last few months where knowledge of algorithms, even on the leetcode medium level, was immensely helpful in finishing the job. If you can do it quicker and more fluently, you're probably going to be more productive.
I remember that there are test patterns built into T1 CSU/DSUs, but not what they are or how to turn them on -- if I need to know, I'll look it up.
I remember that there are three QoS bits in the IPv4 header, but not where they are. Probably pretty early, because of hardware implementations.
I remember that lots of people look down on Perl 5's object system, but not why. I remember the existence and purpose of lots of Perl modules, but not the interfaces.
I edit JSON and YAML every few weeks, but I don't have a conscious recollection of the rules -- they're prompted by looking at what's already there.
I can guesstimate Big-O notation on most chunks of code, but I'll be fooled when there's a function that's hidden in a library because I generally don't have those memorized.
I can tell you the bandwidth of lots of hardware interfaces, and the relative speed and storage efficiency of a handful of RAID configurations -- the ones that I set up, and the ones that I avoid.
I know one firewall configuration tool reasonably well, which means that I look up esoteric bits, and have used so many that I expect to do common things in all of them with a quick examination of the language.
I currently know a fair amount about GDPR and why it doesn't apply to my company (and how we can assist customers with their GDPR requirements), and lots about the Massachusetts and Virginia data privacy laws. Very little about PCI compliance, but my point is: if my company wanted to do card transactions, I could figure out what I would need to learn to write a good policy and get it implemented in a way that won't embarrass anyone.
In short: when the job is the same thing over and over again, you memorize the details. When it's always something new, you need a broad overview about what can be done and where to find the details.
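The Big-O point above (getting fooled by a cost hidden inside a library call) is easy to illustrate; here's a minimal Python sketch, with function names of my own invention:

```python
def count_unique_slow(items):
    # Looks like a single O(n) pass, but `seen` is a list, so each
    # `x not in seen` membership test is itself O(n) -- the whole
    # function is O(n^2) even though no nested loop is visible.
    seen = []
    for x in items:
        if x not in seen:
            seen.append(x)
    return len(seen)

def count_unique_fast(items):
    # Same result using a set, whose membership test is O(1) on
    # average, giving O(n) overall.
    return len(set(items))
```

The hidden cost lives inside the `in` operator; that's exactly the kind of thing that doesn't show up when you eyeball the code.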
Yep. This is not the only instance where Google shows a culture of lack of respect for the dignity of the individual.
As compared to, say, "are you going to have kids?" asked of a woman.
That's not only disrespectful and unprofessional but potentially also illegal. But it isn't a contest to see who can be the most disrespectful or the most disrespected, so I don't follow your point.
But the only reason people apply to Google and Facebook and Netflix in the first place is because they pay a lot of money. There are plenty of crappy CRUD shops that won't ask you to enumerate palindromic primes. As long as you can do Fizzbuzz they'll hire you and pay you 80k/year to glue libraries together. But people still try to interview at Google instead because they want to make 300k/year and not 80k/year. You can't have your cake and eat it.
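For what it's worth, "enumerate palindromic primes" is a perfectly concrete exercise; a minimal Python sketch (the function names are mine, not from any actual interview):

```python
def is_prime(n):
    # Trial division up to sqrt(n); fine for small limits.
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def palindromic_primes(limit):
    # Primes below `limit` whose decimal digits read the same
    # forwards and backwards: 2, 3, 5, 7, 11, 101, 131, ...
    return [n for n in range(2, limit)
            if is_prime(n) and str(n) == str(n)[::-1]]
```

The point of such a question is less the code than the conversation around it (complexity of the primality test, how far `limit` can grow, etc.).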
Talent is not a single measure at Google. There are multiple facets to whether Google believes a candidate is solid. Strong technical talent is not a sufficient indicator of success, just one aspect that's taken into consideration by the hiring committees. So yeah, Google will say no to incredibly talented people because they fall short in other areas.
Crapshoot is table stakes practically everywhere you go. At least Google makes an attempt at making things objective and holistic.
There's a lot of myopia in this thread.
I have 16 years of total experience. Their initial offer to me indicated they thought I would take $231k/year for the privilege of working there. I got them up to $253k/year. Both numbers are less than I make now (though the latter is close).
As far as I am concerned, and based on my direct experience, Google pay is not the hot shit everyone claims it is. Maybe it would be if I played the competing-offer game, but I shouldn't have to do that.
Check out levels.fyi.
Nope, every shop has convinced themselves they are changing the world, and do ask these questions, often on codepad.io. Even greeting card companies, hah.
I do contract work and interview often. Not one single place in the last two years hasn't tried these highly inefficient tactics on me.
They do it this way because they can: they have to weed out 99% of candidates, and since they are hiring software engineers, it makes sense to weed out those who don't know core/advanced CS problems and solutions cold.
If you have such a high hiring bar — and you still get a sexist memo, or vocalised hate crimes on your platform, or salary suppression, or workers being disallowed from using the bathrooms, or 20,000 employees staging a walk-out because of bonuses awarded to perpetrators of sexual assault... maybe your hiring bar is lower than it should be.
No. Google, FB, etc. interview this way because you don't have the skills on their stack, and that is the common denominator. (Source: I've been an engineer/manager at multiple FAANGs for the last decade.)
There is no mystery involved, one of the more known books on the subject is "Cracking the Coding Interview", but really there is no mystery to crack.
There is a ton of literature, tools, examples freely available on the Internet, all that is missing is time and effort to go through them and learn them as if it were for an exam.
One of the main objections is "Why should I spend more time doing something I already do full-time at my job, where I'm perfectly qualified?", but to me it's moot: if you think getting hired by another company will improve your life significantly (or even marginally), it is something you should definitely put effort into.
At the very least, this process (setting aside those who luck out) proves that the candidate is able to understand a non-trivial problem (getting hired) and has the ability, and puts in the effort, necessary to solve it (going through the bullshit exercises and questions), as it was in your case.
From subject matter taken from multiple 4 year degree fields. Science, engineering, theory, implementation, data XYZ, project management, sys admin…
Never happens at Uni, because it would be impossible. 99% of folks would fail, like they do in tech interviews.
Bias is bad. Discrimination is bad. And if you're big AND bad, you get fined, and have obstacles put in your path.
So what does a rational BigCorp do? They make everything "fair". So everyone has the same opportunities, everyone is colorblind about cultural fairness-obsessions x, y, and z, and the hiring process becomes as effective as cardboard cake. Tick the boxes and you're in. Otherwise, there's the door.
Of course, the irony of all of this is that they end up discriminating against the very people they need: The different, the strange, the quirky, the innovative - everyone who doesn't perfectly fit the criteria of a safe hire where nobody can criticize the hiring decision (and impact your promotion prospects).
This is the social process by which BigCorp stagnation works, and it's always how it worked.
My point of disagreement would be that BigCos making "safe" hiring choices and having overbearing, one-size fits all HR policies is a phenomenon that predates modern political correctness. Having a policy of making interviews into arbitrary objective tests isn't just to prevent twitter mobs and hold off political rhetoric, it's a way of making hiring more predictable, combat nepotism, and select for people who aren't too individualistic to succeed in the corporate environment. The downside, of course, is you throw away and turn off a lot of good candidates who don't fit the mold.
I work for Google as a software engineer and now also engineering manager.
There is no plausible mechanism by which a decision that anyone makes on whether to hire a candidate could affect that person's promotion prospects.
Interviewers don't make hiring decisions directly, they merely individually make recommendations to a committee that reviews these recommendations (along with other info like the candidate's resume). I have never heard of hiring committee "reporting" someone for recommending they hire someone "weird" or "bad" -- it would be kind of absurd. If you don't like the recommendation, don't take it! Similarly, hiring committee members are not going to be "written up" for making "weird", or "bad" recommendations. There isn't even a group of people who could plausibly do this.
There is plenty to criticize about how Google hires. But I disagree with what I read as the thesis of your post that we have to choose between a biased process and an effective one. Indeed the problem you identify is what I would describe as bias against people who don't fit a particular mold, people who are unusual or "weird", and so really what I think you're saying is that in trying to fight e.g. gender bias, we have introduced new biases.
I also don't think that's true. The counternarrative I'd propose is: we have always been biased against what we consider to be "weird" people; that's human nature. What we've been trying to do is change people's perception of what is weird, so that e.g. female or black coders aren't. That's what unbiasing is about.
There's several logical fallacies that wind up happening in these sort of processes (confirmation bias, survivor bias, etc) but the people making the decisions are fine with making them as long as things are working as expected. Eventually entropy takes over in all systems, including human social systems, and things break down irreparably. There's no stopping this in BigCorp, because that is their nature. There's no stopping this at Small Co. either, because in that system there's so few people that a single hiring mistake affects such a large % of the total work force.
TL;DR: All human systems involve humans, and that is the best point of entry for problems.
- Police officer: Just follow our instructions, be cooperative.
- Car salesman: Just be upfront with what you want. Tell us about yourself, and we'll earnestly try to help you.
The message is the same; the authority "just wants to help," but in reality the relationship is adversarial to a larger degree than it is cooperative. On Blind, you know the real way to getting through the Google interview is LeetCoding like hell and not admitting you've seen the questions before.
- A Google interviewer's (and I would assume any interviewer's) primary goal is to come out of the interview with enough confidence to give a positive or negative score. If they sit down to write feedback and have to give a neutral score, the interview wasn't productive. This means that the interviewer is just as eager to find evidence for a positive score as a negative one -- there isn't an incentive to "getcha" with cheap or tricky questions.
- Doing interviews at Google is volunteer work. You are not interacting with a professional interviewer, you're interacting with someone whose day job is being an engineer. They don't have an evil agenda; they are doing this because they want to help Google hire the best candidates, and by inference make sure their future coworkers are good people to work with.
- Interviewers overwhelmingly _want_ their candidates to succeed. It's a true joy when I have a candidate who glides through a question (or finds a solution that was even better than mine). When candidates struggle, it's not a pleasant experience for the interviewer either.
- In the end, the point of technical interviews is to avoid the terrible experience that is working with an incompetent or uncooperative teammate. Interviewers are trying to find people that (a) can work well with others and (b) can get the work done.
- The system is _highly_ prejudiced towards suppressing false positives. This is the right decision, but it comes at the cost of a high rate of false negatives. Were myself or any of my colleagues to re-interview for our jobs, I would expect about a 60% hire rate. This is not even taking into account the constant ebb and flow of hiring demand. Sometimes there just isn't any headcount. And sometimes you just happen to get questions that you don't click with. This is also the reason that recruiters are so eager to bring you back to interview 6 months later.
- Recruiters and interviewers have very different incentives. Recruiters want to maximize the number of people they get hired; interviewers want to hire people they want to work with. This can lead to behavior that seems schizophrenic from the outside: the recruitment side of the pipeline constantly pestering people to interview, but once the candidate enters the interviewing pipeline the process is slow, deliberate, and careful.
This is textbook Google propaganda that has been repeated at least since I last worked there 5ish years ago. It's bullshit though because the ratio of competent to incompetent engineers was the same as at FB, MSFT and NFLX (with the latter tending to prune the fastest).
Just because you generate a system that spits out a lot of false negatives, it doesn't mean it has done anything to reduce false positives. This should be immediately obvious given that the relationship between the questions asked in G interviews and actual software engineering is non existent.
Don't repeat the trope that Google's hiring system is actually better at eliminating false positives. There is no evidence of it and if it truly was better, everyone would adopt it in a heartbeat and we wouldn't be working with bad engineers who spent a few months on leetcode to get into jobs way over their heads.
The reason Googlers never care to critically question the sorting hat is because it picked them.
I think this is a truism about the quality of most organizations - people who thrive in a specific organization coalesce into the organization and enhance those qualities within the organization that are specific to them.
I presume the fact that Google uses non-professional recruiters makes the recruitment process more about cultural alignment than it absolutely needs to be to gauge the capability to add value in a software engineering process.
Oh, but they can and do. The difference between 100 and 1000 resumes is not material—both too many to look at. What I've found is those in the second bracket simply throw out those without a degree and then cargo-cult common practice.
Year after year the Googlegeist survey finds that one of the things Googlers most enjoy about working at Google is their fellow employees.
> if it truly was better, everyone would adopt it
Google has an abundance of money and an abundance of applicants who would like to work there. Companies with fewer applicants per position or lower salaries relative to the industry average may need to be more open to false positives if they want to be able to hire anyone at all. Smaller companies also have the advantage that they can usually fire people more easily than larger companies, which helps lower the cost of false positives for them.
Hiring has very little to do with that. Perf review, feedback mechanisms, and work environment are orders of magnitude more critical to that. I've worked two startups with completely different hiring processes from Google and the other employees were amazing to work with there too. The key is feedback to correct issues and a quick PIP/fire process for folks not cutting it.
But those companies use hiring practices that approximate, to a high degree, what Google does. Facebook and Netflix certainly do, and Microsoft has a high enough rate of bad hires that you are required to reinterview to switch teams, so that high-performing teams can keep rejecting bad candidates who are already at Microsoft.
For example, you'll have people insisting (and consciously agreeing) that their objective is to come out of the interview with enough evidence to give a positive or negative score. In the back of their minds, though, they will often be projecting their own insecurities -- about their expertise, about their career, about their job, about their team. Halfway through the interview, things end up being about something else altogether, like interviewers trying to reassure themselves that they're better than who they're interviewing (it's especially hard not to fall into this if it's been years since you last had to implement a red-black tree and you're interviewing a fresh graduate who dreams this stuff in their sleep).
It's very hard to get past these things. I struggle with them every time I interview someone, and it's very hard to know when to chalk it up to "the system" and when to chalk it up to your own baggage. Pretending that it's only the former only perpetuates this stuff -- and empowers the ones who actively enjoy abusing candidates and making them feel like crap just for the heck of it. Which is very common everywhere -- including, from what I've heard among my peers, at Google.
So far, the most relevant compass I've found for these things is made out of two questions:
1. If I were a candidate, and I'd have gone through this interview, how would I feel about it?
2. If I were to go through this exact interview today, would I still get hired?
If the answer to #1 isn't too good, there's probably some individual-level things you can change, but if the answer to #2 is bad, the problems tend to be more systemic in nature.
I wonder if the algorithm-centric interview style at Google can really achieve that. From my experience, algorithm-centric questions plus whiteboard coding have a bias towards academic people. (Maybe that's fine for Google.) However, the way to crack that kind of interview is really just to practice like hell on LeetCode. Just take a minute and think: who is most motivated to do that? Good, experienced engineers have no trouble finding jobs in the Bay Area, so why would they waste their time on LeetCode for skills that are mostly going to be useless in real work? New grads and engineers who have trouble finding jobs are the most likely to spend a hell of a lot of time on LeetCode. I think the interview style at Google is in fact increasing false positives instead of suppressing them. It also has a high false-negative rate, for sure.
There are tons of other ways of doing interviews in which the interviewer gets far more, and more relevant, signal from candidates while keeping candidate pressure low and not wasting their time: asking practical questions, letting candidates write and test their code on their own computers, having a debug session, etc. But Google is not doing any of them.
The impractical part is that you'll probably be coding in a Google Doc rather than a text editor. It can be a bit disorienting.
I've also interviewed at Airbnb, where I was asked to code in Codepen for UI and in node for algorithms. I felt more comfortable in a more realistic coding environment, but one of their computers crashed mid-interview and the other's network access was broken. Coding in a doc and hand-waving when necessary is better than working in a more realistic environment if the hardware isn't reliable. (Realistic doesn't necessarily mean unstable, but if you're expecting candidates to write working code in a fixed period of time, you need to make sure your communal interview machines are well-maintained.)
The current interview process at Google and probably at other companies as well seems to be frozen in time - one needs to be a fresh grad or prepare weeks/months in advance or be "into" competitive/sports programming, which in most cases has nothing to do with the daily job.
So no plans to have a fresh look on this?
Or maybe you keep these practices (and other companies follow) so that it is harder for engineers to change jobs easily?
I believe it is inappropriate to ask an engineer to prepare in advance for interviews.
Last time I got contacted by their recruiter and sent links to coding websites - I replied “Great for someone just out of college”.
Google, you are a boring company with an insane interview process. When I worked there 8 years ago I met many people who thought passing the interview made them better than others. I regret I didn't tell them that they should check their heads.
Every single time, their recruiters were telling me how wonderful my CV was and how they definitely wanted to have me there, naturally with a selection process totally unrelated to the kind of positions I was applying for.
The third time I got a direct invitation from their HR team, I made it clear I wasn't interested if it was going to be again the same old way. Never got contacted again.
Google stands on feet of clay
This is only true if the interviewer is uninterested in what happens after the hiring process. If they want to make sure they're winning a reputation doing great interviews for more good hires than bad hires then there's an incentive to be cautious, and that caution could well manifest as trying to catch out anyone who might be 'gaming' the interview process. Those false positives reflect badly on the interviewer; the false negatives don't because they might have been real negatives.
Remember, interviewing is volunteer work, not something that will advance your career. The results of the interviews and committee deliberation are confidential, so there's no way to gain a reputation for being a "great interviewer".
One of the major flaws in Google's process is assuming that the engineers are incentivized to find good hires.
It's not a natural skill, dare I say it, especially for an engineer.
Sounds like a good interview question.
I've done 300+ interviews at Uber, where the process is somewhat similar to Google's, and OP's points are true. As an interviewer, all you really want is to get clear signals, either good or bad. And yes, an interview is much nicer when the candidate is doing great.
Source: I work & hire at Google. Opinions are my own.
Before interviewing at Google I spent ~3 weeks doing leetcode style problems on a whiteboard I bought just for this purpose. Did not make me any better as a SWE, but definitely helped me clear my interviews. Without the practice I would have failed my interviews.
Having said that, I don't think I have any better alternatives; any interview process is ultimately going to be game-able in some manner.
Consider their financial incentives first before ascribing some altruistic motives.
At British Telecom you had to pass a 3-day course before you were allowed to sit on internal review boards.
Mostly I just can't picture people from companies like Amazon or MS posting such stuff.
My attitude going in was that I was evaluating what it would be like to work there, and I treated the interviewer as a coworker solving a technical problem, where I was taking the lead.
The questions I got were pretty much exactly the sort of thing you see on LeetCode, but I hadn't seen any of the questions before. I know from studying for other technical exams that once you reach a certain number of practice problems, pattern recognition takes you a long way.
I had a similar experience at Facebook. Everybody was super nice, they all wanted you to do well, and they all responded well to being enthusiastic and genuinely interested in the problems you're solving.
I had the same experience. Afterwards I thought I did pretty well, because I got to a solution to all the problems eventually.
Later I discussed not getting the offer with a friend who works at Google. He said they try to guide the interviewee to a solution before the time limit. Which is reasonable; you want the interviewee to feel more confident and not get tilted by messing up just one question. However, in hindsight the interview is a lot less enjoyable. You can't show any weakness solving a problem, lest they offer you a "helpful hint" and record a black mark against you. The interviewer is there to smile and act like your colleague solving a problem, all the while making a list of why you're a bad problem solver.
I'm not that bitter though. The interviewers who asked me questions actually related to the position I was interviewing for (embedded software) did actually feel fun and collaborative. I accept that if I want to interview well at companies that employ these types of interview processes, I will need to do every problem in CLRS (or at least more than zero).
But the impression I got as the interviewee was that every question ramped up in difficulty. The slope of the ramp-up depended on the question. So maybe in your case, you didn't get as far on the difficulty ladder as other people that were interviewed.
I was pretty shameless in asking for hints when I needed them, but it was typically of the form "I see this tradeoff here, and my gut says to do X, is that reasonable?" I'm sure some of that counted against me, but I was blunt about what I needed to make a decision and solve the problem, and I think overall that reflected well on my ability to problem solve on a team.
I would definitely not recommend doing every problem in CLRS. I would recommend doing just about every problem in Cracking the Coding Interview (because they have nice answers), and then getting into a rhythm of doing LeetCode problems every day until the interview.
YMMV with interviewer, but personally, this is the type of thing that gets positive feedback from me - someone who understands and explains the tradeoffs involved.
Also, fair warning: LC has its uses, but it is not the same as an interview.
Yes, ramping up in difficulty is certainly a thing.
What gave me trouble, besides being rusty with algorithms, were some trade offs I would never make in the domain I'm familiar with. In my day job my microcontroller has 192KB of RAM, so limiting the amount of data collected to fit is important. I never get to the point where I have to worry about my simple algorithm not scaling to GB of data. Another odd idea was doing lots of speculative computation to reduce the latency of a system.
All of these algorithm things can be studied and after failing a couple of interviews you can start to understand what the interviewer is expecting and how to clarify assumptions.
I do question what sort of company you build if you screen for people who can write code on whiteboards. There is no giant mono-whiteboard where all the software engineers check in their code, so the interview process is not related to what people actually do for the job. On the other hand I have interviewed hundreds of people and I don’t feel like my questions do much better with the same one hour slot given to me by HR. The whole situation is inefficient for everyone.
This is really the crux of the problem. And the definition of 'hint' becomes real blurry at times.
If nothing else, interviewing is an important part of your working life. To get a competitive job, you are probably going to have to practice something. I personally can get a lot more excited about competing on puzzle solving abilities than competing on social skills or social protocols, even if those puzzles might not be used very often once I'm at work.
(I'm not saying algorithms are completely unrelated to the work we do- I'm saying that even if they were, i'd still prefer to be tested on puzzles than to be tested on social skills)
I mean, the subtext here is that we're all getting interviewed for jobs we don't know how to do. All companies have their own infrastructure and their own way of doing things; you can't hire people with experience on that company's codebase unless you want to limit yourself to ex-employees. So all companies are interviewing to try to figure out who could learn how to do the job in question, and that's... pretty difficult to do.
But one interesting side effect of the interview process is that everybody at a place like Google or Facebook or Amazon has a non-trivial understanding of efficiency.
The poster is correct that open-ended questions that reveal the work ethic of team members are much more important than solving algorithms and hard problems. Intelligence is overrated.
When I was at Google, I worked in a tiny office, and I was friends with the recruiter who worked that office. A couple of times she asked me to vet the answers to the non-technical questions, because if the candidate went off-script in any way, she simply couldn't tell if it's a wrong answer or a more in-depth answer than the script anticipated. This wasn't part of my job, and if our office wasn't so tiny we wouldn't have had any contact, and she wouldn't have had any engineers to ask.
On the other hand, there’s some signal (sorry) on the nature of the work you’re doing in whether SIGTERM (15) is imprinted in your brain from near-daily appearance on your screen, or not.
I would say a vital command-line skill is having a good filter to ignore what's irrelevant.
Signals 9 and 15 are much better known by their respective values than by their symbolic names. They are everywhere in program output and log files, and if those things are your daily work then you recognize them whether you want to or not.
It's like if you have touched computer code in any language then you probably know that 65 is 'A'. Not because you have useless trivia memorized, but just because it comes up every time you look at raw string values for some reason. In one sense it is useless trivia, but on the other hand a programmer who never stumbled on ascii codes must have lived in an unusually insular bubble.
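As a sketch of the incidental exposure being described (assuming a POSIX system, where signals 9 and 15 are SIGKILL and SIGTERM), both the ASCII value and the signal names are one stdlib call away in Python:

```python
# ASCII codes and signal numbers surface constantly in everyday output;
# the standard library makes both mappings explicit.
import signal

# ord/chr show the ASCII relationship the comment mentions
assert ord("A") == 65
assert chr(65) == "A"

# signal.Signals maps the numbers from the thread to their symbolic names
assert signal.Signals(9).name == "SIGKILL"
assert signal.Signals(15).name == "SIGTERM"

print(ord("A"), signal.Signals(15).name)  # 65 SIGTERM
```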
I have never written a program that outputs a signal number to stderr or a log file. And grepping the logs on a server with nearly 800 days of uptime, I can find only five references to signal 15, from months ago.
> It's like if you have touched computer code in any language then you probably know that 65 is A. [..] a programmer who never stumbled on ascii codes must have lived in an unusually insular bubble.
20 years of programming and doing things like making custom fonts. I couldn't remember the ascii code of 'A'.
A bit more relevant would be: what's the difference between these two?
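Assuming the two in question are SIGTERM (15) and SIGKILL (9), a minimal Python sketch of the practical difference: SIGTERM can be caught so a process gets a chance to clean up, while SIGKILL cannot be caught, blocked, or ignored.

```python
# Sketch: SIGTERM is catchable; SIGKILL is not (the kernel rejects any
# attempt to install a handler for it and terminates the process outright).
import os
import signal


def handler(signum, frame):
    print(f"caught {signal.Signals(signum).name}, cleaning up")


signal.signal(signal.SIGTERM, handler)      # allowed: install a SIGTERM handler

try:
    signal.signal(signal.SIGKILL, handler)  # not allowed for SIGKILL
except (OSError, ValueError) as e:
    print("cannot catch SIGKILL:", e)

os.kill(os.getpid(), signal.SIGTERM)        # handler runs; process survives
print("still alive after SIGTERM")
```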
So you're left with bad questions which have single answers, mostly involving rote memorization.
Additionally, it's common for interviewers to interview for multiple different groups (as an additional tech interviewer, a xfn interviewer, a Googleyness interviewer, etc), so exposure to job requirements can be reasonably broad at the interviewer level.
As I imagine many suspect, calibrating interviewers is hard, and always has been, for a company hiring roughly 10,000 people every year.
The fact is that Google's interview process isn't designed to work for you or me; it's designed to work for them. The key factors that influence it are that they get tons of applicants, and they have a strong preference for eliminating false positives (which they can afford due to the previous point).
I’m not sure why somebody (seemingly earnestly) trying to help potential applicants understand the process should be the target of anti-authority criticism.
In cases where the interviewer is hiring for their own team, I feel like it's reasonable to assume that the interviewer is trying to hire people they want to work with. Acting in their own self interests, as it were.
In some cases that may mean "I want to make sure you are someone I'm able to collaborate with." In some cases that may be "LeetCoding like hell". Regardless, if that's what the interviewer is looking for, the candidate gets a really important insight into the team they'd be working for, and it's not necessarily indicative of some larger corporate conspiracy.
I have a friend who solved/memorized ~600 leetcode questions and got offers from Google, Facebook, etc. He was average, not dumb but not well above average. When I apply to both, I will definitely be leetcoding as many questions as possible as well, to maximize my chances.
IME, my worst interview process ever was at Google. All I remember from it was some question about burning a piece of string. The coding part was relatively easy.
The best interview process I ever did was at Yahoo in 2008. Everyone was very friendly but the questions were tough. I found the “expert” level of the people interviewing me higher than at Google at the time. It was grueling though - 9am to 7pm. Got an offer to “join any team I wanted”. Turned it down for a startup.
Good decision in hindsight.
You have two identical ropes. If you light a rope on fire, it will finish burning in exactly one hour. The burn rate is inconsistent - there is no relationship between how much of the rope has burned, and how much time has passed, until the rope has fully burned away.
How can you use the two ropes to measure 15 minutes?
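For reference, the standard solution hinges on one observation: lighting a rope at both ends burns it in half its total time, no matter how uneven the burn rate is along its length, because the two flames together consume the whole rope. A worked timeline:

```python
# Worked timeline for the classic two-rope solution.
# Key assumption: two flames consume a rope in half its remaining burn time.
events = {
    0:  "light rope A at both ends AND rope B at one end",
    30: "rope A is gone (60 min of rope / 2 flames); "
        "light rope B's other end -- B has 30 min of burning left",
    45: "rope B is gone (30 min remaining / 2 flames)",
}
for t, what in sorted(events.items()):
    print(f"t={t:2d} min: {what}")

# The interval from t=30 to t=45 is the measured 15 minutes.
assert 60 // 2 == 30 and (60 - 30) // 2 == 15
```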
Given that it's printed in cracking the coding interview, Google has almost certainly banned this problem.
Well then, going through and solving 600 leetcode questions actually takes a lot of dedication and time investment, which alone shows me that your friend has a lot going for him. TBH, as someone who has hired a lot of engineers, if I found someone who was well rehearsed, and it was clear this person understood a lot of the underlying concepts in a problem as I dug deeper, I'd be thrilled to hire that person!