Google's sort of "computer science trivia" style interview is the bane of software engineering.
"CS trivia" interviews select for people who know the mathematics of computer science, not for people who know how to ship robust software systems on time and within budget. The two skillsets are related, but only loosely.
In the real world, you use a well-tested, premade, off-the-shelf library to traverse a DAG or implement a red/black tree. Just because someone can recite algorithms and data structures in their sleep doesn't mean they can effectively scope, plan, or execute a complex software dev project; and conversely, many of the best engineers either never learned or long ago forgot the finer details of the CS.
This mode of interviewing is like interviewing the contractor you're considering to remodel your bathroom by handing them a pile of iron ore and asking them to smelt high grade structural steel out of it.
So why do they do this? Lack of any better option. Their previous "brainteaser" style interview didn't work at all, and this is probably marginally better than that at finding good engineers. But more importantly, it's much cheaper and faster and less risky in terms of confidentiality than hiring candidates as temporary contractors for a real world test doing actual work, which is the only remotely effective way to tell who is actually a good engineer.
> In the real world, you use a well-tested, premade, off-the-shelf library to traverse a DAG or implement a red/black tree.
Unless, you know, you are implementing the standard library for a new language, so that there is no such "off the shelf library" available. Which it's not been completely unknown for Google software engineers to do.
"Off the shelf libraries" aren't things that are handed down on stone tablets to the prophet on a mountaintop, they are things that are created by software engineers.
In any case, understanding which algorithms and data structures to use, even if you are using a pre-made library, and the impacts of that decision requires an understanding of the algorithms and data structures similar to that which is needed to implement them (it doesn't strictly require the ability to implement, but ability to implement isn't an unreasonable way of filtering for the requisite knowledge.)
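To make that concrete with a small, purely illustrative Python example: even when the search itself comes from the standard library, you still need to know that `bisect` requires a sorted list and why O(log n) membership beats an O(n) scan; that judgment is exactly the understanding being filtered for.

```python
from bisect import bisect_left

def contains(sorted_items, x):
    # O(log n) membership test on a sorted list using the stdlib's bisect.
    # Choosing this over a linear scan (or over a set, or a tree) is the
    # algorithms-and-data-structures judgment call, even though the hard
    # part (binary search) is already implemented for us.
    i = bisect_left(sorted_items, x)
    return i < len(sorted_items) and sorted_items[i] == x
```

The library does the searching; deciding that the data should be kept sorted, and what that costs on insert, is the part no library decides for you.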
When I worked at Google Search Infrastructure (not any ML stuff) I actually built a performant, persistent red-black tree implementation in C++ for creating persistent sorted lists with good insert and delete characteristics. It was a pretty niche need, but we had it, and the existing libraries we could find didn't cover it. We needed it for some speculative programming where we were doing constraint solving and using a greedy-ish algorithm to explore the constraint space. Solving the constraints exactly was theoretically and practically impossible to do performantly, so we settled for a greedy exploration of the search space, but we needed a way to back out of failing lines of exploration.
So, yeah, we needed engineers who were strongly familiar with the theory of how hard things are to solve and knew a lot about different types of data structures. We were in a space the existing libraries didn't really cover.
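A minimal sketch of the persistence idea described above (not the actual Google code, which was in C++ and also supported deletion): textbook Okasaki-style functional red-black insertion in Python. Every insert returns a new tree that shares structure with the old one, so backing out of a failed line of exploration is just a matter of keeping a reference to an earlier version.

```python
# Nodes are tuples: (color, left, key, right); the empty tree is None.
R, B = 'R', 'B'
Empty = None

def balance(c, l, k, r):
    # Rotate the four possible red-red violations into one canonical
    # shape (Okasaki's classic four-case balance).
    if c == B:
        if l and l[0] == R and l[1] and l[1][0] == R:
            _, (_, a, x, b), y, cc = l
            return (R, (B, a, x, b), y, (B, cc, k, r))
        if l and l[0] == R and l[3] and l[3][0] == R:
            _, a, x, (_, b, y, cc) = l
            return (R, (B, a, x, b), y, (B, cc, k, r))
        if r and r[0] == R and r[1] and r[1][0] == R:
            _, (_, b, y, cc), z, d = r
            return (R, (B, l, k, b), y, (B, cc, z, d))
        if r and r[0] == R and r[3] and r[3][0] == R:
            _, b, y, (_, cc, z, d) = r
            return (R, (B, l, k, b), y, (B, cc, z, d))
    return (c, l, k, r)

def insert(t, k):
    # Pure function: builds new nodes along the search path only;
    # everything off the path is shared with the input tree.
    def ins(n):
        if n is Empty:
            return (R, Empty, k, Empty)
        c, l, key, r = n
        if k < key:
            return balance(c, ins(l), key, r)
        if k > key:
            return balance(c, l, key, ins(r))
        return n
    _, l, key, r = ins(t)
    return (B, l, key, r)  # the root is always repainted black

def to_list(t):
    # In-order traversal yields the keys in sorted order.
    return [] if t is Empty else to_list(t[1]) + [t[2]] + to_list(t[3])
```

Because old versions stay valid, a greedy search can snapshot the tree before each speculative step and simply discard the new version on failure.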
Maybe not all engineers need these skills, but it was valuable that our team all had them and were thus able to have deep design discussions about it.
The skill that's lacking the most at Google is project management, which is not necessarily a skill that every engineer needs.
There is another point to this: you shouldn't have to have it all in your head. If you're actually going to implement a standard library for a new language, it's reasonable to spend some time and energy researching and looking stuff up.
I don't see how being able to recite trivia questions from memory is a good indicator even for implementing a standard library.
Having a rudimentary understanding of algorithms and data structures is useful but in many cases it's also something you could look up.
Well said. If I ever did actually have to implement a red/black tree or a TCP stack for my job as a software engineer (which has somehow never come up yet), I wouldn't do it by standing at a whiteboard racking my brain for the details, and I certainly wouldn't set a one-hour timer. I'd pick up a book or read some reputable websites to refresh my memory first, and I'd take my time writing it, since it'd have to be an important corner case to necessitate rewriting something complex that's already pretty well implemented.
The real use-case for algorithm knowledge at Google isn't implementing standard libraries, it's in distributed computing. Okay, you know how to implement a shortest-path algorithm. Do you know how to implement a shortest-path algorithm when the data is spread over 10,000 computers? How about when paths become blocked by traffic jams or accidents? Could you adapt it so you can update that shortest-path in real-time as weights changed?
In these cases, most of your head-space is focused on the problem, which isn't written down anywhere other than the PM's requirements doc. It's pretty critical that you understand the algorithm really well, because if you have to look it up you're going to have to unload all the messy problem-domain stuff that's in your brain.
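To put the single-machine baseline on the table (a hedged sketch; the distributed, incremental versions described above are the genuinely hard part): plain Dijkstra in Python, with a weight bump standing in for a traffic jam and a full recompute standing in for a real-time update.

```python
import heapq

def dijkstra(graph, src):
    # graph: {node: {neighbor: edge_weight}}; returns shortest distances
    # from src to every reachable node.
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue  # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

# A "traffic jam" is just a weight change; the naive reaction is a full
# recompute. Doing this incrementally, with the graph sharded across
# thousands of machines, is where it stops being a textbook exercise.
roads = {'a': {'b': 1, 'c': 4}, 'b': {'c': 1}, 'c': {}}
before = dijkstra(roads, 'a')   # c reachable at cost 2 via b
roads['b']['c'] = 10            # jam on the b -> c road
after = dijkstra(roads, 'a')    # c now cheaper directly, cost 4
```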
You seem to think you'd lose points in a Google interview for not knowing some specific concept. That's not how it works. Interviewers want to know how your brain works, even when presented with something you don't already know.
I admit I have never interviewed at Google. I have interviewed at a number of companies that use the "CS trivia" style interview process. I got some of these jobs and didn't get others.
In the cases where I didn't get the job, I naturally asked why. Despite having been repeatedly assured that they just wanted to know how my brain works and not whether I'd memorized specific concepts, I was usually flat-out told the rejection was because I had implemented some minor detail of algorithm X incorrectly; that my implementation failed in some corner case.
I've also passed and worked at companies which do CS trivia style interviews. When they later interviewed other candidates while I was there, I sometimes observed how the people conducting the interview would first study algorithms and data structures for hours on Wikipedia prior to going in -- because they themselves can't recite any of that without preloading.
Maybe Google works differently; I don't know. At most places, the "CS trivia" style interview seems to primarily serve as a vehicle for bolstering the egos of the people conducting the interview -- they get to feel superior and ding people for not knowing the gotchas of algorithm X.
I'm working on Google ads, where a "minor detail" in an algorithm implementation can easily cost $50,000/day. It also means that the experiment has to be stopped, everything fixed, tested, and approved again, which can easily take a month to get right, so a lot of time is lost. I'm not saying the interviewing system is perfect, but getting edge cases right in a big system is critical.
From my experience, I agree with this. I interviewed at Google in 2014. I prepared extensively. However, I bombed the interview. The interviewers genuinely wanted to see how I think. I am not the brightest guy around and couldn't provide great insights in my solutions. I got the same feedback from the recruiter.
Just came here to say that from my limited experience, I concur with this comment. Google also has improved their feedback loop after the interview, which I find severely lacking in other companies with similar interviewing styles.
> Unless, you know, you are implementing the standard library for a new language, so that there is no such "off the shelf library" available.
The ability to write down a complex algorithm on the whiteboard without any references is not a very good proxy for being able to implement the algorithm in a standard library.
It vastly over-selects for people who are good at memorization.
I know how to use these algorithms and data structures, and in fact do use many of them regularly. I can even implement them myself when I need to. But I certainly don't go to implement them without access to any references—if you're writing a standard library and don't consult references when doing so, then that's a bigger indication of you being a bad engineer than a good one.
It's really just an elaborate signaling ritual showing that you're willing to devote lots of time to studying. I personally don't think Google is such a mythically fantastic employer that I would spend tons of time studying just to get a job, and that's probably the point.
Let's be frank: most Google engineers don't create new programming languages or work on library infrastructure.
The only common topic they all work on is selling our eyeballs and collecting our data. Interviewing for that would make more sense than the algo stuff.
P.S.: there's a difference between knowing typical algorithms needed to solve a class of problems and studying to implement them on a whiteboard.
> there's a difference between knowing typical algorithms needed to solve a class of problems and studying to implement them on a whiteboard.
I'm not sure that's all that true; even if you haven't specifically studied to implement the particular algorithm, if you are familiar with the concept of the algorithm and proficient in the implementation language, then coming up with some kind of first cut at an implementation isn't an unreasonable request. (Now, if the evaluation of the quality of the implementation is unreasonable for the context in which it was requested, that's a problem; but if it's used to judge understanding of the concept, and if potential problems in the implementation are the focus of probing questions to determine understanding rather than grounds for straight rejection, I don't necessarily see a problem.)
Yes, that is true. A tiny percentage of Google engineers build new languages. If you're hiring someone to write a compiler, say, then it is absolutely appropriate to ask interview questions about compiler design.
The point is that that is not how these interviews work at all. They test for the ability to memorize and regurgitate on demand a standardized and generic set of "CS fundamentals" based on the curriculum of elite CS schools -- nothing more.
For almost all jobs, even at Google, almost all of that knowledge is never used at all. Even at Google, most coding jobs are unglamorous CRUD. You just don't hear about them in blog posts because it's far more compelling to write up their cool new programming language project.
>For almost all jobs, even at Google, almost all of that knowledge is never used at all. Even at Google, most coding jobs are unglamorous CRUD. You just don't hear about them in blog posts because it's far more compelling to write up their cool new programming language project.
I'm curious about this statement, you say this but... have you ever worked at Google? I'm genuinely interested, not trying to discredit you. It's a bold statement to make if you don't have direct experience on the job.
Morgawr, I suggest you go around this thread and read that guy's comments. You'll notice that they're all in the direction of "other people's merits aren't such a big deal." His implication that passing an interview at Google is just memorizing trivia:
> I don't have months to study fulltime to cram for the CS trivia challenge
His implication that the reason he didn't get into Google isn't that he's not smart or knowledgeable enough, but that he didn't go to a prestigious school:
> No, I don't work there. I didn't study CS in school (much less attend the elite CS schools they are selecting for)
The implication that math is somehow less important than nebulous concepts like "shipping code within budget":
> "CS trivia" interviews select for people who know the mathematics of computer science, not for people who know how to ship robust software systems on time and within budget.
That interviews are processes which dumb people pass, unlike him:
> Their interview doesn't select for "ability to figure things out" at all. It selects for "ability to memorize and regurgitate academic CS curriculum on demand."
And then the comments through which I came to know this guy: when someone makes a good point, he replies that "correlation doesn't imply causation", and then when asked a simple question, instead of answering the question his answer is "I have X credentials, therefore I know the answer very well":
> I have a philosophy degree, and I can say with some confidence that I understand that particular catchphrase pretty profoundly. Not everyone who happens to disagree with you is merely an ignorant barbarian in need of enlightenment. See Karl Popper's theory of critical rationalism to answer your questions about causation.
You see, checking boxes (like getting credentials and academic knowledge) is very important, but only the boxes he's checked.
I haven't enjoyed a thread this much in a very long time.
No, I don't work there. I didn't study CS in school (much less attend the elite CS schools they are selecting for) and I don't have months to study fulltime to cram for the CS trivia challenge that you need to do to get through the door at Google. Fortunately, there are plenty of other jobs for software engineers.
I admit it is based on hearsay -- I know a number of people who work there. They say things along the lines of, "The pay is great and the benefits are generous, but the actual work is more or less the same as any other tech company." They tell me about their projects, and it's true. It's mostly CRUD.
Trust me, with your attitude, Google isn't missing you either. Maybe one day you will appreciate fundamental CS knowledge as more than some "trivia" to study.
Who writes those premade robust libraries? Who improves them, when a few percentages of improvement will be multiplied by enough machines to come out to millions of dollars saved?
If you have a pile of raw iron and you're currently running your own smelter for cost savings over the other remodelers, then maybe, even though you are in the remodeling business, you could use those skills.
I agree though that contracting is by far the best way to grade performance. But the Google style interview is the only reasonable alternative I know of to filter out people who don't actually know the technologies they are talking about and can fake it with other people's code for months.
Sure, but a library writer can use libraries; the converse is often not true. And most places pride themselves on a) hiring really good people and b) giving them mobility within the company.
If I wanted someone to glue stuff together without contributing to company infrastructure why not just hire a contractor for the gig?
(Also, off topic, but should I know who Sheldon is?)
The point is the test doesn't select for "ability to contribute to company infrastructure" at all. It selects for "mathematical knowledge of theoretical computer science." These are related skillsets, but only loosely.
I've never been at a company that had a dedicated "library team". The common code and functions that go into the libraries are artifacts of programming the other stuff that needs it. It's not a job title.
(Unless the company's product is a compiler. In that case, it would follow Conway's Law and you'd have a set of programmers implementing the standard library. I don't see how that division of responsibilities applies to Google or other companies.)
Don't forget - Google has a culture they're protecting. I'll play devil's advocate and say that while it's maybe not immediately useful, it's probably a very good test for culture fit. If you're the kind of person who thinks about the math of algorithms in your free time, because you enjoy it or think it's fundamentally important, you're probably more likely to be a Google culture fit. If you're a successful product shipper but you have an aggressive "get shit done" attitude and don't care much about the science and math, maybe you're not going to have the discussions with fellow Googlers that Google envisions happening around campus, or maybe you're likely to be highly success- or money-oriented, or, I don't know, to go against the culture-fit criteria in some other way.
Google internal libraries and infrastructure are very good and way better than most "off the shelf" libraries.
Any rank-and-file engineer can contribute to them, and many do. In that case I don't see why they should lower their bar to people who can merely consume these libraries when they can take their pick of people who can do both.
Ha, yeah, if they could let some of that "magical internal tools" goodness seep out into the Android SDK I'd be pretty happy, since there's no sign of anything special there now, to put it mildly. Failing that, just buy Square and hand the whole thing over to them, since they seem to have a much better idea of how to write useful, usable, capable tools on the platform than Google does. Please.
As much as I'd like to see them as well, I'd imagine we won't get much, as that's Google's IP; but I would love to know, generically, whether that's true from anyone who has worked with said libraries.
Apache Commons (Collections and some others) is a very old, basically legacy library; they are not nearly equivalent in terms of functionality, so comparing them makes no sense. Google Guava is a nice library, but it doesn't have any alternatives that I know of, so it cannot be used as an example to support the above generalization (which would need many examples to hold any weight anyway).
Angular vs React is a counterexample I can think of. Angular is shit.
The point was that their interview doesn't select for "ability to contribute to the company libraries" at all. It selects for "went to Stanford or MIT and has a good ability to memorize abstract computer science material."
Not at all. I love abstract computer science. I find it very interesting.
The point is that the job of "software engineer" is not the same as the job of "computer science professor." The required skill profiles are mostly disjoint, despite having some overlap.
If you're interviewing candidates to be a computer science professor at a university, then it is entirely appropriate to quiz them about abstract computer science during their interview.
Google, and other tech companies that interview like them, are using a "CS professor" interview process to hire "software engineers." That's my point.
The material you get in interviews isn't CS professor level; it's basic undergrad algorithms level. It's a fundamental course in any CS degree, not just at elite universities.
> This mode of interviewing is like interviewing the contractor you're considering to remodel your bathroom by handing them a pile of iron ore and asking them to smelt high grade structural steel out of it.
That's a terrible analogy. I'm hiring a home contractor for a discrete, predefined set of skills. They need to be able to install a sink and retile a floor -- and once they do that I'll never have need of them again.
Software engineers are not tradesmen. What you know is less important than your ability to figure things out. If Google wants to select heavily for that quality it's a totally reasonable approach.
And that's why: 1) you train a lot before doing any of those alone; 2) they change very slowly and every change is carefully evaluated (aviation/surgery); 3) aviation does have a lot of aids that are not memorized (SID/STAR, ATC information exchange, checklists, etc.)
When my sink leaks, I'd rather hire a plumber who has done the standard job ten thousand times than a bright guy with a pipe wrench, a bag of PVC, and an X-Acto knife who can 'figure things out'.
That's the point. Google and the rest of the big 4 don't need ordinary plumbers, they need process engineers (the engineers who design the piping at oil refineries and chemical plants) because they are tackling problems that nobody has done ten times, let alone "ten thousand times".
This is how competent managers regard programmers, just so you know.
But that ten-thousandth time, when the standard job goes awry -- that's what makes the difference between a dude showing up to work and a tradesman. The ability to react quickly and appropriately when faced with a brand-new problem is valuable in any role. Pretending otherwise is hubristic at best and kind of gross in general.
Their interview doesn't select for "ability to figure things out" at all. It selects for "ability to memorize and regurgitate academic CS curriculum on demand." I'm sure many people who can do that are also good at figuring things out, but it's mostly uncorrelated.
Have you ever taken one of these interviews? Or given one? I've done both multiple times and currently work there. Generally speaking, the interviews are not "regurgitate academic CS on demand." They do test your algorithmic and data-structure thinking. The reason is that it's hard to find questions that are complex enough to be hard, yet easy to explain in just a few minutes, AND have richness of detail. Those kinds of problems tend to be mathematical, algorithmic, etc. Stuff that, yes, is unlike what you do 80% of the time, but that you're expected to be able to do.
I made it through my interviews without remembering obscure things, only knowing basic O-notation, and generally not being a math genius.
I'm not sure where this perception comes from -- my guess is it's a mix of cognitive dissonance and overgeneralizing from bad experiences.
The majority of interview questions I've encountered (at Google and elsewhere) are of the form: here's a data structure or API, can you use it to solve some problem? Nobody's ever asked me to prove A* from scratch. Even in cases where the question does revolve around a specific algorithm or CS concept they've always taken the time to explain it. I used to ask a question based around LZ77 encoding and I always went over the algorithm whether or not the candidate was familiar with it.
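For the curious, here is a toy version of the kind of thing that question revolves around (my own illustrative sketch, not the interview question itself): an LZ77-style encoder that emits (offset, length, next_char) triples, plus the matching decoder. No attempt at efficiency; the point is that the core idea fits in a few lines once it has been explained.

```python
def lz77_encode(data, window=255):
    # Greedy longest-match encoder: each step emits (offset, length,
    # next_char), where offset == 0 means "no match, literal only".
    i, out = 0, []
    while i < len(data):
        best_len, best_off = 0, 0
        for j in range(max(0, i - window), i):
            l = 0
            # Cap so the final character is always left as next_char.
            while i + l < len(data) - 1 and data[j + l] == data[i + l]:
                l += 1
            if l > best_len:
                best_len, best_off = l, i - j
        out.append((best_off, best_len, data[i + best_len]))
        i += best_len + 1
    return out

def lz77_decode(triples):
    # Replays each triple; out[-off] naturally handles overlapping
    # matches because the output grows as we copy.
    out = []
    for off, length, ch in triples:
        for _ in range(length):
            out.append(out[-off])
        out.append(ch)
    return ''.join(out)
```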
That's where I think there's a difference between the way mathematics is taught in engineering and the way it is taught to mathematicians.
It is one thing to attend a calculus class, learn to apply some formulas, and leave it at that. You vaguely remember the formulas some years later, and the algebraic abilities are all you have, but they are still very powerful tools.
However, if you take a mathematics course like advanced linear algebra, then you realize that for this particular teacher, knowing the formulas doesn't matter at all. You have to write proofs for everything. And these proofs require all of your imagination, all of your experience, and then some.
It changes the way you approach every single problem from that moment.
Now, most people have never needed that ability. Most people will never subject themselves to the stress, the lost nights of sleep, and the sheer single-mindedness that drives everything out of your mind for days, until you get that proof. That damned proof you will never forget for months, if ever.
The end result is that it makes your mind work in overdrive for solving the same kind of problems.
So the issue is that some people have memorized stuff (and that never works well for long), while other people have taken that stuff as a first step and then, through a long process of proving every single step of one or more theorems, have tattooed it onto some part of their minds, and are simply trying to find someone with a similar tattoo, someone with whom they can relate about the shared experience.
Anyway, the people I know that can do that, don't really like to program computers.
You have said that you haven't interviewed with them, and don't work there. You seem to be spouting a lot of bullshit for someone who hasn't had these experiences.
I am so happy that early on in my career I failed at multiple attempts to get a job at Google (after having graduated from Stanford in CS in 2008). Despite having already built some objectively impressive and popular projects used by millions of people, I was asked to solve Boggle, answer random C trivia questions, and write a sort algorithm I'll never need in my career, all on a whiteboard, and I just wasn't comfortable with that style of interview.
I was upset for a while but just kept iterating on my own projects, and now I am passively making far more than I ever would at Google, even including generous stock grants (mid 7 figures). It's much more rewarding too, knowing I created my own destiny instead of being just some middle engineer on a mothership living off an easy AdWords monopoly.
My suggestion to anyone who has the perseverance to get through a guide like this is to just forget Google and build something new yourself.
Knowing what these structures are and having a working knowledge of how to use them is not the same skill as being able to implement them from scratch (on a whiteboard, in an hour or less.)
If you can drive, you probably have a pretty good idea of whether a car or a truck or a motorcycle or a tank is the best vehicle for a particular transport application, and you can probably actually drive all of those with minimal practice. That doesn't require you to know how to build a car, a truck, a motorcycle, and a tank from scratch. That's a different skillset.
Google pays far above market rate, so they have the luxury of hiring petrochemists to work a gas pump. Even at Google, once all the shiny is stripped away, most of the projects are simple CRUD.
> 1) If you know what a DAG is and how to use it, you can most likely implement and traverse one in way less than an hour.
I know what it is and how to use it. I can implement and traverse one. I seriously doubt I could do it in an hour, especially if I don't have at least standard library documentation available.
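For concreteness, here is roughly what "implement and traverse one" usually means in practice: a DFS-based topological sort over an adjacency-list DAG, sketched in Python. The code is short once the details are in front of you; whether it comes out bug-free at a whiteboard under a one-hour timer is exactly what's in dispute here.

```python
def topo_sort(dag):
    # dag: {node: [successors]}. Returns nodes in dependency order,
    # raising if a cycle is found (meaning the input isn't a DAG).
    order, state = [], {}   # state: 1 = in progress, 2 = done
    def visit(u):
        if state.get(u) == 2:
            return
        if state.get(u) == 1:
            raise ValueError("cycle detected")
        state[u] = 1
        for v in dag.get(u, []):
            visit(v)
        state[u] = 2
        order.append(u)
    for u in dag:
        visit(u)
    order.reverse()
    return order
```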
You should tell that to some of my whiteboard coding interviewers, then. Plus, I feel that many of the interviewers who claim to be interested in thought process still have a "right" way in mind and will dock interviewees for not approaching a problem the way they would.
Anyway, the post I was replying to said, " you can most likely implement and traverse one in way less than an hour" (emphasis mine). That statement has nothing to do with thought processes. To me, that requires "flesh[ing] out such things in any great detail".
Ugh, this sort of computer science-shaming needs to stop. I've been building software professionally for 10 years now and not once have I ever needed to weigh the pros / cons of a DAG versus rb-tree. I'm too busy dealing with CSS bugs, figuring out why customer retention is lower than it should be, training my clients on how to use a bug tracker, scaling, and hundreds of other things which are orders of magnitude more important than this implementation detail. Somehow I've managed to make a career out of this. ¯\_(ツ)_/¯
But ask yourself: how does your browser layout engine and scripting engine work? Is CSS some sort of self-hosting language that just materialized out of nowhere? Or there's somewhere a group of people responsible for it? (e.g: Google's Chromium team, Mozilla's Firefox team, Apple's Safari team, Microsoft's Edge team, etc.) Those people DO need to know data structures.
For you to be able to focus on CSS, HTML and JavaScript, someone, somewhere needs to make CSS, HTML and JavaScript work for you in the first place.
My point is: he can focus on HTML, CSS and JavaScript all he wants, and remain abstracted from internals. That's fine.
But here we are talking about one of the companies responsible for implementing such technologies, where the science he implies as not being very relevant IS relevant.
It's the difference between applying for a truck-driver job and a truck-engine-designer job. For the latter, knowing the theory of the internal combustion engine IS relevant.
> My point is: he can focus on HTML, CSS and JavaScript all he wants, and remain abstracted from internals. That's fine.
The majority of engineering positions that Google hires for are exactly this. The Chrome / Chromium team is just one team at Google. Most of the engineers who work there are building/scaling iOS apps, web apps, and are doing the exact same work that I do all day long.
A better analogy would be hiring engine designers by spending 90% of the interview asking them questions about low level metallurgy, and 10% about engine design as such. I would reverse those ratios.
A lot more people do stuff that interacts with the details of the browser engine than stuff that interacts with the details of a car engine; it's much more common (and easier) to find bugs in the browser engine than in your car's engine.
Now if I was just browsing the internet, then, sure, your analogy might be more apt. But that's not what we're doing.
How do you scale without worrying about implementation detail? You sound more like a product manager than a software engineer. Their interviews are very different.
Of course there are jobs that require deep knowledge of advanced data structures, algorithms and optimization.
But you can easily scale quite a lot of systems to millions of users without having to know any CS at all. You just need to know how to use and configure different layers of caches, design your database(s) properly, and apply some minor tricks in code (and know how to analyze performance).
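As a sketch of the "layers of caches" point (all names here are hypothetical): a minimal in-process read-through cache with a TTL, which is one such layer. A real deployment stacks something like this under a CDN and over a shared cache such as memcached, but the pattern is the same at every layer.

```python
import time

class ReadThroughCache:
    # On a miss (or an expired entry), fall through to the slow loader
    # and remember the result for ttl_seconds.
    def __init__(self, loader, ttl_seconds=60):
        self.loader = loader
        self.ttl = ttl_seconds
        self.store = {}  # key -> (expires_at, value)

    def get(self, key):
        hit = self.store.get(key)
        if hit is not None and hit[0] > time.time():
            return hit[1]
        value = self.loader(key)  # e.g. a database query
        self.store[key] = (time.time() + self.ttl, value)
        return value
```

The engineering judgment lives in the configuration (TTLs, sizes, invalidation), not in any deep algorithmic machinery.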
I built a web app that scaled from 0 to several hundred thousand users in a few days. Somehow I managed. A key thing in becoming a good engineer is learning how to delegate certain technical decisions to those who are better equipped to handle them.
Provided they exist in your organization and have the spare bandwidth for your needs. And then it would be them who solved the problem that needed to be solved, not you.
This is more like Al Gore claiming he invented/solved/built the internet.
If Google was only about CRUD and banal CSS, it probably wouldn't have become the Google that it is. They became so by handling challenging and ever-changing problems. In such a scenario you need generalists with strong fundamentals who can pivot quickly from one problem to another, not a Stack Overflow cut-and-paster. The latter skill has its moments, but you cannot survive on that alone if you are in the critical path.
> Provided they exist in your organization and have the spare bandwidth for your needs.
You can delegate to entities other than humans.
> If Google was only about CRUD and banal CSS, it probably wouldn't have become the Google that it is.
That's totally true, but how many times did PageRank need to be invented? After that the true problems at Google were scaling and monetization. Do you think the team responsible for making sure AdWords looked right on every device under the sun would agree with your statement? I'm sure a lot of the work that went into that "CRUD and banal CSS" is the same code that's made Google billions of dollars.
> They became so by handling challenging and ever-changing problems. In such a scenario you need generalists with strong fundamentals who can pivot quickly from one problem to another, not a Stack Overflow cut-and-paster. The latter skill has its moments, but you cannot survive on that alone if you are in the critical path.
> but how many times did PageRank need to be invented? After that the true problems at Google were scaling and monetization.
Quite a few times, actually. It was not at all obvious at that time how to run PageRank and other algorithms efficiently at that scale while keeping running costs down. If it were just a library call and a delegation away, they wouldn't have had such a meteoric rise.
> Quite a few times, actually. It was not at all obvious at that time how to run PageRank and other algorithms efficiently at that scale while keeping running costs down. If it were just a library call and a delegation away, they wouldn't have had such a meteoric rise.
We're talking about two different things here—the theoretical PageRank and the practical one. My point is that the skills required to scale a thing like PageRank—writing code to parallelize tasks, divvy up traffic, etc.— are very different than the ones involved in inventing PageRank as an algorithm.
PageRank at its mathematical core was not that novel; basic undergrad stuff. The application was novel, not the equation; you would find the latter in a beginner's linear algebra book. The real deal was (i) realizing that those equations could be applied to solve an aspect of web search and (ii) scaling it up on the cheap hardware of the time while keeping operational costs low enough to be profitable. What I am saying is that one needs a good understanding of CS fundamentals and the ability to reason to pull that off with a competitive advantage. You don't get that just by tweaking CSS, or by knowing your Java platform well, or by delegating. These kinds of problems are not one-offs. You have to keep ahead of the competition constantly, innovate constantly, and do stuff that your competition has not yet figured out how to do.
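For what it's worth, the "basic undergrad stuff" claim is easy to make concrete. Here's a minimal power-iteration sketch of PageRank over a made-up three-page graph; the graph is hypothetical, and 0.85 is the damping factor commonly cited from the original paper:

```python
# Toy PageRank via power iteration. Each node's rank is redistributed
# along its outgoing links, with a damping factor modelling the random
# surfer who occasionally jumps to an arbitrary page.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {node: 1.0 / n for node in nodes}
    for _ in range(iterations):
        # Every node keeps the "random jump" share, then receives
        # damped shares from the nodes linking to it.
        new_rank = {node: (1.0 - damping) / n for node in nodes}
        for node, outgoing in links.items():
            share = damping * rank[node] / len(outgoing)
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

# A tiny 3-page web: A links to B; B links to A and C; C links to A.
ranks = pagerank({"A": ["B"], "B": ["A", "C"], "C": ["A"]})
```

The point stands either way: the iteration itself is a dozen lines of linear algebra; the hard part was applying and scaling it.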
Now that this particular scaling problem has been in the mainstream, it does not seem that big a deal to solve; it was at the time. If it hadn't been, every run-of-the-mill tech company would have been doing it and eating Google's lunch. Their managers' ability to delegate does not seem to have helped them much there.
> PageRank at its mathematical core was not that novel; basic undergrad stuff.
Whatever you say, Mr. Page. :)
> You don't get that just by tweaking CSS, or by knowing your Java platform well, or by delegating. These kinds of problems are not one-offs. You have to keep ahead of the competition constantly, innovate constantly, and do stuff that your competition has not yet figured out how to do.
More false analogies here. The skills involved in solving technical problems are easily translatable from one technical domain to another. You make it seem like implementing an algorithm to scale servers is necessarily more complex than implementing an algorithm to stack shapes on a webpage in a space-efficient way. It's not.
> Their manager's ability to delegate did not seem to have helped them much there.
You're completely misunderstanding me. I'm not using the term "delegation" here as a managerial term. I would bet you that the team that scaled PageRank relied on countless open-source and freely available tools and tech that others wrote. This doesn't lessen their ingenuity at all, but it should be clear that even the most complex tech is built on the shoulders of others.
Lol, do you know PageRank? At its core it's just a random walk on a graph; this stuff is taught in intro linear algebra courses. Of course, modelling the web this way and tweaking the model to produce optimum results was a big deal. You'd be a fool to believe that PageRank hasn't evolved in 20 years.
Oh, please. This back-and-forth and condescending attitude is getting so tiring for me. Yes, I've read the paper multiple times.
> this stuff is taught in intro linear algebra courses
I graduated with a B.S. in Applied Mathematics from Yale. I'm familiar with this stuff, thanks.
> At its core it's just a random walk on a graph;
I'm tired of making this point. There is a big difference between understanding a ground-breaking discovery and making the discovery itself.
> Of course, modelling the web this way and tweaking the model to produce optimum results was a big deal. You'd be a fool to believe that PageRank hasn't evolved in 20 years.
I never once said this. I said that after PageRank was developed, most of Google's engineering resources went into scaling and monetization.
Scaling has its own set of algorithmic challenges. At small scale there's not much to be gained from asymptotic complexity improvements, but at the scales Google operates at there definitely is; I'd wager that a lot of the work Google does on its distributed computing platforms involves algorithmic challenges.
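The small-scale vs. Google-scale point can be made concrete with a back-of-envelope comparison of operation counts (a toy illustration with made-up input sizes, not a benchmark):

```python
import math

# Rough operation counts for an O(n^2) algorithm vs an O(n log n) one.
# At small n the constant factors dominate and the difference is modest;
# at web scale the ratio is astronomical.
for n in (1_000, 1_000_000_000):
    quadratic = n * n
    linearithmic = n * math.log2(n)
    print(f"n={n:>13,}: n^2 is ~{quadratic / linearithmic:,.0f}x more work")
```

At n = 1,000 the gap is about a factor of a hundred; at n = one billion it is tens of millions, which is the difference between a job finishing and a job never finishing.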
Again, it sounds like your skills lie mostly in product management rather than software engineering. While you have a valuable skillset, it's not what Google needs from a software engineer. You're good at a different job.
> While you have a valuable skillset, it's not what Google needs from a software engineer. You're good at a different job.
See https://news.ycombinator.com/item?id=12653628. Far more "software engineers" at Google do the exact same thing that I do all day long than do the type of work you're referring to.
Horses for courses. There are people that would hate doing client training but love weighing the pros and cons of a DAG vs RB tree. They too can make a career of it.
It's a big industry, plenty of different ecological niches to fill.
I will hazard a guess that whatever you did, it was not as impactful as GFS or MapReduce at scale.
> I'm too busy dealing with CSS bugs, figuring out why customer retention is lower than it should be, training my clients on how to use a bug tracker, scaling, and hundreds of other things which are magnitudes more important than this implementation detail
For hard-tech those become important only after there is an implementation that solves a high barrier to entry technical problem. Then you can get a good run of the mill PM to keep it chugging.
> For hard-tech those become important only after there is an implementation that solves a high barrier to entry technical problem. Then you can get a good run of the mill PM to keep it chugging.
I completely disagree, and your statement is emblematic of what is wrong with popular perceptions of what it means to be a good software engineer.
The best engineers I've ever worked with are fantastic communicators and understand the product that they are building to the core. Being able to ask a good question or drill down into correct requirements is far more important than knowing how to traverse a binary tree, at any level.
Where did I say engineers are not, or don't need to be, fantastic communicators?
All I am saying is that if you are in a hard-tech area with problems that have not yet been solved well enough to be monetizable, communication and delegation are not what is going to solve them. Agile, Extreme, or whatever is 'in' at the moment is not going to do it. They get solved by the ability to reason about technical things. I can, for instance, communicate the need to cure cancer (bad example, sorry) or delegate willy-nilly, but sorry, that's not going to solve it.
Why do they do this? The answer you haven't considered is, "because it works." I assume you haven't considered it because doing so would mean wrestling with the fact that your intuition isn't always right.
Google has turned interviewing into a science. They have interviewed millions of engineers, hired hundreds of thousands, rejected and later hired who knows how many, etc. Many incoming interviewees have also interviewed across silicon valley. And all along the way they've been collecting data and refining their process.
Do you have anything except your intuition to show us that the Google style interview doesn't work?
One thing you may fail to consider is that there doesn't need to be a direct and plain link between the interview questions and the work the engineer will end up doing. All that matters is whether the interview process produces a clean and strong signal that leads to quality hires.
As a person who's been through a Google interview, I can imagine many of my former co-workers performing poorly. Those are exactly the kinds of engineers I don't want to work with. I have spent too much of my time explaining basic concepts to engineers who should already be comfortable with those concepts, or who lack the critical thinking skills to be part of the conversation. So there's my anecdote.
Before telling a company how they should conduct interviews, perhaps you should gather some data.
While it's undeniable that Google has an excellent engineering team, it's also undeniable that their products often/usually leave a lot to be desired from the end-user perspective. There's definitely a missing piece of the puzzle there when it comes to UX and human-centric design. Who's to say that their focus on low-level engineering at all costs hasn't left something on the table?
I can't say for sure that their interviewing process is broken but as a user of many Google products (many of which get randomly discontinued) I will say there's something lacking.
This is a red herring. Name a company that doesn't leave something to be desired. No company is perfect. The idea that Google's imperfections have something to do with their hiring practices is not prima facie ridiculous, but you've offered exactly zero support for it. You're just lobbing out complete conjecture. How does that advance the conversation at all?
Why are you being so defensive? I'll go further and say that Google's products are _uniquely_ bad as far as UX goes. There's definitely missing product thinking in that organization.
It seems you didn't understand my last comment. You're making logical leaps here without connecting any dots. "Google's hiring is broken" does not follow from "Google's UI sucks." You need to put some connecting logic in there if you want to convince anyone that you're even on topic right now.
> The answer you haven't considered is, "because it works."
Just because something "works" does not mean it's optimal or even very good.
Google's interview strategy could also be to pick random resumes out of the pile, hire 10 people for every position, and fire the 9 who didn't work out. It would still "work" because enough people want to work at Google that they have a huge hiring pipeline and plenty of money for excess employees.
Worshipping Google as having great interviews is not helpful to anyone. They can easily be doing it incorrectly.
In fact, the very data which you're talking about showed that many of their older techniques were in fact terrible indicators (brain teasers, GPA, etc.).
I never told any company how they should conduct their interviews. Google is of course free to spend their money however they see fit. This was not written for the benefit of Google, but for the benefit of the many others who are considering whether something is a best practice simply because Google does it.
As for why they do this, your conclusion is kind of exactly what I got to in the last paragraph, "for lack of a better option." It's not that it works in some optimal sense; it just works less badly than the other options available, according to their business constraints.
I base my claim on my many years of personal experience as a software engineer, both being interviewed and interviewing others, discussing engineering and CS with numerous people. My anecdote: I once worked with a guy with quite reasonable academic CS skills who decided to re-implement SSL from scratch for a bog-standard webapp, rather than use a library to do HTTPS. That was not a good use of the company's limited budget, which he would have known if he had had any engineering skills (as opposed to CS skills.)
In my experience, there is little (though not zero) correlation between "good at CS" and "good at writing software commercially." The Google interview process is probably better than nothing, and probably better than their previous "brainteaser" style interview, but far from optimal.
As for what I'd select for? I'd certainly reject candidates who manifest anything like the condescending and arrogant attitude you display towards people you consider to be your intellectual inferiors. Even if you're an amazing engineer who can recite the CS in your sleep, that attitude is toxic for any team environment and will be a net loss to the company.
Google does have projects that require clear understanding of the things you classify as trivia.
Their search crawler, their distributed computing technology from which projects like Hadoop drew inspiration, their cloud platform with custom hardware, TensorFlow, their custom Linux kernel and Android, improvements to network protocols like HTTP, V8, Chrome, Project Zero, and user-facing apps like Maps, Translate, etc. All those important projects require you to know your way around the fundamentals from the ground up and to think outside the box.
Then, they have lots of applicants. They can be more selective than they need to be.
The second factor is far more important than the first.
There are any number of engineers who can simply go read up on the fundamentals as they are needed, or who understand them well enough to work with them but lack a mathematical proof-level command of the material. Google just doesn't need to hire them because they're paying much higher than market rate.
I don't completely disagree with you, but at Google mathy CS skills are probably better than robust SE skills. Things need to operate at enormous scale.
It's easy to imagine a case where Google would need to implement something algorithmic because well-tested libraries aren't quite doing the right thing. The libraries are generally designed to work in RAM or on a single machine's disk. Google needs them to work across multiple servers and datacenters.
> In the real world, you use a well-tested premade off the shelf library to traverse a DAG or implement a red/black tree.
I reject your real world assertion, unless you do not count Google as the real world. The reason these questions are asked is that they're relevant to the work engineers will be doing at Google.
You're not supposed to ask brainteaser or trivia type questions as a Google interviewer anymore.
But yeah, I agree with the rest of your post. Also, another factor beyond confidentiality is that you can't hire people from other companies through that pipeline, because they won't do it.
Also worth noting that for certain classes of engineers (new grads), Google does sort of hire people as contractors (though they're full-time with benefits), see the eng residency program.
Yes, hence the OR. You're not really supposed to use CS-trivia-style questions either. Both are deprecated. It will take a while for interviewers to catch up, though...
How does one hire a contractor in other industries? I'd wager that word of mouth is the way. The alternative is an association and certification body (not too dissimilar to the medical association or the bar for lawyers), and gating access to this certificate.
> hiring candidates as temporary contractors for a real world test doing actual work
While this is very effective at filtering out poor applicants, it is also very effective at preventing good applicants from applying in the first place. Why would a great developer who probably already has a great job quit for a chance to maybe get hired at Google? In most cases they wouldn't. And at the scale Google hires, they can't afford to lose all of those potential employees.
As someone who helped set the interview process at my previous employer (not Google), and at other companies now as a consultant, I think your reasoning about why these companies have a process like this looks right. Let's dig a little deeper too.
Consider the following facts:
0. Google has some 30k-40k engineers, with average tenure of say 7 years. So every 7 years, they are hiring 30-40k engineers. That is thousands of engineers every year, even if they are not growing (and they are). It's a massive undertaking.
1. In the field of software engineering, unfortunately, experience has little correlation with expertise. That has been known for a long time; see [1]. And cognitive ability is at least a reasonable predictor of success, better than many others; see [2].
2. Companies like Google have no dearth of applications. Literally millions of engineers apply.
3. Interviewing is a chore, and mostly not fun. Most interviewERs hate spending time on it and want to get out of it as quickly as possible. Interviewing is also on top of everyday work, which is a lot at many growing companies.
4. As a hiring manager, I have to hire people. There is no choice. I must find a way to hire N people in X amount of time, or the company is literally doomed. e.g. if I'm eBay or Amazon, I can't miss the holiday season.
5. You have to involve multiple people in hiring. Just one person talking to the candidate and making a decision is not sufficient.
6. Programming is so vast, that engineers apply from all sorts of backgrounds and domains.
7. Companies are becoming more and more polyglot. You want engineers to move around different languages and stacks freely.
So when you have a lot of people to hire, a lot of people applying, work that's somewhat correlated to cognitive ability and very little time, what kind of process do you end up setting?
When you put these constraints together, you realize that your incentive is to design a generic process that's convenient to the company, and not convenient to the candidates.
You just want reasonably smart people, fast. You don't care much about seniority, or about the type of problems asked, as long as it helps you close N people in X time while putting in the least amount of work. A different process might have selected for a different kind of N people, but that process would take longer than this one. And time is money too.
A process based on DS/Algos is less subjective, fast (like you said), can be prepared for (and you have to hire), has enough variety that multiple people can ask different questions, lets you interview across domains, and is at least somewhat defensibly relevant to the field.
And hence, here we are.
I'm not saying that the process is understood by everyone and executed well everywhere and every time, but after several years we're all still looking for a process that's a lesser evil, and we haven't found one.
As long as the constraints outlined above remain, the process is going to stay. In some form or another. For a very long time. However much everyone hates it, including the interviewers, and the company itself. Many have tried otherwise (including myself), but most have come around to asking DS/Algos somewhere in the process in varying percentages.