In my email I have an "interview prep packet" from them that essentially tells me to brush up on algorithms and read Cracking the Coding Interview to prepare for their interview process.
I'm fairly happy in my job. If they offered more money or a really interesting project I'd consider working for them. But I'm pretty lazy about redo-ing college algorithms class during my free time at home to go work there, so I probably won't.
There's an opportunity cost with interviews like this: an M.S. and a long career of getting shit done count for very little, while memorization of undergrad-level topics, which you could look up in Knuth in two minutes if a problem ever required them, can make or break an interview.
I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews.
That's who they want to hire at the end of the day: some coders that don't get too critical about their job and do what they are asked to do, even if it is repetitive, stupid and doesn't really make sense (such as re-studying algorithms implementation details for two weeks before an interview when it can be looked up super-easily online).
I'm pretty much the opposite of what you think, so if my desire to study for the algorithm interview is your litmus test for that, kinda proves my point.
Not everyone that would be good for Google has a burning desire to work for Google. Google might want to consider that.
What I described in my previous post is a credible explanation, which my group of engineer friends came up with, for why all the FANG companies pursue those memorization-heavy algorithmic interviews.
It is also a safe place to work for the really exceptional engineers. They can talk freely about complex computer science, without getting blank stares or having to dumb it down. Otherwise it gets frustrating fast.
My understanding is that different IQ tests will rely on different kinds of questions (i.e. not all involve visual shapes; some involve word/logic puzzles). The point is that the scores for all IQ tests correlate very highly (and thus suggest that there exists some common factor).
There are also divergent aptitude tests, which measure the quantity and diversity of answers to a given question, as opposed to the single desired answer. Divergent tests are rarely administered, but they are a stronger measure of creativity and problem solving, which is typically what people actually mean when they say intelligence. Convergent testing is preferred over divergent testing because it is easier to measure, and those simplified measures are easier to compare.
How much do such tests correlate with things like (for example) creative achievement?
> Testing or measuring procedures cannot be determinative in employment decisions unless they have some connection to the job.
IQ tests are not directly related to the job, and so are illegal according to that ruling. Coding tests are directly related, which is why they get a pass.
Do you scan source code and draw firm conclusions about what it does based on skim reading the first comment you see?
Perhaps my old contracts prof could have a second career as a google interviewer. (He was notorious for cold calling people that hadn’t briefed their cases and eating them alive.)
I think the most likely explanation is that the "elite" of CS grads usually do competitions like ACM ICPC and olympiads, which are full of problems like those watered-down interview questions. There is no real authority on what makes a developer good, but those programming competitions had a measurable outcome that brought high status and pride to winners and participants, so the format was simply taken from there and dumbed down to fit into interviews. Some top colleges even have special prep courses for those competitions, and compared to them, FANG interviews are super trivial.
Spot on. Certain companies ask silly questions, have candidates perform tedious exercises, and, in the debrief, check whether the candidate complained.
(disclaimer: I worked for Amazon and never saw this pattern in the company)
some coders that don't get too critical about their job
As someone who works there, I've noticed this general trend of animosity towards Google. It seems to be fueled by a sort of inferiority complex people feel when they don't clear an interview, or when they suspect they wouldn't be able to crack it if they ever gave it a shot. This leads to overcompensation in the other direction, a sort of "sour grapes" narrative where there is an effort to downplay the prospects of working at Google, or to ridicule the people working there like you did just now.
With banks, for example, they are at least honest about how they earn from customers like me, and I don't have any unreasonable moral expectations.
There is nothing more annoying than discussing with a Googler that tries to convince you that you should apply and work for Google.
I think this is a key thing to learn about oneself. Very large companies often have a lot of cachet and can provide opportunities that small companies cannot (scale is scale, after all).
But small companies (not even start-ups, just companies with < 100 employees) can provide different kinds of opportunities:
* Interaction with different parts of the business
* Opportunity to wear multiple hats
* Less likely to be in the Bay Area
* Nowhere to hide incompetence
Of course this isn't every small company, but I have worked in a few that were like this.
Some of the most talented people I've ever met have failed Google interviews. As in far, far more talented than the majority of folks I know who do work at google. No amount of bullshitting is going to convince me that Google's False Negative interviewing system is the correct one. It simply bypasses too many talented candidates even beyond the reasons generally mentioned above.
One guy in particular I know is now a CEO of a company which, ironically enough, employs several former googlers. Yes, I know Google doesn't give a shit they missed on this individual. But to mis-characterize the anti-Google-interview narrative as "sour grapes" sounds a bit like you're drinking their champagne.
You just can't test for some qualities like focused, persistent problem solving and the desire to find ways to speed up your work.
Well, if that was the goal, boy did we ever fail to deliver on that one ;-)
What happened was that the interview questions got leaked and accumulated (an inevitable result of a heavily indexed, centralized internet), and it became a race to the bottom for candidates.
Imho the test may have worked in years 1-4 of Google, but it no longer flies since college kids spend an eternity studying the questions at home. It's like the SATs all over again for these kids.
I still think these tests are generally good for testing how good somebody is. If you somehow got through CS without even vaguely knowing how to 3-color a map (not talking about a perfect answer here, just a vaguely correct intuitive explanation), even after 15 years of work, something isn't right. These sorts of questions definitely will weed out your local web-dev baddy, or even dev-bootcamp baddy, which Silicon Valley is starting to be flooded with now.
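The "vaguely correct intuition" for map coloring really is small. A minimal sketch of the standard greedy approach, assuming the map is given as an adjacency dict (note: greedy guarantees a valid coloring, not a minimal one, so it may use more than three colors on some inputs):

```python
# Greedy graph coloring: give each region the lowest-numbered color
# not already used by any of its colored neighbors.
def greedy_coloring(adjacency):
    colors = {}
    for node in adjacency:
        used = {colors[n] for n in adjacency[node] if n in colors}
        color = 0
        while color in used:
            color += 1
        colors[node] = color
    return colors

# A tiny hypothetical "map": A borders B, C, and D; B and C also border
# each other.
regions = {
    "A": ["B", "C", "D"],
    "B": ["A", "C"],
    "C": ["A", "B"],
    "D": ["A"],
}
coloring = greedy_coloring(regions)
```

On this example the greedy pass happens to use three colors, which is exactly the level of hand-waving intuition the comment is talking about.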
I'm a big fan of not wasting time, which is why I get stuff done at work and have been promoted twice at my current company in the past 3 years. As a sibling commenter suggests, if Google wants employees who blindly do what they're told, then I wouldn't be a good culture fit. I was taught critical thinking skills in school. Respectfully questioning my superiors' plans from time to time has been a valuable skill.
And this is the big thing I've noticed about many Google apps (especially dev tools): they are all incredibly (for want of a better word) hacky. And the documentation... while I can very much appreciate that they are serious about it and work very hard, it's a huge mishmash of marketing speak ("It washes your socks and makes dinner for you!") with the crucial information you need hidden away in secret corners. I literally have to use Google the search engine to navigate any of the documentation.
I don't mean to be so negative. It's not that it's so terrible in reality, it's just that it's not anything particularly great. I literally can't think of a single offering they have that I would aspire towards. There are lots of smaller, hungrier companies making much better products. And by extension, I think if you work at those smaller, hungrier companies you have a better chance of learning more, becoming a better developer and even being happier with your job.
So the thing is, unless I'm alone in my assessment, I think the main things people are looking for from a job at Google are status and money. Additionally, I think a lot of young people believe that talented developers mostly work at famous companies. In my experience, this is the opposite. Back when Microsoft was the powerhouse, I knew a lot of MS developers. Some were amazing. Most were average (as you might expect when a company is hiring thousands upon thousands of developers). However, you had a much better chance of actually working with someone amazing if you got a job at a small shop. I still think this is true.
Personally, I don't complain about the "You have to be this clever to ride" interviews. Yes, they self-select for people who are not like me (bulldogs that work away obsessively at problems until they are solved, without using any particular magic insight). Yes, it means that I'm unlikely to get paid at the very, very top of the payscale (heck, if I wanted to get paid more, I would have been a lawyer -- I want to write code). It just means that it's that much easier for me to find companies that are a good fit for me. I don't really see a problem with that.
Working for Google is the only way to get bugs in Google products fixed, or your feedback even listened to. ;)
It was funny because in my frustration, I left a comment on the last page I looked at and they responded with the completely reasonable question, "Why were you looking at that page? It doesn't contain any of the information you were looking for." It was such a great summation of my problem that I had difficulty finding an appropriate response :-). Possibly this is too unfair, but I felt like the question, "How should I have found the information I was looking for?" was something that had never occurred to them. It's pretty applicable to virtually everything I've used from Google. It has that feel of, "If you don't already know, then you don't deserve to know".
With all the Android privacy issues researchers keep uncovering, I feel like Google has directly driven away the fraction of the labor pool which cares about protecting user privacy.
> they are all incredibly (for want of a better word) hacky.
Yeah, I haven't seen much of anything to feel inspired by either (although I do think the Google Maps team has put out a really solid product). But I have to deal with the Android SDK and other Google libraries for Android, and "hacky" seems like the best single-word description for that stuff, IMHO, too.
What makes me laugh about the Android documentation is that even though Google's mission is to organize the world's information, the best Android documentation and guidance I can find is organized by Stack Overflow.
But I think it's more than that. You know how certain things are obviously built by committees? You can see it because everything is consistent, but it's often got a lot of compromises. On the other hand some things are obviously built by individual contributors. It's got some things that are really great, but other things that are horrible because the developer just has blinders on. Many Google apps give me the latter experience. Also, when they have a suite of tools, although they are tied in together, they have wildly different UI, naming conventions, placement and layout of information, etc. Usually there are some really cool bits, but these bits are often not the point of the project and look a bit out of place.
I think teams in Google often have people who are confident and smart and the tools reflect that. It's a kind of "This is the way to do X", shouted 500 different ways. One of the biggest things I find frustrating is the mountain of trivia that I have to commit to memory in order to use their tools fluently. I really do think it reflects the type of people that Google chooses when they hire people. While they may be good at the things that Google screens them for, they may not be the best people overall when it comes to building finished products.
On the phone interview I was asked how to do something I had no idea how to do. I have no idea how I passed the phone interview.
Seems like a mixed bag. The question, as far as I know, is completely up to the interviewer, which makes the interview very subjective when they don't use a standard question.
Well, the interviews are marginally effective at identifying new CS grads (who also are cheap and have few outside commitments - companies love this!) but not much else. You're exactly right to view it as an algorithms final exam or an algorithm puzzle contest, but Cracking the Coding Interview is an embarrassingly bad book, in spite of (?) its ostensible purpose of helping people study for algorithm puzzle interviews at Google, Facebook, etc..
I like how you made this bold claim with zero justification as to why.
I love this comment.
I would say they probably made the decision like we all do during development work: that fixing this N+1 query, or bubble sort, or API package doesn't have time in the budget.
Getting to market is sometimes more important to managers than optimal `correct` code, when the market isn't willing to pay for the services of `correct`.
Haven’t had a chance to hire any ex-FAANGers yet, do they compare well to the general market?
In short, I think the exact same question is interpreted as a cool algorithm challenge or a recall check, and the interviewer will be fine with you rederiving the answer on the spot if you are quick thinking enough to do so.
They seem to have no lack of talent and it certainly doesn't seem to be negatively affecting them. I wonder if this will change eventually, and Google will become IBM v2.0 (that is, a respected and profitable company which is mostly boring/unexciting).
Once upon a time, skill at doing these sorts of problems might have correlated (imperfectly) with general aptitude as a programmer or software engineer. But the very act of trying to leverage that correlation for hiring purposes probably also made it go away. Now you've got a whole lot of people practicing hard on these sorts of problems, spending huge chunks of their free time grinding away on Project Euler and Advent of Code and HackerRank. That muddies the quality of this stuff as a proxy for what it was originally trying to detect: natural aptitude. I'm guessing having time to level grind like that also correlates inversely with other traits that are desirable in a programmer.
In Silicon Valley I have a fair number of colleagues who expect any dev to spend a fair amount of their free time grinding on even more dev.
No surprise that there is a lack of diversity in the profession as a result.
The relentless scepticism about people's achievements is to some extent understandable (we've all run into the senior person who can't do fizzbuzz), but it ties neatly in with the idea that every new hire should be 25 at most.
I, too, in 1988, fresh out of my BS in CP with no family or life experiences and a strong desire to code day and night would have sailed past the technical side of these often ridiculous interviews for jobs I could literally do in my sleep now.
But now, at 53, not only has Father Time fucked with my ability to sight memorize (something I took totally for granted in my younger years) without even trying, I find it almost impossible to hide my frustration at the ridiculousness of the very idea that I would be unable to do the work required of me for the job...
"Ok..we really need someone to fix X and Y on this website, and it would be just great if they could reconfigure Z on the server..."
"Well, sure...I first saw X and Y-like issues back in the mid-90s in a networked client-server environment and I did X(x) and Y(x) to fix it, and again saw it in the mid-00s under the LAMP-stack and again fixed it doing X'(x) and Y'(x)...and the server issue is something I've seen over and over again during my 30y+ career..I am 100% certain I can solve these relatively simple issues for you guys..."
...and of course I don't get a callback. Do people think I'm lying about all my experience on my resume or what exactly?
It's at the level of life and death for me right now, to be honest. I've been shooting out resume after resume for the past month, and nothing good is happening.
All I know for certain is...if the real issue was truly about finding someone who can do the job and fix the problems, there is no way in hell I'd still be looking now.
Bill budget IT guy, "What? We need to harden a server? Why do we need to pay this guy over a hundred thousand a year to do that? The internet is full of documentation. Let's hire someone with just enough technical know how to implement it."
The reality on the other side is that hiring an experienced engineer has its risks. I've worked with engineers with 20+ years of experience who did it because it was a job and didn't deserve their salary based on their skill set. Companies let this happen; you need to cap positions and then give inflation-based raises.
Totally agree and have seen it from the interviewee side. I interviewed in March last year as an experienced engineer, and my network was the number one source of interviews and how I found my current job.
Don't neglect your network, folks. As you age, it will become ever more important for your next position.
When I graduated (over two decades ago, and in Belgium), testing was a regular part of the interviewing process. However, it was understood that this was only done for people with no or very little experience. Later in my career, this fact helped me weed out the jobs which were below my experience level. After stating such and withdrawing candidature for the position, in several cases I was even contacted to come in for a more senior position.
The world has certainly changed.
I did end up finding something I'm enjoying.
Anyway, just wanted to send you some support. Keep your head up and keep going. You really only need one offer.
Lately, I decline to interview with any company that requires new-grad coding tests for experienced people (especially if they require it even when the person has open source code and community participation that the company can look at). I usually do well, but even then, it leaves a bad taste.
Of course, I'm very happy to talk each other's ears off in energetic collegial discussions about engineering problems and technologies, including whiteboard brainstorming of approaches/algorithms, perhaps much like would be a part of everyday work. If anyone ever then interrupted, "Hold on, can you put in all the semicolons, so I can type it in, and make sure you know how to code," you might wonder how that's not already obvious to them, and where they're coming from.
This aversion to "coding tests" for experienced people seems to be more acceptable to small companies/startups (or small autonomous units in large orgs), than it is to less-flexible/agile large companies. Recently, after discussing my latest background with a nice FAANG recruiter, we had a good discussion about the company's practice of putting experienced people through what seemed like a new-grad vetting/hazing process, and why that's been a turn-off. They soon sent a followup email, including a quote from an engineer there saying "... I need to know whether you can code in a language," along with attachments on how to prep for their new-grad coding tests. :) For whatever reason the company insists on that process, it seemed like it probably wasn't on track to a professional relationship that I'd want.
I made the mistake of hiring someone quite senior through internal transfer once who was utterly and flagrantly unable to code despite the job saying this was a requirement - I probably could have caught this with a simple fizzbuzz, but felt like this would have been too insulting.
At at least some of the FAANGs it's also a pretty clear indication of the fact that you're going to get busted way down and work your way up. I was a Principal Engineer at Intel with a successful exit in a highly technical area, but I would be shocked if I wouldn't have to re-earn my stripes at most other companies. Most of the super-smart guys I know who went to Google, for example, got busted way down and quickly earned their way up.
So if we're not up to grinding and expect to go back in at a high level, maybe it's kinder to warn us off early. :-)
Edit: By "over 50," I mean age demographics in general, which is the main thing that raises your family obligations. The only demographic division that I can think of that would reduce someone's willingness to abandon their personal life would be age.
This is like saying: "I have a time-intense hobby that I prefer over working too much for you.".
I think the maybe reasonable counter-argument against your sentence would be that while you personally do not have to value something like raising kids above your work, others very well might and do and there is nothing absurd or weird about that. Your framing betrays a certain kind of worldview where “work” is real and everything else is a mere hobby, a worldview that might be valid for certain people but is certainly not universal.
I don't think work per se is everything. But working on world-changing things is.
At least concerning the situation in Germany, where there is compulsory school with a compulsory curriculum (vulgo: 18-19 years of brainwashing), I am spending a lot of time in the evenings working on an alternative curriculum (currently focusing on computer science topics, since this is something I am hopefully knowledgeable about) that enables people to deprogram themselves from this kind of brainwashing and develop their intellectual potential, so that they can begin to work on world-changing things. A lot of highly gifted people have already asked me multiple times when they can finally read the first text of my planned series (they are really eager for it) - unluckily there is still so much to do even on the first text.
So the first thing that we should solve is to prepare a curriculum that does not completely brainwash our children. Only when this problem is solved, we can begin to think about how we can set aside some time for them.
You work to live, not the other way around.
> You work to live, not the other way around.
At least if you work in (academic) research, working to live simply does not work [pun intended]. The things that you work on in this area are the really important things in life.
Indeed - and that is why you should be very cautious to give birth to children; in particular if you have career plans.
This false dichotomy cuts to the heart of a lot of gender and racial diversity issues that plague the workplace in the US.
The sooner tech companies learn to embrace and work with this very basic fact, the better.
"Sure, I'll consider abandoning all priorities (i.e. social life) and/or putting off others (kids) -- and maybe even take minor risks with my health -- if you're able to provide appropriate compensation."
Guess what, though? In the vast majority of cases, even the bulge-bracket FAANG salaries occasionally gloated about in these and other parts simply don't come anywhere close to providing that level of compensation.
All their pretensions to the contrary notwithstanding.
It's a lot more socially acceptable for members of one sex than it is for members of the other to be too busy with career stuff to spare much time for their families.
On a somewhat related note, single parents simply aren't going to have that kind of time.
If you come from an underrepresented demographic or have interests or social preferences outside the mainstream of the profession, it will be harder to fulfill your need for social affiliation by participating in communities like HackerRank in your free time-- because it may be harder to find people who share your values, and that you can identify with on a social level.
In other words: the social component of "grinding" is more fulfilling for some demographics than others.
I can think of a few others tied to national work culture.
If I'm a FAANG, I'm simply not using my normal interview process to hire for the really interesting jobs. I reserve those ones for people who got the job by virtue of their publication history in the academic literature, or because they built some well-known cool thing, or because they got promoted into the position. Those people get shunted over into the "you didn't come to us, we came to you" interview process.
The seats I'm looking to fill with the more public interview process are mostly seats for the grunt coders who work under those people. My ideal candidate for that position isn't some rock star creative genius; it's a workaholic who is resistant to boredom. And what's something a workaholic who's resistant to boredom would be really good at? Grinding away on programming interview questions, of course.
Teach a bunch of people to think in a certain way, speak a certain language and respect authority. Someone who excels at the repetitive mundanity of business school will be a perfect junior marketing manager at BigCo.
It's basically taking the way the Army trains new recruits and applying it to white collar jobs.
'Business schools turn out well trained, amoral yet obedient clerks.'
Can likely say the same about most CS programs. I know you can say that about engineering.
What, like homebrew? :)
Context for everybody else: https://twitter.com/mxcl/status/608682016205344768?lang=en
TL;DR: author of Homebrew interviews at Google, but doesn't get hired because he couldn't/wouldn't invert a binary tree on a whiteboard.
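For reference, the infamous problem is genuinely tiny: "inverting" (mirroring) a binary tree just swaps the left and right children at every node. A minimal Python sketch, with a made-up `Node` class for illustration:

```python
# A bare-bones binary tree node (hypothetical; any left/right struct works).
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert(node):
    """Mirror the tree in place: swap children recursively."""
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node

#     1              1
#    / \   becomes  / \
#   2   3          3   2
root = invert(Node(1, Node(2), Node(3)))
```

Which, of course, is exactly the point of the tweet: trivially look-up-able, and a strange thing to gatekeep on.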
I am not sure what to think about this claim: it is the other company that prevents you from working for them through this interview process. The current employer has an incentive for you not to leave, while the other (potential) employer has an incentive to poach you.
This explains this phenomenon plausibly for the FAANG companies. But I think there also exist lots of startups that could easily act as a cartel breaker - to their advantage, since this way they can poach from other companies.
I don't know if I can ever be one of those 'built cool things/get published' guys, so I guess that means I'm destined for grunt work. Fk.
I am not sure I follow. For people who want to, there is a section where they can find answers even if they have not solved the problems. What was 'wowzers' about HackerRank before you learned that?
HackerRank also used to organize contests where participants had a set amount of time to tackle a number of new problems (5-10 problems in 1 hour to 1 week, depending on the contest), meaning you had no access to solutions and were competing against other people solving the same problems. They have a rating for performance on those as well.
It’s really a travesty that we can’t teach it the same way that we teach maths or natural languages.
The end result is we’re left trying to divine whether someone is the programming equivalent of being illiterate. As with illiteracy people find ways to fake it.
As an aside, as someone who also studied linguistics (and a couple of foreign languages at a beginner level), I am very confident in saying that our approach to teaching natural languages relies almost entirely on natural aptitude. It is just that, absent a serious mental disorder, all humans have a very large natural aptitude for natural languages.
Probably because we are absolutely terrible at teaching it.
I assume that's not what you meant by teaching natural languages, but the topic of language acquisition (including in adults) does come up; plus it gives some perspective on what teaching language would look like if we didn't rely on natural aptitude.
If a developer-to-be doesn't understand the framing context of what they are doing, it's like being dropped in a lake with no sense of direction.
It's why all the "naturals" started as geeks who played with computers from a young age. You learned about the environment you would end up working in, and later on, when you hit the grindstone and actually started creating gears to stick on that machine, you had an idea of what the result should look like and knew the tools in the shop when you set out to start building it. Even if you didn't know the steps involved in the process, you were familiar with the environment.
People who haven't spent time engrossed in computers drop out so fast: the myriads of youth entering a CS 101 class thinking it's an easy career, whose greatest exposure to tech is maybe updating their phone, using apps for Facebook and Twitter, and perhaps owning a video game console with no tinkerability, a total black box. Their professors lead them to an anvil and tell them to forge a steel rod without any wink of an idea what a hammer is.
It's just not an answer anyone wants to hear, because the solution is to have what amounts to an entire degree's worth of learning as a prerequisite to the actual study of programming. But you wouldn't want your brain surgeon to go to medical school having never studied high school biology, or, more generally, having never learned to read.
I have some ideas about where to go from here but it's not going to be easy. The search for real meaning and really meaningful relationships is ongoing.
I still think my parents should not have gone so Amish, but I also don't think I would have developed as a programmer without that.
The personal computer grew up alongside us xennials, and some of us were just drawn to it, even without the promise of video games.
In my case it was a natural progression: "videogames are great!" -> "I have an idea for even better videogame!" -> "how do I make one?" -> "can I tweak this one into being a bit more the way I like it?" -> tinkering around data files -> "I really want to make my own game" -> picking up a programming book at 13 -> a programming career.
edit: now if you'll excuse me, I need to do some dynamic programming problems.
Ultimately, it's just studying for the test, very much like the ACT/SAT in high school. You can be great at taking tests but ultimately a terrible student, or vice versa.
On the other hand, I genuinely do feel theory is really important. While not knowing the minutiae of Timsort doesn't indicate that you'll be a bad engineer, not knowing the runtime efficiency of a sort can lead to some really awful code. Not knowing when to use a hash table instead of a nested for loop can be a sign that you don't really know what you're doing, and not knowing some rough theory on concurrency indicates that I might be stuck debugging your race conditions or deadlocks.
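The hash-table-vs-nested-loop point in miniature, using a made-up example (finding values common to two lists): the nested loop compares every pair, while hashing one side first turns quadratic work into linear work.

```python
def common_nested(xs, ys):
    # O(n*m): for each x, scan all of ys looking for a match.
    return [x for x in xs if any(x == y for y in ys)]

def common_hashed(xs, ys):
    # O(n + m): one pass to build the set, then O(1) average-case
    # membership tests per element of xs.
    seen = set(ys)
    return [x for x in xs if x in seen]
```

Both return the same answer; the difference only shows up as the inputs grow, which is exactly the kind of judgment the interview question is (in theory) probing for.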
I try not to be a complete jerk: I won't do things like hand out an NP-complete problem (which an interviewer once gave me), nor will I ask for intimate details of how one would implement CSP. I do tend to focus on theory-heavy questions more than my peers, but I give a fairly generous amount of hints so that people don't get too stuck.
Theory is important but what's more important is how someone applies the theory to actual problem solving.
Reversing a binary tree isn't testing your knowledge of theory as much as it's really just testing memorization of a very very specific application. Knowing how to reverse a binary tree or how a hash map works is pointless if the person can't identify when to use them when solving an actual higher level problem. No one is ever given a binary tree and told to reverse it in the real world, they are given a business problem that you identify can be solved efficiently by modelling it as a binary tree and reversing it.
I'd bet most of the good people who fail the "reverse a binary tree" type of question would succeed if you gave them a realistic problem to solve without forcing a very specific solution onto them. Either they will come to the realization that the solution is to think of the problem in terms of a binary tree, or they won't.
And neither outcome is a terribly bad answer. If they recognize you gave them a problem that can be represented as a binary tree and efficiently solved by reversing it, then there you go: not only did they know the "theory," they knew how to apply it as well. If they don't recognize it, you gain valuable insight into their line of thinking, and they might find novel ways to represent the problem and apply other theoretical concepts that solve it efficiently (maybe more, maybe less).
I learned later that hashmap lookups can degrade to O(n) (or O(log n) with tree-based buckets) in the worst case, or else they take a lot of memory; the internal arrays can get super huge if you're not careful.
In this particular case, an ugly nested for-loop was the immediate solution, and eventually I was able to cheat a little, hard-code integer indexes, and use an array instead.
Anyway, your point is valid, I just figured I'd give an example where knowing the internals for a hash table would have saved me a lot of time.
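The memory trade-off described above is easy to see in CPython. A rough illustration (exact numbers vary by Python version, and `sys.getsizeof` measures only the container itself, not the items it holds):

```python
import sys

# A dict keeps its internal table sparse (roughly a third of the slots
# stay empty) so lookups stay O(1); that sparseness plus per-entry
# bookkeeping costs memory compared to a dense, integer-indexed list.
n = 100_000
as_list = list(range(n))           # dense array of the same values
as_dict = dict.fromkeys(range(n))  # hash table keyed by the same ints

print(sys.getsizeof(as_list))  # container overhead only
print(sys.getsizeof(as_dict))  # typically several times larger
```

Which is exactly why switching to hard-coded integer indexes and a plain array can be a win when the key space is small and dense.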
OK, but why don’t these interviewers ever ask “internals” questions? How does a CPU work, what are the different levels of the memory hierarchy, what is an interrupt request, what is pipelining, what is SIMD, what is a GPU, etc.? Or how about compiler internals: what are the different stages in a compiler, what are the different grammars and parsers, what is an AST, what is interpretation, compilation, bytecode, JIT, and how does it work? How about database internals? How is a database implemented, what is relational algebra, what is normalization, what is a data model, how do indices work? Similar questions can be asked about operating system internals, networking, floating point arithmetic, etc.

In my experience, no interviewer has ever asked me these questions. And as an interviewer, I have consistently been disappointed with candidates’ inability to answer basic and relevant “internals” questions. Ironically, the algorithms and data structures questions that are commonly asked in interviews reflect outdated ways of thinking that do not take the underlying hardware reality (memory hierarchy and parallelism) into account.

It could be that a lot of these FANG(-like) “engineers” are themselves not knowledgeable enough to ask such questions. Or they are aware that new comp sci graduates are unlikely to really understand anything at a deep level, but that all comp sci students are drilled on algorithms and data structures. If you are targeting a very specific experience band (0-5 years) and want a standardized test, then I guess asking CS201 exam questions in a job interview makes sense. Companies prefer the 0-5-year segment because those candidates are cheap, don’t have children, are easy to exploit, and are easier to “mold” into the corporate culture.
Because not everyone writing Java code has had a background that exposes them to compiler internals or grammars and parsers.
But everyone writing Java code should at least be a little bit arsed to figure out what common data structures do - or how to write two for loops that print out "1 2 fizz 4 buzz"...
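For reference, one common way to write the screen being alluded to (a hypothetical sketch; the classic version uses a single loop rather than two):

```python
def fizzbuzz(i):
    # Standard FizzBuzz: multiples of 3 -> "fizz", of 5 -> "buzz",
    # of both -> "fizzbuzz", everything else -> the number itself.
    if i % 15 == 0:
        return "fizzbuzz"
    if i % 3 == 0:
        return "fizz"
    if i % 5 == 0:
        return "buzz"
    return str(i)

print(" ".join(fizzbuzz(i) for i in range(1, 6)))  # → 1 2 fizz 4 buzz
```

The point of the question isn't cleverness; it's that someone who writes code daily should produce something like this without hesitation.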
If I were hiring a carpenter, I would want to know they understand their tools and when to use them. I don't care if they don't know how to make them. But of course, knowing how to make them suggests an intimate appreciation for the craft (looking your way, Matthias Wandel).
My beef with this in interviewing is that a huge chunk of modern programming work is developing UIs - and user interface theory and skills are treated like some softball thing. You ask UI-related questions to interviewers and half the time you get some shrug "oh we use whatever", etc.
I can't say that theory is more universal than frontend, though anecdotally, for me, it is. But UI isn't universal.
On the flip side, the O(1) look-up nature of hash tables makes them the no-brainer data structure to use, at least for passing programming interviews. Perhaps more interesting test questions would be about when not to use hash tables.
Although these days one should talk about "learned index structures" instead, I guess :)
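One concrete "when not to use a hash table" case is range queries: a hash table only answers exact-match lookups, while a sorted array plus binary search answers "everything between lo and hi" directly. A small illustrative sketch (names are hypothetical):

```python
import bisect

# Hash tables scatter keys deliberately, so "all values in [lo, hi]"
# would require scanning every key. A sorted array finds the two
# boundaries in O(log n) and slices out the answer.
timestamps = sorted([17, 3, 42, 8, 25, 31])

def in_range(xs, lo, hi):
    left = bisect.bisect_left(xs, lo)    # first index with xs[i] >= lo
    right = bisect.bisect_right(xs, hi)  # first index with xs[i] > hi
    return xs[left:right]

print(in_range(timestamps, 8, 31))  # → [8, 17, 25, 31]
```

Other examples in the same spirit: very small collections (a linear scan over a handful of items beats hashing overhead) and keys that need ordered iteration.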
That sort of thing might give a better sense of whether someone has an instinct for how to verify that the code is working. And, by extension, whether they have a grasp of some concepts that go a long way toward helping a person come up with more reliable and maintainable designs.
How do you know?
I understand that FANG have themselves come to the conclusion that brain teasers are not necessarily very predictive for engineering performance.
But CS/programming questions for CS/programming roles? That seems sensible.
You can study lots of vocabulary to achieve better results on the verbal section of the GRE or similar tests. But afterwards, you will, in fact, have better vocabulary, I submit.
I am surprised that you categorically deny that they were better engineers after studying CS/programming questions.
Yeah, but tests like the GRE claim to predict academic performance, not vocabulary. So unless you think augmenting one's vocabulary alone will make one a substantially better student, there's still a gap there.
Maybe for a few days after the test anyways. And then...you push out those words you never use or read to make room for actual useful stuff.
There’s interview cake and leetcode, but I think people would pay $2k for a class that focuses on the questions and in person whiteboard practice.
They could collect information about the interviews at the major companies and then use that to create the program. For payment, they could also help candidates negotiate and then take a cut of the signing bonus.
If this isn’t part of lambda school already I think it should be.
I already work at a competitive tech company, but I’d sign up for this in a second to help me stay competitive for interviews.
It’s part of classes for our existing students, but it isn’t a single offering. One day.
It's funny that after doing some searching, many proponents of this widely adopted and poorly researched interviewing methodology appear to be running businesses selling training for these interview processes... surprise, surprise.
Worse, there's this huge emphasis on "big O" with zero focus on clearly egregious bad practices (tons of copies, outrageous memory usage, casting between strings and numbers all the time). I've rarely seen clearly bad algorithms be deployed but I have seen plenty of unperformant code go out.
Probably the biggest influence on this, I think, is the culture: if nobody tells you that your code is shit (in a kind way), you'll never learn better. But then again, some people are so hard-headed and full of themselves that they get defensive and never actually admit their faults. Giving and receiving feedback is not easy, though; it can give you an identity crisis once you realize you've been doing something wrong for years.
And then you have a bunch of Ivy League graduates who have spent years learning algorithms and are burning to use them, but there are no actual problems that really need them.
No matter your background, what school you went to, or what randomized experience you got in previous jobs, every person has equal opportunity to study and practice the same algorithms on their own (as opposed to being lucky enough to be able to afford a top-tier $$$ CS education, or to being lucky enough to have the connections or chance to get certain previous jobs).
And thus, when applying to jobs, it becomes something more akin to a raw-ability IQ test, which you can argue is "fairer", especially when management realistically knows developers might be shuffled around all the time, and that the extensive SQL experience they were hired for will mean nothing when project requirements switch to a basic key-value store.
On the other hand, if you are interviewing for a highly specialized position that is fairly certain not to undergo change, then it makes sense that specialized experience could rightly count for far more than any kind of generalized intelligence or ability.
Well, that's just not the case: there are many groups of people who lack the opportunity to study and practice. A couple of examples: people with kids, people working 12-hour shifts, people without access to teaching materials, people without a sufficiently advanced machine to run dev environments, etc.
Testing whether someone is willing to prepare for a thing is a relevant work-skill test too.
Is it, though? I could understand if algo questions had some relation to the work you're doing, but your comment on why they're good is independent of the actual material. If we replaced the algorithm question interview with an interview testing obscure presidential facts, would it really be a useful test to have devs take? I guess it is a relevant work skill test in that you get people willing to put the work in/game the system, but it doesn't seem to be much of (if at all) an improvement from ad hoc conversations.
Handled correctly, a problem like this answers several questions:
1) Can you correctly break down a problem like this into its component parts?
2) Can you recognize the overall class of problems that this falls into?
3) Can you transform this specific problem into the more general class so that you can solve it in a known fashion?
4) Can you think about and implement the movement?
5) Can you communicate while you're doing the above?
No one cares about solving that particular problem. But the answers to the above really are relevant. Being able to map novel problems onto known solutions is absolutely a skill that any competent software engineer needs to have. "Oh, you want me to do X? That looks a lot like Y, this thing we've already solved; maybe I can just implement it in the same fashion (or re-use our existing system!)"
I'm not saying that this particular problem is a wonderful example, or that I'd use it in my own interviews. But this overall class of problems really does have a place in interviewing when it's handled well by the interviewers, and arguments against it on the basis of the specific problem being irrelevant are really rather missing the point.
I don't do many interviews anymore, but I used to, and I had no short supply of problems that I actually had to solve in the course of my work that I could ask about. I don't think whiteboard interviewing is great in general, but if you're going to do it you can at least try to keep it relevant.
As a matter of fact, I did have to do memoization of graph traversals at least once for work (most programmers never have to do this), and I find that problem a lot more interesting (trait matching for Rust). I could easily give a talk about that problem. As for that interview question, though? I don't always do well with time pressure, and so I can't guarantee I'd be able to answer it to your satisfaction.
I've done the "ask a real problem that I've solved before" thing, and I find that it usually gets hung up on details and context around the problem to the detriment of actually solving the problem at hand. That's not to say that such a conversation isn't itself very valuable, but that's a different interview conversation, at least on my team. We find it important to maintain some level of focus just to ensure that we're covering the various signals that we'd like to get from the candidate.
My view is that Google's interview process seems to work because Google gets so many applicants that they can afford to randomly reject most of them. It's not because it's a good process.
So it isn't fair to ask the candidate to solve a real bug or implement a real feature in only 35 minutes unless they've seen something similar before.
This is why big companies like Google are limited to whiteboard interviews: they need an interview process efficient enough to properly vet and filter the >1 million applications Google receives each year.
Personally, I think a better interview process is a pair-programming or work audition for a day. But that is not even close to matching the scale of Google.
Let's say out of a million applications, maybe only 25% are qualified. That is still almost 1000 candidates to interview per day (number of U.S. business days in 2019 is 261; 250,000 candidates / 261 business days = ~957 candidates per business day). Pair-programming or full day work audition will not be able to accommodate 957 candidates every day.
I find that hard to believe.
Moreover, memoized graph traversals don't get you full credit on this question. There's a dynamic programming solution, and in fact the ideal solution is one using matrix math, which is ludicrously divorced from anything most programmers would ever see.
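For readers unfamiliar with the matrix-math trick: counting length-n paths under a fixed transition rule can be phrased as raising a transition matrix to the n-th power, which exponentiation by squaring does in O(log n) multiplications instead of the DP's O(n) steps. A sketch using the Fibonacci recurrence as a stand-in (the actual interview question isn't reproduced here):

```python
def mat_mul(a, b):
    # Plain 2x2-friendly matrix multiply over lists of lists.
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def mat_pow(m, n):
    # Exponentiation by squaring: O(log n) multiplications.
    result = [[1, 0], [0, 1]]  # 2x2 identity
    while n:
        if n & 1:
            result = mat_mul(result, m)
        m = mat_mul(m, m)
        n >>= 1
    return result

def fib(n):
    # The linear DP: O(n) additions.
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

def fib_matrix(n):
    # Same recurrence expressed as [[1,1],[1,0]]^n; F(n) sits at [0][1].
    return mat_pow([[1, 1], [1, 0]], n)[0][1]

print(fib(10), fib_matrix(10))  # → 55 55
```

Which illustrates the complaint: the O(n) DP is a reasonable thing to derive under pressure, while spotting the matrix formulation is the kind of leap few working programmers ever need to make.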
But it will also filter out everyone that is opinionated enough to not do that stupid preparation work, and you will end up with sheep coders that will always follow the rules.
Looking at Google, Facebook etc, this might already be the case. I will even go so far to say that they prefer those type of obedient coders than the ones that ask too many questions and get too creative.
Reviewing algorithms for interview prep at least has some relevance to programming. While a candidate may not use that exact algorithm in their day to day job, they are creating ad hoc algorithms all day long. With that said, I don't think time pressure, white boarding algorithms is a very good job performance predictor.
(The counter to that is that if people can relatively-easily (single-digit days) cram for your interview, you're still not going to be effectively screening for at-hand pre-existing familiarity/knowledge.)
If I require all prospective engineering hires to prove explicitly that they have an IQ of at least 135, I will get sued. Do you think this is false?
To turn this around, can you point at the law that makes IQ a protected class or whatever you're claiming it is? I'm not going to have much luck proving a negative - Russell's teapot and all that.
Managing complexity is a valuable skill that should also be screened for at interview. At most top companies you are there for at least 4-5 hours so there should be plenty of time to evaluate that skill.
I think whiteboard questions are good, I want to know that this person is capable of writing difficult code if we need them to.
I also think we probably ask too many of them.
I have been on the hiring side, and my experience so far has been that the feedback is almost always close to identical across multiple whiteboard questions. The questions are also so abstract that asking multiple to "prevent bias" seems ineffectual. What bias could there be? You either solve the problem or you don't. Bias is more likely to come in on the behavioral interviews. There should be multiple of those, for sure.
Generally someone is either a good enough coder or not and they will display that consistently across all the interviews. You will see the same stuff throughout (good or bad variable naming, good or bad communication etc) thus asking > 1 coding question by default is a waste of everyone's time.
It's not a false dichotomy because I am not suggesting that both skills are mutually exclusive. I'm highlighting some personal experiences where I've seen one skill is vastly overvalued compared to another.
>Managing complexity is a valuable skill that should also be screened for at interview
I've never been part of an interview, on either side, where I've seen testing for managing software complexity. Good interfaces, function design, side effects, state management, etc, are all second class citizens to finding the appropriate algorithm to solve the interview question. I've seen it over and over. The only time I've been close to a complexity management question was on a systems design question, but even then, it was only a very high-level systems discussion. I don't think most places know how to screen for it.
I've worked for two non-technical companies as a software developer and one highly technical company, and interviewed at a few Silicon Valley companies.
The difference between the interview processes is staggering; my current job's interview was two hours of conversation, no code tests, just a general assessment of "do you know what you're doing" by the hiring manager and a couple other members of the team. The highly technical company had a code assessment then the in-person interviews had zero coding.
The SV companies must have a good reason for this, but golly the amount of coding in those interviews is nuts. I'm a process over code speed kind of coder, and I've failed every SV-level test because of it; my code comes from talking to non-technical users like medical researchers and study operations managers and tossing something together in Python or a cloud service that makes their lives easier. Needless to say, I don't go over algorithm fundamentals on a regular basis, and I generally fall out after the first or second interview.
It's especially odd that interviews are so intensely focused on those couple hours since I personally don't see any dev or any resource for that matter contributing in any meaningful way in so fast a time, or even within 90 days. I'm not sure how this problem could be solved with the limited time companies can dedicate to interviews, though; maybe rely more on portfolios?
Unfortunately, people in general rarely try to first understand what the candidate offers. It's more often ONLY about whether candidates understand the exact way the company uses certain technology.
The cherry-picked successes thing is certainly a problem, though, and not just for coding. Maybe it's the candidates I've asked it but when asked "tell me about when you made a mistake in a project" they tend to answer with a strength and try to re-frame it as a weakness. "Oh, I worked too hard on this project and it made me tired" isn't a weakness. "I worked too hard on this project and that made me neglect business requirements since I was too myopic to notice" is a weakness.
Sorry if that's a tangent, it's been a pet peeve of mine since I started interviewing that not many people are humble enough or have thought enough about what their weaknesses actually are, and how that affects the success of their work.
In fact, no, nobody has a good reason for it.
The reality is that these are extremely desirable positions with a staggering number of applicants who _do not know how to code_. Not as in “I can’t solve a dynamic programming problem without studying up on it”, but as in literally don’t know what a for loop is.
The process is far from perfect and the frustration is understandable, but it works well enough as a filter from these companies’ point of view.
A lot of more traditional software development roles do not have much testing. Sometimes they will have a quick online timed test or a simple question on an initial phone screen. That isn't to say no traditional company tests at all, but it's definitely the startup and big-tech worlds that have the majority of it.
I said I was interested in interviewing but that I would only agree to a process that evaluates me based on my previous work history, and not any onsite or takehome coding projects, system design questions or whiteboard coding questions.
The recruiter said she would run it by the manager, but thought it would not be possible, and a few days later I got a rejection email.
Let’s take their word for it that they are desperate for a machine learning engineer. Then it suggests they care more about mandating workspace conditions (since financial cost even to provide thousands of workers with private offices in dense urban areas is not a realistic excuse not to do it) or trivia during interviews than about business needs.
“The market can stay irrational longer than you can stay solvent,” seems apt for this.
The conditional probability you are hopelessly lacking software skills to do a job given that you nonetheless passed a TripleByte exam or something is quite high. Overfitting & memorization for the sake of the test is extremely common.
But it’s much, much harder to fake competence when needing to dynamically and verbally explain technical details in a conversational interview about past work experience.
It's far, far easier to fake expertise when you have more context than the person asking questions. Technical interview questions make sure that the question giver has more context than the recipient, ignoring pathological cases.
In fact, conversational interviewing like this has very little to do with any of the domain specifics of the project. The point is to recursively keep probing for deeper technical specifics, so they have to explain at finer and finer technical levels what were the tradeoffs, why exactly were certain decisions made or how were certain problems overcome.
It is precisely the situation when someone did not have to dig into the technical weeds of a project for themselves that they will not be able to fake or fast talk their way through this type of interview.
That is the number one, defining characteristic of this way of interviewing.
But I have. Not in the specific context of a job interview, but I have absolutely convinced technical experts that my level of expertise in a field is above my actual level of expertise. Ironically, if this were a technical interview, I'd fail it, not because I lack the skills to convince technical experts of my non-existent abilities, but because you don't find me trustworthy.
>The point is to recursively keep probing for deeper technical specifics, so they have to explain at finer and finer technical levels what were the tradeoffs, why exactly were certain decisions made or how were certain problems overcome.
But without context, you can't effectively do that. I, the candidate, am in control. I can steer the conversation to avoid areas where I don't have expertise by answering all kinds of things: "investigating that was someone else's responsibility", "well we never tried anything else and our current implementation works well enough that we never needed to", etc. You can't know if those are lies or not. There are all kinds of completely valid non-technical reasons for decisions that may be completely outside of a candidate's control.
You're either forced to completely trust the candidate, or attempt to verify their authenticity during the interview, at which point you quickly venture into the land of bias and subjectivity.
This is simply false. You don’t need context to understand if the breakdown of a technical problem into constituent trade-offs was appropriate or not — by definition that very breakdown into constituent technical details is the context.
> “investigating that was someone else's responsibility"
This just confirms to me that you are not correct in asserting people can just skate by these discussions. If someone tells me something like that, I’ll ask them what did that other person find when they investigated? If you say anything like, “I don’t know; that was their job not mine,” then you’ve lost credibility because you didn’t put that other person’s conclusions through strong skepticism until you were satisfied you knew the details well enough that you could own or support them if you had to. That is exactly the sort of thing that indicates bullshitting.
> “attempt to verify their authenticity during the interview, at which point you quickly venture into the land of bias and subjectivity.”
This is just wrong. You don’t ever “just trust” the candidate; that’s the opposite of this interview style. Further, you never merely “attempt” to verify authenticity: you do verify it, since it does not require context or domain specialty to analyze the reductionist decomposition of any engineering work into primitive constituent tasks and decisions that are universal.
This approach is far less biased or subjective than appraising “how a candidate thinks” while they solve tricky puzzles in a foreign environment with unrealistic time pressure.
I mean, yes you do. Context tells you which decisions are important. You're asking for the context you need to assess someone's technical competency from the person whose technical competency you're attempting to assess. They have every incentive to lie, mislead, or stretch the truth to make the context they give you highlight their skills more than the real context did. And no, you can't verify that.
>If someone tells me something like that, I’ll ask them what did that other person find when they investigated? If you say anything like, “I don’t know; that was their job not mine,” then you’ve lost credibility because you didn’t put that other person’s conclusions through strong skepticism until you were satisfied you knew the details well enough that you could own or support them if you had to. That is exactly the sort of thing that indicates bullshitting.
But now you're punishing someone for organizational things possibly beyond their control. If my job was to build a thing that made use of some blackbox algorithm, and John developed the algorithm, why should I put the algorithm under strong skepticism, perhaps that's how things work in your workplace, but there's no clear reason that mine work the same way. This is just a bias against people whose development practices aren't the same as yours.
>This approach is far less biased or subjective than appraising “how a candidate thinks” while they solve tricky puzzles in a foreign environment with unrealistic time pressure.
Except that, as I've literally just proven, you've failed to verify the authenticity of me because you've wrongly concluded that I'm not authentic. This comment thread exactly demonstrates just how you can fail to identify a good candidate with this method because if you dislike what they're saying, you'll unconsciously convert that to them being less authentic.
So again, either you take the candidate at face value in a situation where they have every incentive to lie to you, or you attempt to verify their authenticity, at which point that analysis is subject to a multitude of biases that have nothing to do with the candidates skill level.
You've just demonstrated this perfectly.
No, you definitely don’t need to come into it knowing about this, and even if you don’t know about this ahead of time, it won’t imply “just trusting” the candidate or being overly subjective. You will ask the candidate to explain why various decisions were important, and not stop at the top line answer but recursively probe into it, breaking it down into concepts and trade-offs that are universal in any kind of applied problem solving.
> “But now you're punishing someone for organizational things possibly beyond their control. If my job was to build a thing that made use of some blackbox algorithm, and John developed the algorithm, why should I put the algorithm under strong skepticism, perhaps that's how things work in your workplace, but there's no clear reason that mine work the same way.”
I’m sorry but this also just isn’t true. If you are describing your contributions to projects and all that keeps happening is you hit walls in your explanation where someome else did the work and you did not review that work at a high level of depth, then you’re just being misleading about your contributions at work.
Your job as an engineer in a company is to solve problems for your stakeholders, whether that means building tooling for other engineers, assisting designers with prototypes, designing algorithms for core product functionality, sales engineering for client stakeholders, etc.
It doesn’t matter how your company is structured, it doesn’t matter how the work was divided up. Your job is to know about the stakeholder problem you are solving, at a deep level, and when you represent your work to other people and you fail to offer technical depth about the trade-offs needed to solve stakeholder problems, that’s a clear mark against you as a candidate.
It’s bewildering to me that anyone would think that the way their current employer organizes assignments should reduce their burden of knowing how to represent their projects in significant technical depth. That is an always-on, never mitigated, constant responsibility for all employees anywhere. You’re not holding anything against someone if they can’t provide that in an interview... no, you’re just uncovering what they’ve lied about or embellished on a resume.
> “Except that, as I've literally just proven, you've failed to verify the authenticity of me because you've wrongly concluded that I'm not authentic.”
I see no such proof at all, and the cheeky rhetoric just makes me feel more entrenched that you are bullshitting hugely in this thread.
> “So again, either you take the candidate at face value in a situation where they have every incentive to lie to you, or you attempt to verify their authenticity, at which point that analysis is subject to a multitude of biases that have nothing to do with the candidates skill level.”
You are doing nothing but gainsaying here. You’ve made no argument that would support any of these strong conclusions, especially not any reason why this interview method faces the false dilemma between either just trusting the candidate or else succumbing to biases.
You are just asserting things, but they do not seem to be connected to or bolstered by any of the other things you’ve written.
Let me lay it out clearly: I asserted that I can, and have, inflated my abilities to people who have technical know how. This was in response to you stating that "I’m sorry but you cannot do what you are claiming."
So to be clear, at this point, one of two things is true:
1. You are wrong
2. I am a liar
To you, it is clear that point 2 is the true one. To most readers, this is not as obvious. Once you have decided that (2) is true and I am a liar, nothing I say can or will convince you otherwise. But you haven't decided that based on anything factual. In fact, (1) is true here. I am not lying. I can, and have, done the things I claim to have done in this case.
I'm using this to demonstrate that your ideas about such a conversational interview don't work, by pointing out that in the conversational interview we are having right now, you've decided, based on a preconception, that what I say cannot be true! I could be the world's most successful conman, but because your preconceptions lead you to believe that your preferred interview process is effective and less biased than your non-preferred one, you won't accept evidence to the contrary.
>I see no such proof at all, and the cheeky rhetoric just makes me feel more entrenched that you are bullshitting hugely in this thread.
Right, and my point is you're wrong and unwilling to accept that. And that is a demonstration of you not being able to effectively figure out whether or not someone is bullshitting from a conversation with them. You've decided that I'm bullshitting because the alternative would require you to do a lot of introspection about how and why you analyze candidates the way you do. So it's easier to just say "you're bullshitting" and then not put in the effort. And that's certainly your prerogative, but it's not at all a good look for your interviewing capabilities.
That you're so prone to cognitive biases that you're willing to completely write off someone's experience because it forces you to rethink something you hold dear is not a selling point of the process you espouse. It demonstrates, like I've said, that the process is prone to cognitive bias and is therefore decidedly not objective.
That is, there are two possibilities:
1. You are wrong, and your refusal to accept that is coloring your perceptions of our interactions in such a way that you are not able to be objective about my experiences and abilities, as I claim.
2. I'm completely making everything I've said up and haven't ever been able to inflate my abilities to anyone. Your person-analytical skills are infallible and you've caught me.
I subscribe to (1); you continue to wrongly believe (2). This is expected — it's why your process isn't as objective as you claim.
>It doesn’t matter how your company is structured, it doesn’t matter how the work was divided up. Your job is to know about the stakeholder problem you are solving, at a deep level, and when you represent your work to other people and you fail to offer technical depth about the trade-offs needed to solve stakeholder problems, that’s a clear mark against you as a candidate.
This is, again, your opinion of how engineering should be done. Not every engineer has had the opportunity to work somewhere that operates that way. Are you going to write off everyone whose experience has been in a PM-led environment because they haven't had the chance to develop using the process you prefer? If so, that's again your prerogative, but you're probably filtering out a bunch of good engineers.
I'm speaking from ~10 years of experience running my team's recruiting at a quant finance firm, where many interview requirements (tests, riddles, hardcore algo trivia, etc.) came down from executive managers, so I got to see a wide range of performance across all of them.
The sum total of all that leads me to believe quite strongly that the best signal-to-noise ratio comes from super careful and tedious resume selection, followed by conversational and behavioral interviews that recursively probe into more specific technical details.
I note that you didn't answer my question. Have you taken the TripleByte exam or not?
Where are you learning DP problems from? I am very bad at those and need a few good references so that it sticks in my memory.
Here's a few for you to try. Some of these are pretty hard, but you should be able to find solution sketches online if you google the contests they are from.
I've solved all of these as well, so if you get really stuck feel free to reply here and I'll try to guide you through them.
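If it helps as a warm-up before the contest problems: the pattern most DP problems share is defining a table indexed by subproblem and filling it bottom-up from the base case. Here's a minimal sketch of that pattern on a classic toy problem (my own example, not one of the linked contest problems) — the minimum number of coins summing to a target:

```python
def min_coins(coins, amount):
    """Fewest coins from `coins` (unlimited supply) summing to `amount`, or -1."""
    INF = float("inf")
    # dp[a] = fewest coins needed to make amount a; base case dp[0] = 0
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for c in coins:
            # Extend the best solution for a - c by one coin, if better
            if c <= a and dp[a - c] + 1 < dp[a]:
                dp[a] = dp[a - c] + 1
    return dp[amount] if dp[amount] != INF else -1

print(min_coins([1, 3, 4], 6))  # → 2 (3 + 3)
```

Once you can state the subproblem ("fewest coins for amount a") and the recurrence ("try each coin as the last one"), the code is mechanical — the hard part of the contest problems is finding that formulation.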