Harder programming questions do a worse job of predicting outcomes (triplebyte.com)
507 points by Harj 33 days ago | 511 comments



Google recruiters call me a lot. I think I'd do a good, if not stellar, job working there. I've passed multiple FAANG interviews and been very successful as a senior developer.

In my email I have an "interview prep packet" from them that essentially tells me to brush up on algorithms and read Cracking the Coding Interview to prepare for their interview process.

I'm fairly happy in my job. If they offered more money or a really interesting project I'd consider working for them. But I'm pretty lazy about redo-ing college algorithms class during my free time at home to go work there, so I probably won't.

There's an opportunity cost with interviews like this, where an M.S. and a long career of getting shit done count for very little, and memorization of undergrad-level topics (things you could look up in two minutes in Knuth if you ever needed them) can make or break an interview.

I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews.


You wouldn't be a good fit for Google. With their algorithmic interviews that require a college-grad level of studying, they filter for people who are ready to follow orders without complaining.

That's who they want to hire at the end of the day: some coders that don't get too critical about their job and do what they are asked to do, even if it is repetitive, stupid and doesn't really make sense (such as re-studying algorithms implementation details for two weeks before an interview when it can be looked up super-easily online).


I was a naval officer in a prior life, and my current manager loves that I get the job done, whatever it is, without complaint.

I'm pretty much the opposite of what you think, so if my desire to study for the algorithm interview is your litmus test for that, kinda proves my point.

Not everyone that would be good for Google has a burning desire to work for Google. Google might want to consider that.


To make it clear, I absolutely hate coding interviews that make candidates lose so much time re-studying. I think some critical thinking is absolutely needed, and way too many engineers lack it (especially those swallowed into FANGs).

What I described in my previous post is a credible explanation that my group of engineer friends came up with for why all the FANG companies pursue those heavy-memorization algorithmic interviews.


To my mind, the more likely explanation is that they would simply get too many false positives if they didn't use the algorithm stuff to filter potential hires. You lose a lot of potentially good hires that way, but the pool you're left with are all of a certain intelligence level. Whereas, if you don't use the algorithm stuff to filter, it's really hard to figure out who is even intelligent enough to do the job.


The article says that making the problems too hard results in a _worse_ selection, and not just more false negatives.


I'm not following, how does memorizing algorithms correlate with intelligence and skill in the engineering discipline of the job?


Not so much intelligence as applied diligence on top of intelligence, is my sense. If you want what they're offering and don't want to jump through their hoops for it, then they don't want you. For those who balk at the hoop-jumping as being beneath them, there are probably additional characteristics that come with that attitude that they're trying to filter out.


Given that many job applicants apparently can't write a Fizz-Buzz implementation, I'd say that being able to implement an algorithm is probably a reasonable way to cull the herd by a hefty margin.

https://blog.codinghorror.com/why-cant-programmers-program/


It's made me angry how many people I've interviewed who can't manage Fizz Buzz with only 2 conditions or at all.


I'm kind of angry that I've never been asked to do it.


I'm curious to find out what people define as fizzbuzz. Do you have a good example of something you've asked in the past?


Print numbers from 1 to 100, except print Fizz for numbers evenly divisible by 3, print Buzz for numbers evenly divisible by 5, and print FizzBuzz for numbers divisible by both 3 and 5.
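For anyone who wants to sanity-check themselves against that spec, here's a minimal Python sketch (the `fizzbuzz` helper name is just for illustration):

```python
def fizzbuzz(n):
    """Return the FizzBuzz string for a single number."""
    if n % 15 == 0:        # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

for i in range(1, 101):
    print(fizzbuzz(i))
```

The classic stumbling block is checking the 3-or-5 cases before the both case, which makes "FizzBuzz" unreachable.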


There's a correlation between the belief that memorizing algorithms correlates with intelligence and the application of algorithm questions in interviews.


If you have two great engineers in a kitchen, discussing a relevant problem, you want there to be synergy. That synergy is broken when one of the engineers has to Google how to reverse a binary tree.
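For what it's worth, "reverse a binary tree" usually means mirroring it, swapping the left and right children at every node. A minimal Python sketch (the `Node`/`invert` names are my own):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def invert(node):
    """Mirror the tree in place by swapping children at every node."""
    if node is not None:
        node.left, node.right = invert(node.right), invert(node.left)
    return node
```

It's a few lines once you see the trick, which is arguably the point of contention: whether recalling it on a whiteboard measures anything beyond recall.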

It is also a safe place to work for the really exceptional engineers. They can talk freely about complex computer science, without getting blank stares or having to dumb it down. Otherwise it gets frustrating fast.


CS is way too big for any individual to know it all in depth. If it really is a “safe place” because the employees can talk about complex topics without ever losing anyone, that would imply that they’re all specialized in roughly the same topics and severely lack organizational breadth. I kind of doubt that’s really true.


You can have organizational breadth with smaller teams. You pair a great programmer with an exceptional programmer, and it helps if they are specialized in roughly the same topics.


How does that give you breadth?


The teams specialize.


The other commenter was describing it as a “safe place” where you can talk about complex CS topics without confusing people. Unless your teams never talk to other teams, that doesn’t fit.


One of the most valuable skills for any engineer is to be able to frankly admit the need to look something up.


Admit it to yourself. Then come prepared (to the job interview or inception meeting).


Do you work for Google? This seems naive and inconsistent with my work experience, both in tech and in medical devices.


If they cared about intelligence they would just give IQ tests.


IQ tests are a form of convergent preferential bias, and they don't test for abstraction or creativity, which make up more than half the definition of intelligence.


They don't test for creativity (though seeing a correlation between IQ and creativity would be interesting), but they almost certainly test the ability to deal with abstractions. That's pretty much what an IQ test is! What reason do you have for thinking otherwise?


They test with puzzles that appear as visual shape complexities, which require use of the visual cortex. That is something a person might refer to as a visual abstraction. That isn't a practical abstraction, such as requiring a person to form an answer in the absence of accepted criteria (to abstract, or form something new, to solve a problem).


It's not readily apparent to me that those two abilities should be distinct. It seems plenty likely that they would be at least highly correlated. While your point about the visual cortex may be relevant, I don't think that's the whole story.

My understanding is that different IQ tests rely on different kinds of questions (i.e., not all involve visual shapes; some involve word/logic puzzles). The point is that the scores for all IQ tests correlate very highly (and thus suggest that there exists some common factor).


What is convergent preferential bias?


It means a bias toward coming together around a tester's preferences. Do people excel on the few narrow performance measures you find most correlated with intelligence?

There are divergent aptitude tests that measure the quantity and diversity of answers to a given question, as opposed to the one desired answer. Divergent tests are rarely performed, but they are a stronger measure of creativity and problem solving, which is typically what people actually want when they say intelligence. The reason convergent testing is preferred over divergent testing is that it is easier to measure, and those simplified measures are easier to compare.


Can you provide a link for divergent aptitude tests? I'm afraid my search results all have to do with some young-adult fiction novel (called Divergent).

How much do such tests correlate with things like (for example) creative achievement?


Which is illegal to do, hence proxy IQ tests.


It is not illegal. Here’s Griggs v. Duke Power Co., 401 U.S. 424 (1971) https://supreme.justia.com/cases/federal/us/401/424/#tab-opi.... Go read it and see for yourself what it says.


Right at the top of that link:

> Testing or measuring procedures cannot be determinative in employment decisions unless they have some connection to the job.

IQ tests are not directly related to the job, and so are illegal according to that ruling. Coding tests are directly related, which is why they get a pass.


You didn’t read the opinion and you added the word “directly” to the summary.

Do you scan source code and draw firm conclusions about what it does based on skim reading the first comment you see?

Perhaps my old contracts prof could have a second career as a google interviewer. (He was notorious for cold calling people that hadn’t briefed their cases and eating them alive.)


The purpose of the "Primary Holding" section is to summarize the result, so that you don't have to dig through the whole thing.


> my group of engineer friends came up with on why all the FANG companies pursue those heavy memorization algorithmic interviews

I think the most likely explanation is that the "elite" of CS grads usually do competitions like ACM ICPC and the olympiads, which are full of problems like those watered-down interview questions. As there is no real authority on what makes a developer good, but there was a measurable outcome from those programming competitions, leading to the high status and pride displayed by winners/participants, it was simply taken from there and dumbed down to fit into interviews. Some top colleges even have special prep courses for those competitions, and compared to them FANG interviews are super trivial.


> With their algorithmic interviews that require college-grad level of studying, they filter for people that are ready to follow orders without complaining.

Spot on. Certain companies ask silly questions, have candidates perform tedious exercises, and, in the debrief, investigate whether the candidate complained.

(disclaimer: I worked for Amazon and never saw this pattern in the company)


  some coders that don't get too critical about their job
Couldn't be further from the truth.

As someone who works there, I've noticed a general trend of animosity towards Google. It seems to be fueled by an inferiority complex people feel when they don't clear an interview, or when they feel they wouldn't be able to crack it if they ever gave it a shot. This leads to an overcompensation of attitude in the other direction, a sort of "sour grapes" narrative where there is an effort to downplay the prospects of working at Google, or to ridicule the people working there like you did just now.


A lot of people are sour at FB/Google simply because of how the companies represent themselves to users. I can't look at Google/FB without thinking of Gmail, Maps, Search, etc., and their utmost desire to track every nanosecond of my life and understand me better than I do... just to earn money from me. I couldn't care less about the hiring processes there, since where I live they don't pay more than other businesses, and everybody knows how 'great' their hiring experience is.

With banks, for instance, they are at least honest about how they earn from customers like me, and I don't have any unreasonable moral expectations.


My comment wasn't trying to ridicule anyone, but to explain that big companies with an extremely large number of coders need a very specific type of heads-down personality. As can be seen in this whole thread, plenty of people simply don't have a desire to work at big FANG companies, and even less desire to study for multiple days/weeks for an interview that doesn't really help their day-to-day job.


I think it's possible for both things to be true at the same time. Yes it's a selective meritocracy, and it's also true that the process unnecessarily alienates people. It comes down to recruiters being given the very difficult task of finding candidates for a process with such a high rejection rate, and the added stress of being given very specific rules about the things you can't be transparent about. The overwhelming majority of job seekers are mature adults who can handle rejection and constructive negative feedback. It's the sense of having wasted one's time that generates the "sour grapes" feeling.


There is also the general attitude of Googlers/Facebookers who brag a bit too much about working at Google/Facebook, as if it were the dream everyone was looking for.

There is nothing more annoying than a discussion with a Googler who tries to convince you that you should apply and work for Google.


I have no idea whether I could study hard enough to work at Google. What I do know is that I don’t like large companies and I have no desire to move to the west coast.


> What I do know is that I don’t like large companies

I think this is a key thing to learn about oneself. Very large companies often have a lot of cachet and can provide opportunities that small companies cannot (scale is scale, after all).

But small companies (not even start-ups, just companies with < 100 employees) can provide a different kind of opportunities:

* Interaction with different parts of the business

* Opportunity to wear multiple hats

* Less likely to be in the Bay area

* Nowhere to hide incompetence

Of course this isn't every small company, but I have worked in a few that were like this.


It is all of that. After 20+ years developing professionally (and developing as a hobbyist since middle school), I've never felt burnt out from continuously learning. But I was burned out by big-company politics and bureaucracy after only three years working at a large Fortune 10 (non-tech) company.


>it seems to be fueled by the fact that people feel a sort of inferiority complex when they don't clear an interview or they feel they wouldn't be able to crack it if they ever gave it a shot

Some of the most talented people I've ever met have failed Google interviews. As in far, far more talented than the majority of folks I know who do work at google. No amount of bullshitting is going to convince me that Google's False Negative interviewing system is the correct one. It simply bypasses too many talented candidates even beyond the reasons generally mentioned above.

One guy in particular I know is now a CEO of a company which, ironically enough, employs several former googlers. Yes, I know Google doesn't give a shit they missed on this individual. But to mis-characterize the anti-Google-interview narrative as "sour grapes" sounds a bit like you're drinking their champagne.


I've met my share of Googlers who are lazy, don't know how to solve problems (as opposed to write code itself), or still need help with basic tasks after several years on the job.

You just can't test for some qualities like focused, persistent problem solving and the desire to find ways to speed up your work.


> they filter for people that are ready to follow orders without complaining.

Well, if that was the goal, boy did we ever fail to deliver on that one ;-)


I think the original point of these interviews was to see how much the candidate actually learned or retained out of university.

What happened was that the interview questions inevitably got leaked and accumulated (a result of a heavily indexed and centralized internet), and it became a race to the bottom for candidates.

Imho the test may have worked in years 1-4 of Google, but it no longer flies, since college kids spend an eternity studying the questions at home. It's like the SATs all over again for these kids.

I still think these tests are generally good at testing how good somebody is. If you somehow got through CS without vaguely knowing how to 3-color a map (not talking about a perfect answer here, just a vaguely correct intuitive explanation), even after 15 years of work, something isn't right. These sorts of questions definitely will weed out your local web-dev baddy, or even dev-bootcamp baddy, which Silicon Valley is starting to be flooded with now.


You just described practically any corporate job, e.g. in banking (which especially is tuned to 11 due to massive bureaucracy, over-processing, and very tall pyramidal management structures).


I'm in your boat. I get hit up by Google 3-4 times a year. I was interested once and did the pre-screening. When I got their prep packet I looked it over and then promptly canceled my interview. I wasn't going to review stuff I've never used throughout my 10+ year career.

I'm a big fan of not wasting time, which is why I get stuff done at work and have been promoted twice at my current company in the past 3 years. As a sibling commenter suggests, if Google wants employees who blindly do what they're told, then I wouldn't be a good culture fit. I was taught critical thinking skills in school. Respectfully questioning my superiors' plans from time to time has been a valuable skill.


I have a technical phone screen with Google this week, and I have the same attitude as you. So I'll probably fail the interview, but I'm doing it just in case they ask a real-world problem, which I'm pretty good at.


If they ask you any CS101 or algo questions end the interview and explain why.


Google is certainly not looking for "employees who blindly do what they're told", at least not for engineering roles.


No, but they certainly have no use for SWEs thinking outside the box in ways beyond engineering work. Such as recognizing that interviews don't work well to predict ability and success. We engineers are supposed to shut up and conduct interviews and build interview tools, not wonder if the whole thing is a case of the emperor has no clothes.


That isn't really unprecedented throughout the history of work. There are, have been, and will be many fads where the emperor was eventually found to be wearing no clothes. This particular interviewing style is just one of many, and it's already on the way out as people learn they aren't getting very far by cargo-culting Google (e.g., see Stripe's and Microsoft's new interview processes).


What I find weird is why does anybody want to work at Google? I use their tools all the time at work. If you go to console.cloud.google.com... I mean... Wow. Just yesterday I was trying to find the logging for our cloud endpoints on Stackdriver. They should use that as their interview question ;-)

And this is the big thing I've noticed about many Google apps (especially dev tools): they are all incredibly (for want of a better word) hacky. And the documentation... while I can very much appreciate that they are serious about it and work very hard, it's a huge mishmash of marketing speak ("It washes your socks and makes dinner for you!") with the crucial information you need hidden away in secret corners. I literally have to use Google the search engine to navigate any of the documentation.

I don't mean to be so negative. It's not that it's so terrible in reality, it's just that it's not anything particularly great. I literally can't think of a single offering they have that I would aspire towards. There are lots of smaller, hungrier companies making much better products. And by extension, I think if you work at those smaller, hungrier companies you have a better chance of learning more, becoming a better developer and even being happier with your job.

So the thing is, unless I'm alone in my assessment, I think the main things people are looking for from a job at Google are status and money. Additionally, I think a lot of young people believe that talented developers mostly work at famous companies. In my experience, this is the opposite. Back when Microsoft was the powerhouse, I knew a lot of MS developers. Some were amazing. Most were average (as you might expect when a company is hiring thousands upon thousands of developers). However, you had a much better chance of actually working with someone amazing if you got a job at a small shop. I still think this is true.

Personally, I don't complain about the "You have to be this clever to ride" interviews. Yes, they select against people like me (bulldogs who work away obsessively at problems until they are solved, without using any particular magic insight). Yes, it means that I'm unlikely to get paid at the very, very top of the payscale (heck, if I wanted to get paid more, I would have been a lawyer -- I want to write code). It just means that it's that much easier for me to find companies that are a good fit for me. I don't really see a problem with that.


> What I find weird is why does anybody want to work at Google? I use their tools all the time at work. If you go to console.cloud.google.com... I mean... Wow. Just yesterday I was trying to find the logging for our cloud endpoints on Stackdriver. They should use that as their interview question ;-)

Working for Google is the only way to get bugs in Google products fixed, or your feedback even listened to. ;)


I'll be honest, I complained about my total lack of ability to find anything in the documentation and they emailed me back to ask for suggestions on what to fix. I never got back to them. I really should find time.

It was funny because, in my frustration, I left a comment on the last page I looked at, and they responded with the completely reasonable question, "Why were you looking at that page? It doesn't contain any of the information you were looking for." It was such a great summation of what my problem was that I had difficulty finding an appropriate response :-). Possibly this is too unfair, but I felt like the question, "How should I have found the information I was looking for?" was something that had never occurred to them. It's pretty applicable to virtually everything I've used from Google. It has that feel of, "If you don't already know, then you don't deserve to know".


> why does anybody want to work at Google

With all the Android privacy issues researchers keep uncovering, I feel like Google has really driven away the fraction of the labor pool that cares about protecting user privacy.

> they are all incredibly (for want of a better word) hacky.

Yeah, I haven't seen much of anything to feel inspired by either (although I do think the Google Maps team has put out a really solid product). But I have to deal with the Android SDK and other Google libraries for Android, and "hacky" seems like the best single-word description for that stuff, IMHO, too.

What makes me laugh about the Android documentation is that even though Google's mission is to organize the world's information, the best Android documentation and guidance I can find is organized by Stack Overflow.


Simple answer: Money + Prestige + Job security + Resume


Kinda the BBC of Silicon Valley then.


Maybe google employees aren’t incentivized to fix things as much as they are for making new things?


I'm probably responding more to this thread than I should, but I'm in a chatty mood today :-) I think there are a lot of factors, and this is almost certainly one of them. Having worked at a couple of big, wealthy, and famous companies before, one of the biggest problems is that they are always making acquisitions. One day you're building X and then upper management thinks, "Woah... X + Y would be awesome. But we don't want to spend time to build the Y part. Let's acquire a company and glue it onto X." Usually that happens with great grumbling from the programmers, who complain that X and Y were never meant to work together and it would actually take more time to get them to work well than it would have to build Y in the first place. Upper management doesn't believe this and says, "Stop grumbling". And, well, the programmers do what they have to do. So it ends up being super hacky... because it is super hacky :-)

But I think it's more than that. You know how certain things are obviously built by committees? You can see it because everything is consistent, but it's often got a lot of compromises. On the other hand some things are obviously built by individual contributors. It's got some things that are really great, but other things that are horrible because the developer just has blinders on. Many Google apps give me the latter experience. Also, when they have a suite of tools, although they are tied in together, they have wildly different UI, naming conventions, placement and layout of information, etc. Usually there are some really cool bits, but these bits are often not the point of the project and look a bit out of place.

I think teams in Google often have people who are confident and smart and the tools reflect that. It's a kind of "This is the way to do X", shouted 500 different ways. One of the biggest things I find frustrating is the mountain of trivia that I have to commit to memory in order to use their tools fluently. I really do think it reflects the type of people that Google chooses when they hire people. While they may be good at the things that Google screens them for, they may not be the best people overall when it comes to building finished products.


Your experiences closely reflect mine with GCP. Some of the tools, although amazing in and of themselves, often have some minor flaw that makes the user experience extremely inconvenient when you encounter it many times. The simplicity of the inconvenience only exaggerates the frustration.


I've interviewed with Google and the interviews I got didn't really require having memorized algorithms. Maybe on the level of breadth first and binary search which are pretty basic. But I don't think most interviews used any algorithms from a book at all. I think their interview prep packet overstates the amount of knowledge expected.


We ... had different experiences. It was a day of implementing algorithms on a whiteboard. I was supposed to be interviewing for a management position so I didn’t get the study packet either.


I also interviewed at Google. In the in-person interviews, all of the algos I got were very fair in my opinion, in that it took a little bit of thought to figure out what I needed to do, and after that, most of the work was just turning that idea into an algorithm. It was all on the whiteboard, but I thought it was fair. They were fine with little errors, and we talked through them.

On the phone interview I was asked how to do something I had no idea how to do. I have no idea how I passed the phone interview.

Seems like a mixed bag. The question, as far as I know, is completely up to the interviewer, which makes the interview very subjective when they don't use a standard question.


Interesting. My interview process was one of the most negative interviewing experiences of my life. Two of the people interviewing me were mostly silent except when they were combative. One algorithm I implemented was essentially the internals of a popular vi clone, and I was dismissively told the algorithm wouldn't work in practice. I also argued Big O with another interviewer who said nothing except that my value was wrong (it wasn't).


My interviews with Google last year did have a BFS and a system question, but they also asked me a problem that required dynamic programming and one that required fast exponentiation. I'd say that Google ranks 8 of 10 or so in how algorithm-heavy their interviews are.
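For anyone unfamiliar, "fast exponentiation" here presumably refers to exponentiation by squaring, which computes x^n in O(log n) multiplications instead of n-1. A minimal Python sketch (names are my own):

```python
def fast_pow(x, n):
    """Compute x**n for n >= 0 by repeated squaring: O(log n) multiplies."""
    result = 1
    while n > 0:
        if n & 1:      # odd exponent: fold one factor of x into the result
            result *= x
        x *= x         # square the base
        n >>= 1        # halve the exponent
    return result
```

The same loop with a modulus on each multiply gives modular exponentiation, which is the form these questions often want.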


"I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews."

Well, the interviews are marginally effective at identifying new CS grads (who also are cheap and have few outside commitments - companies love this!) but not much else. You're exactly right to view it as an algorithms final exam or an algorithm puzzle contest, but Cracking the Coding Interview is an embarrassingly bad book, in spite of (?) its ostensible purpose of helping people study for algorithm puzzle interviews at Google, Facebook, etc..


>but Cracking the Coding Interview is an embarrassingly bad book

I like how you made this bold claim with zero justification as to why.


I've realized I can still have a great career, make a very respectable wage, and work on interesting, novel problems all without working for a FAANG. Moreover, I don't have to directly contribute to the proliferation of corporate surveillance as a societal norm.


> I've made a career fixing a ton of horribly shitty, inefficient code that's been produced exclusively by people who pass these interviews.

I love this comment.


Give them the benefit of the doubt; you have no prior knowledge of the time/pressure constraints and organizational structure at the time they wrote the code.

I would say they probably made the decision we all make during development work: that fixing this N+1 query, or that bubble sort, or that API package doesn't have time in the budget.

Getting to market is sometimes more important to managers than optimal `correct` code, when the market isn't willing to pay for the services of `correct`.


Similar profile; I would not consider working for them. In my first and only interview with FB a few years ago, I nailed the initial hoop-jumping about IPC and other *nix things, but overall just got a really bad taste in my mouth. I also went for dinner, which was nice, but still, something is creepy about them. They are looking for some kind of desperation that I just don't have. Sounds like some fun problem solving though.

Haven’t had a chance to hire any ex-FAANGers yet, do they compare well to the general market?


Having recently gotten an offer after interviewing at Google, I can testify that those preparation packets are a sufficient but not necessary condition for doing well on an interview, and seem to be primarily targeted at making sure people who want to prepare don't do so badly. I spent about an hour on the plane reading through some example questions online, and did fine. I suspect that a lot of questions I "figured out" a clever solution to could actually have been solved with some obscure algorithm, but by treating them as puzzles rather than memorization tasks I was better able to demonstrate my skills as a candidate.

In short, I think the exact same question can be interpreted as either a cool algorithm challenge or a recall check, and the interviewer will be fine with you rederiving the answer on the spot if you are quick-thinking enough to do so.


What an eloquent way to summarize the silly elitism of Google interviews.

They seem to have no lack of talent and it certainly doesn't seem to be negatively affecting them. I wonder if this will change eventually, and Google will become IBM v2.0 (that is, a respected and profitable company which is mostly boring/unexciting).


I don't work in California and it took me a while to figure out what FAANG stands for. Have you guys ever thought that you're just in a bubble that's not representative of the industry as a whole?


Aside from all the things mentioned in the article, this also seems like a fairly predictable application of Goodhart's Law: "Any observed statistical regularity will tend to collapse once pressure is placed upon it for control purposes."

Once upon a time, skill at doing these sorts of problems might have correlated (imperfectly) with general aptitude as a programmer or software engineer. But the very act of trying to leverage that correlation for hiring purposes probably also made it go away. Now you've got a whole lot of people practicing hard on these sorts of problems, spending huge chunks of their free time grinding away on Project Euler and Advent of Code and HackerRank. That muddies the quality of this stuff as a proxy for what it was originally trying to detect: natural aptitude. I'm guessing having time to level grind like that also correlates inversely with other traits that are desirable in a programmer.


It is a perverse effect on the profession as a whole.

In Silicon Valley I have a fair number of colleagues who expect any dev to spend a fair amount of their free time grinding on even more dev work.

No surprise that there is a lack of diversity in the profession as a result.


I'm pretty sure it's age discrimination plain and simple. The only time in my life I would have been in great shape for the standard interview process without a good deal of study was a couple years in early graduate school, where nearly all of this "breadth" stuff was fresh in my mind.

The relentless scepticism about people's achievements is to some extent understandable (we've all run into the senior person who can't do fizzbuzz), but it ties neatly in with the idea that every new hire should be 25 at most.


Without a doubt it's age discrimination...you've hit the nail on the head. Also, God help you if you have followed a non-traditional career path where you decide at points in your life that you wanted a break from it all.

I, too, in 1988, fresh out of my BS in CP with no family or life experiences and a strong desire to code day and night would have sailed past the technical side of these often ridiculous interviews for jobs I could literally do in my sleep now.

But now, at 53, not only has Father Time fucked with my ability to sight memorize (something I took totally for granted in my younger years) without even trying, I find it almost impossible to hide my frustration at the ridiculousness of the very idea that I would be unable to do the work required of me for the job...

"Ok..we really need someone to fix X and Y on this website, and it would be just great if they could reconfigure Z on the server..." "Well, sure...I first saw X and Y-like issues back in the mid-90s in a networked client-server environment and I did X(x) and Y(x) to fix it, and again saw it in the mid-00s under the LAMP-stack and again fixed it doing X'(x) and Y'(x)...and the server issue is something I've seen over and over again during my 30y+ career..I am 100% certain I can solve these relatively simple issues for you guys..." "Oh...yeah ok...so can you whiteboard a bubblesort in Javascript for us? You have 15 minutes." "???"

...and of course I don't get a callback. Do people think I'm lying about all my experience on my resume or what exactly?

It's at the level of life and death for me right now, to be honest. I've been shooting out resume after resume for the past month, and nothing good is happening.

All I know for certain is...if the real issue was truly about finding someone who can do the job and fix the problems, there is no way in hell I'd still be looking now.


They don't want to pay you what they think (the you in their heads think you think) you are worth. That sounds convoluted but I've watched it happen repeatedly.

Bill, the budget IT guy: "What? We need to harden a server? Why do we need to pay this guy over a hundred thousand a year to do that? The internet is full of documentation. Let's hire someone with just enough technical know-how to implement it."

The reality on the other side is that hiring an experienced engineer has its risks. I've worked with engineers with 20+ years of experience who did it because it was a job, and who didn't deserve their salary based on their skill set. Companies let this happen; you need to cap positions and then give inflation-based raises.


> The reality on the other side is that hiring an experienced engineer has its risks.

Totally agree and have seen it from the interviewee side. I interviewed in March last year as an experienced engineer, and my network was the number one source of interviews and how I found my current job.

Don't neglect your network, folks. As you age, it will become ever more important for your next position.


It is baffling that someone with over three decades of experience would be expected to go through the same interview process as a recent graduate.

When I graduated (over two decades ago, and in Belgium), testing was a regular part of the interviewing process. However, it was understood that this was only done for people with no or very little experience. Later in my career, this fact helped me weed out the jobs which were below my experience level. After stating such and withdrawing candidature for the position, in several cases I was even contacted to come in for a more senior position.

The world has certainly changed.


I'm about your age, and this is very relatable. My last job hunt took about six months. Among the other high points, I passed the on-site day at Google, passed the hiring committee, and then had an executive reach out to quash my hire.

I did end up finding something I'm enjoying.

Anyway, just wanted to send you some support. Keep your head up and keep going. You really only need one offer.


By accident or design, the "coding tests" seem to discriminate in favor of new grads (with CS 101 fresh in mind, and also willing to grind yet more artificial test-taking prep), and also have a component of hazing/negging, especially when administered to experienced people. (Asserting an imbalanced power dynamic from the start, and perhaps also exploiting a psych weakness, like "I'm jumping through hoops for this, so this must be worth jumping through hoops for." I'm sure most individual interviewers are of goodwill, and don't intend this, but that doesn't mean it's not present.)

Lately, I decline to interview with any company that requires new-grad coding tests for experienced people (especially if they require it even when the person has open source code and community participation that the company can look at). I usually do well, but even then, it leaves a bad taste.

Of course, I'm very happy to talk each other's ears off in energetic collegial discussions about engineering problems and technologies, including whiteboard brainstorming of approaches/algorithms, perhaps much like would be a part of everyday work. If anyone ever then interrupted, "Hold on, can you put in all the semicolons, so I can type it in, and make sure you know how to code," you might wonder how that's not already obvious to them, and where they're coming from.

This aversion to "coding tests" for experienced people seems to be more acceptable to small companies/startups (or small autonomous units in large orgs), than it is to less-flexible/agile large companies. Recently, after discussing my latest background with a nice FAANG recruiter, we had a good discussion about the company's practice of putting experienced people through what seemed like a new-grad vetting/hazing process, and why that's been a turn-off. They soon sent a followup email, including a quote from an engineer there saying "... I need to know whether you can code in a language," along with attachments on how to prep for their new-grad coding tests. :) For whatever reason the company insists on that process, it seemed like it probably wasn't on track to a professional relationship that I'd want.


I agree with you 100%, but I do have a bit of sympathy with the insistence that they want to verify that everyone can code and not just talk a good line.

I made the mistake of hiring someone quite senior through internal transfer once who was utterly and flagrantly unable to code despite the job saying this was a requirement - I probably could have caught this with a simple fizzbuzz, but felt like this would have been too insulting.

At at least some of the FAANGs it's also a pretty clear indication of the fact that you're going to get busted way down and work your way up. I was a Principal Engineer at Intel with a successful exit in a highly technical area, but I would be shocked if I wouldn't have to re-earn my stripes at most other companies. Most of the super-smart guys I know who went to Google, for example, got busted way down and quickly earned their way up.

So if we're not up to grinding and expect to go back in at a high level, maybe it's kinder to warn us off early. :-)


Except for people over 50 who have probably realized there is more to life, why would some demographics be willing to grind less than others?

Edit: By "over 50," I mean age demographics in general, which is the main thing that raises your family obligations. The only demographic division that I can think of that would reduce someone's willingness to abandon their personal life would be age.


They have kids. Developing a sentient being is a higher priority and more difficult than developing React Redux.


> They have kids. Developing a sentient being is a higher priority and more difficult than developing React Redux.

This is like saying: "I have a time-intense hobby that I prefer over working too much for you."


I can see this sentence being an interesting Rorschach test. Either you think it’s quite sensible or utterly absurd. My own brain was even flip-flopping around for a bit before coming down on the utterly absurd side.

I think the maybe reasonable counter-argument against your sentence would be that while you personally do not have to value something like raising kids above your work, others very well might and do and there is nothing absurd or weird about that. Your framing betrays a certain kind of worldview where “work” is real and everything else is a mere hobby, a worldview that might be valid for certain people but is certainly not universal.


The GP said just what it said: in response to "They have kids. Developing a sentient being is a higher priority", it probably means "raising kids is just another hobby" - not a judgment of whether hobbies or a job should take priority. I.e., saying "developing a sentient being has higher priority..." is about the same as someone else saying "my creative hat-making has higher priority...". Sure, that may be true for many people. But given two otherwise roughly equivalent candidates, one with work>life and one with life>work priorities, why would a company choose the latter?


If work is everything, shouldn't we set aside some time to raise the next generation of workers? Working all the time is so defeating it even defeats the amount of work being done 30 years from now.

wolfgke 33 days ago [flagged]

> If work is everything, shouldn't we set aside some time to raise the next generation of workers?

I don't think work per se is everything. But working on world-changing things is.

At least concerning the situation in Germany, where there is compulsory school with a compulsory curriculum (vulgo: 18-19 years of brainwashing), I am spending a lot of time in the evenings working on an alternative curriculum (currently focusing on computer science topics, since this is something I am hopefully knowledgeable about) that enables people to deprogram themselves from this kind of brainwashing and to develop their intellectual potential, so that they can begin to work on world-changing things. A lot of highly gifted people have already asked me multiple times when they can finally read the first text of my planned series (they are really eager for it) - unfortunately there is still so much to do even on the first text.

So the first thing we should solve is preparing a curriculum that does not completely brainwash our children. Only when this problem is solved can we begin to think about how to set aside some time for them.


Rather: I work for you to finance the really important things in my life.

You work to live, not the other way around.


> Rather: I work for you to finance the really important things in my life.

> You work to live, not the other way around.

At least if you work in (academic) research, working to live simply does not work [pun intended]. The things that you work on in this area are the really important things in life.


A person can choose to spend less time on a hobby at any point, but once a child is born their parents have an ethical obligation to their children.


> but once a child is born their parents have an ethical obligation to their children.

Indeed - and that is why you should be very cautious about having children, particularly if you have career plans.


No, this is the wrong way to think about it and it has destroyed our generation by its proliferation. You should give birth to children and have enough balance between your work and your life to raise them and still grow in your career. You should never have to choose one or the other.

This false dichotomy cuts to the heart of a lot of gender and racial diversity issues that plague the workplace in the US.


No, it's called having a life and a sense of proportion.

The sooner tech companies learn to embrace and work with this very basic fact, the better.


So the sooner they abandon all of their own priorities and instead adopt only yours, the better?


Let's put it this way:

"Sure, I'll consider abandoning some priorities (i.e. social life) and/or putting off others (kids) -- and maybe even taking minor risks with my health -- if you're able to provide appropriate compensation."

Guess what, though? In the vast majority of cases -- even when we're talking about the bulge-bracket FAANG salaries occasionally gloated about in these and other parts -- they simply don't come anywhere close to providing that level of compensation.

All of their pretensions to the contrary.


Rather than the reverse? Yes.


Wouldn't that just be another expression of age demographics?


some people have kids in their 20s (gasp!)


I don't think you need to be over 50 to realize there's more to life. Some demographics are less likely to live to work v. working to live. Some demographics may not have time to juggle interview grinding with other responsibilities (children, existing job(s), etc.)


This. I don't have kids and I'm not over 50, but I agree with those people (if they hold this belief); not just philosophically, but also because I want to see a more diverse workforce in software development.


For a non-age-related one:

It's a lot more socially acceptable for members of one sex than it is for members of the other to be too busy with career stuff to spare much time for their families.

On a somewhat related note, single parents simply aren't going to have that kind of time.


People who have other responsibilities have less time to grind - and those other responsibilities fall more heavily to under-represented folks. Consider things like taking care of children (including siblings), elder parents, and other household work.


Humans are social creatures; we can't live on code alone.

If you come from an underrepresented demographic or have interests or social preferences outside the mainstream of the profession, it will be harder to fulfill your need for social affiliation by participating in communities like HackerRank in your free time-- because it may be harder to find people who share your values, and that you can identify with on a social level.

In other words: the social component of "grinding" is more fulfilling for some demographics than others.


People from some demographics are less likely to have the option to spend all their time grinding. Spending all your time grinding means that you don't need to work toward supporting yourself or others.


> The only demographic division that I can think of that would reduce someone's willingness to abandon their personal life would be age.

I can think of a few others tied to national work culture.


Well, kids, family obligations and any other non-work related stressors that zap your free time.


The tests have become a pure filter, screening out those who don't study heavily before taking them. They are uncorrelated with candidates' ability at everyday tasks.


It’s like an industry hazing ritual. How bad do you want it?


Hmm. Y'know, that's a great point. I could build a whole conspiracy theory off of that idea:

If I'm a FAANG, I'm simply not using my normal interview process to hire for the really interesting jobs. I reserve those ones for people who got the job by virtue of their publication history in the academic literature, or because they built some well-known cool thing, or because they got promoted into the position. Those people get shunted over into the "you didn't come to us, we came to you" interview process.

The seats I'm looking to fill with the more public interview process are mostly seats for the grunt coders who work under those people. My ideal candidate for that position isn't some rock star creative genius; it's a workaholic who is resistant to boredom. And what's something a workaholic who's resistant to boredom would be really good at? Grinding away on programming interview questions, of course.


This isn't a conspiracy theory - it's literally exactly what business schools do...

Teach a bunch of people to think in a certain way, speak a certain language and respect authority. Someone who excels at the repetitive mundanity of business school will be a perfect junior marketing manager at BigCo.

It's basically taking the way the Army trains new recruits and applying it to white collar jobs.


Old quote I've forgotten the source but it's over 40 years old now.

'Business schools turn out well trained, amoral yet obedient clerks.'

Can likely say the same about most CS programs. I know you can say that about engineering.


> or because they built some well-known cool thing

What, like homebrew? :)


> What, like homebrew? :)

Context for everybody else: https://twitter.com/mxcl/status/608682016205344768?lang=en

TL;DR: author of Homebrew interviews at Google, but doesn't get hired because he couldn't/wouldn't invert a binary tree on a whiteboard.


Wow, this is an incredible answer from Max on Quora as well: https://www.quora.com/Whats-the-logic-behind-Google-rejectin...


I wonder if the added friction of changing your place of work caused by this practice is meant to somewhat counterbalance the heavy incentives engineers have to job-hop in the current climate. Kind of makes sense from the point of view of tech employers.


> I wonder if the added friction of changing your place of work caused by this practice is meant to somewhat counterbalance the heavy incentives engineers have to job-hop in the current climate. Kind of makes sense from the point of view of tech employers.

I am not sure what to think about this claim: it is the other company that prevents you from working for them by this interview process. The current employer has the incentive that you don't leave. The other (potential) employer rather has the incentive to poach you.


These same employers (famously) had a cartel that prohibited job hopping before - the Jobs/Schmidt email - so it’s not surprising that the system would trend toward the same equilibrium again.


> These same employers (famously) had a cartel that prohibited job hopping before - the Jobs/Schmidt email - so it’s not surprising that the system would trend toward the same equilibrium again.

This explains this phenomenon plausibly for the FAANG companies. But I think there also exist lots of startups that could easily act as a cartel breaker - to their advantage, since this way they can poach from other companies.


Time spent by a new employee learning the ropes is time wasted from the perspective of an individual employer-actor. On the surface, it sounds similar enough to the iterated prisoner's dilemma so I'm inclined to think that a greedy strategy would do poorly here.


Something about your comment instilled a sense of dread and hopelessness within me.

I don't know if I can ever be one of those 'built cool things/get published' guys, so I guess that means I'm destined for grunt work. Fk.


But they are not supposed to be boring? At least to me they never are, it's just that they are rarely high on my list of important (or fun) things to do on my self-improvement time.


This is exactly what it is. Which Frat house do you want into? Google? FB? Microsoft? Amazon?


Surely, this must eventually lead to negative results? It's curious because it doesn't seem to be having any negative impact as of now.


The companies have caught on to this and nowadays, even if you passed the technical bits with flying colors, they look to rule you out or rank you based on softer criteria, like how well will you fit with our team? IOW, how much like us are you?


I'm not sure if HackerRank has updated itself recently, but the last time I poked my head in there (years ago), all the answers were in the "Talk about this challenge" section. You'd just go in there and copy paste the code in, maybe change the variable names/order a bit. Ever since I learned that, it's never been that 'wowzers' for me.


"Ever since I learned that, it's never been that 'wowzers' for me."

I am not sure I follow. For people who want to, there is a section where they can find answers even if they have not solved the problems. What was 'wowzers' about HackerRank before you learned that?


Well, some folks would brag about their score (years ago, dunno about now) and use that ranking as an indicator of their coding prowess. When I learned that the whole thing was easy to 'cheat' (whatever that means in this context), that rank lost all prestige for me.


Bragging about your rating as a function of solved problems in practice (with solutions available) doesn't seem rational.

HackerRank also used to organize contests where they would have a certain time to tackle a number of new problems (5-10 problems in 1 hour to 1 week, depending on the contest) meaning you have no access to solutions and are competing against other people solving the same problems. They have a rating for performance on those as well.


Isn't the whole point of the challenge to figure out how to do the problem? What you say is literally just copy/pasting others' answers. Is that not looked down on?


Given the name of the site (I've never used it), I always assumed it was about proving your ability to others (like maybe future employers). Easily-found answers totally destroys the trust that's needed for such a thing to work.


Bingo. Years back, in the small circles I run in, the score was something to toot your horn about. When I discovered that you could 'cheat' on the scoring, it took away all the prestige from that ranking score. Honestly, other than practice, what use is it?


What’s frustrating to me is that we haven’t found a way to teach programming that doesn’t rely on natural aptitude.

It’s really a travesty that we can’t teach it the same way that we teach maths or natural languages.

The end result is we’re left trying to divine whether someone is the programming equivalent of being illiterate. As with illiteracy people find ways to fake it.


As someone who has studied both math and computer science (and is now a professional programmer), I have no idea what you are talking about. We are far better at teaching people to program than we are at teaching them to do math. It's just that most people self-select out of math [0], so, due to selection bias, it appears that we are great at teaching those that remain.

As an aside, as someone who also studied linguistics [1] (and a couple of foreign languages at a beginner level), I am very confident in saying that our approach to teaching natural languages relies almost entirely on natural aptitude. It is just that, absent a serious mental disorder, all humans have a very large natural aptitude for natural languages.

[0] Probably because we are absolutely terrible at teaching it.

[1] I assume not what you mean by teaching natural languages, but the topic of language acquisition (including in adults) does come up; plus it gives some perspective on how teaching language would look if we didn't rely on natural aptitude.


I took math as part of my engineering degree, and that was 30 years ago. The older I get, the more I think the way it was taught was terrible. Same reason I think teachers are bitching about the US's fetish for academic testing: it pressures teachers to teach students to mechanically solve problems, but with shallow understanding.


I think of monads, which are real easy to explain in a programming context as soon as someone understands lists and map/reduce, but are total gibberish in a math context even if you've gotten through calculus.
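The jump really is small once flattening enters the picture. A minimal Python sketch (all names mine) of the list monad's two operations:

```python
# A minimal sketch of the list monad in map/flatten terms (all names mine).
# unit wraps a value in a list; bind maps a list-producing function over a
# list and flattens one level -- i.e. flatMap / concatMap.

def unit(x):
    return [x]

def bind(xs, f):
    return [y for x in xs for y in f(x)]

# Chaining binds: all (a, b) pairs where b divides a. The empty list
# plays the role of "no result", which is where the monad earns its keep.
pairs = bind([6, 8], lambda a:
        bind([2, 3], lambda b:
        unit((a, b)) if a % b == 0 else []))
print(pairs)  # [(6, 2), (6, 3), (8, 2)]
```

The nested binds read exactly like a list comprehension with a filter, which is the point: the category-theory definition and the everyday flatMap idiom describe the same thing.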


I don't believe for a second it requires "natural aptitude" to program. The problem is any programming curriculum starts with a text editor open.

If a developer-to-be doesn't understand the framing context of what they are doing they are being dropped in a lake with no sense of direction.

It's why all the "naturals" started as geeks who played with computers from a young age. You learned about the environment you would end up working in, so that later on, when you hit the grindstone and actually started creating gears to stick on that machine, you had an idea what the result should look like and knew the tools in the shop when you set out to start building it. Even if you didn't know the steps involved in the process, you were familiar with the environment.

People who haven't spent time engrossed in computers -- the myriads of youth entering a CS 101 class thinking it's an easy career, when the most exposure to tech they've had is maybe updating their phone, using the Facebook and Twitter apps, and owning an untinkerable, total-black-box video game console -- drop out so fast. Their professors lead them to an anvil and tell them to forge a steel rod without the faintest idea what a hammer is.

It's just not an answer anyone wants to hear, because the solution amounts to an entire degree's worth of learning that must precede the actual study of programming. But you don't want your brain surgeon to go to medical school having never studied high school biology, or, even more generally, having never learned to read.


We convinced our parents to buy a computer because it would help our education when all we really wanted one for was playing computer games. Joke was on us though, because playing computer games at the time usually involved a lot of putzing around and figuring out how stuff worked, ultimately teaching us marketable skills.


I'm envious because my parents heavily restricted my computer use thinking it'd rot my brain or I'd get r*ped. They wanted me reading books, to become a lawyer or a doctor. I always had an affinity for technology. My folks meant well but I think not watering that seed has me in the middle of this lost life.


I'm envious of you. I had no restrictions on technology use, growing up. I dropped out of high school at 16 and spent the next 15 years doing very little of note (with video games consuming the bulk of that time). Now I'm 35, struggling through my undergrad, and surrounded by kids.

I have some ideas about where to go from here but it's not going to be easy. The search for real meaning and really meaningful relationships is ongoing.


I spent so many years yearning. It was so unfair that all of the other kids had Pentium computers at home, and all I had was an old 386. My parents refused to get me a game console, or cable TV. I was SO BORED that, in my desperation, I tried making games in QuickBASIC.

I still think my parents should not have gone so Amish, but I also don't think I would have developed as a programmer without that.


Not having access to a powerful computer makes you appreciate the finer things in life, like hand-bumming instructions till your inner loop hits a hard deadline.


If it makes you feel any better, even many of those who had that seed watered still feel lost in life :)


I had zero games for my VIC-20. It was strictly a BASIC machine. (Later I had a TI-99/4A which had a few.) My dad is a kind of super-polymath, and in the early 80s he bought a computer and learned how to use and program it so he could grind out solutions to mechanical-engineering equations. He saw me become hooked on BASIC and went down to Crazy Eddie's to pick up the VIC for me so I wouldn't bother him on his rather expensive machine.

The personal computer grew up alongside us xennials, and some of us were just drawn to it, even without the promise of video games.


Totally true.

In my case it was a natural progression: "videogames are great!" -> "I have an idea for even better videogame!" -> "how do I make one?" -> "can I tweak this one into being a bit more the way I like it?" -> tinkering around data files -> "I really want to make my own game" -> picking up a programming book at 13 -> a programming career.


22 -> make crud Java apps at a corporate software farm for a decade and lose faith in humanity


Spend a few years putting in 100-hour weeks writing collision detection for this year's Dora the Explorer game -- a typical game industry position -- and you will be thankful for the crud Java app job.


Using the literacy model: we're testing people on whether they can write haiku according to all the rules of style and form, when the job we're testing for is basically low-level clerical work like copying documents (using the copy machine), sometimes taking notes, and filling out simple forms.


The less talked about absurdity is that successfully passing programming interviews is a skill itself. It's especially absurd because the time I spend developing that skill is less time spent developing skills and knowledge more directly relevant to my job. Yet, programming interview skill is more relevant to progressing my career.

edit: now if you'll excuse me, I need to do some dynamic programming problems.


I've worked with some folks who were decent engineers who originally didn't get offers from FANG companies because they blew the interview; they later studied their ass off on leetcode and got offers. This did not make them better engineers at all.

Ultimately, it's just studying for the test, very much like the ACT/SAT in high school. You can be great at taking tests but ultimately a terrible student, or vice versa.


I go back and forth about how I feel about doing heavy algorithm/data-structure stuff in interviews. On the one hand, I have never once written any kind of sort algorithm or LRU cache by hand for a production system, because why would I? Pretty much every language's standard library has a fairly-optimized sort and caching thing built in, and if they don't then there's still probably a million outside libraries to do it for me.

On the other hand, I genuinely do feel theory is really important. While not knowing the minutiae of Timsort doesn't indicate that you'll be a bad engineer, not knowing the runtime efficiency of a sort can lead to some really awful code. Not knowing when to use a hash table instead of a nested for-loop can be a sign that you don't really know what you're doing, and not knowing some rough theory on concurrency indicates that I might be stuck debugging your race conditions or deadlocks.

I try to not be a complete jerk and I won't do stuff like give out an NP-complete problem (which an interviewer gave me once), nor will I ask for intimate details of how one would implement CSP. I do tend to focus on theory-heavy questions more than my peers, though I try to give a fairly generous amount of hints so that people don't get too stuck.
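To make the hash-table-versus-nested-loop point concrete, here is a toy Python sketch (my own example, not from any actual interview) of finding the values two lists have in common:

```python
# Toy comparison: values common to two lists. The nested scan is O(n*m);
# building a hash-based set first makes it roughly O(n + m).

def common_nested(xs, ys):
    # For each x, scan all of ys -- quadratic as the inputs grow.
    return [x for x in xs if any(x == y for y in ys)]

def common_hashed(xs, ys):
    seen = set(ys)                       # O(m) build, O(1) average lookup
    return [x for x in xs if x in seen]  # single O(n) scan

print(common_hashed([1, 2, 3, 4], [2, 4, 6]))  # [2, 4]
```

For a few dozen elements the difference is invisible; at millions of elements the nested version becomes exactly the kind of really awful code described above.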


> On the other hand, I genuinely do feel theory is really important. While not knowing the minutiae of Timsort doesn't indicate that you'll be a bad engineer, not knowing the runtime efficiency of a sort can lead to some really awful code. Not knowing when to use a hash table instead of a nested for-loop can be a sign that you don't really know what you're doing, and not knowing some rough theory on concurrency indicates that I might be stuck debugging your race conditions or deadlocks.

Theory is important but what's more important is how someone applies the theory to actual problem solving.

Reversing a binary tree isn't testing your knowledge of theory as much as it's testing memorization of a very, very specific application. Knowing how to reverse a binary tree or how a hash map works is pointless if the person can't identify when to use them while solving an actual higher-level problem. No one is ever given a binary tree and told to reverse it in the real world; they are given a business problem that they identify can be solved efficiently by modelling it as a binary tree and reversing it.

I'd bet most of the good people who fail the "reverse a binary tree" type of questions would succeed if you gave them a realistic problem to solve without forcing a very specific solution onto them. Either they will come to the realization that the solution is to think of the problem in terms of a binary tree or they won't.

And neither outcome is a terribly bad answer. If they recognize you gave them a problem that can be represented as a binary tree and efficiently solved by reversing it, then there you go: not only did you prove they knew the "theory," but that they knew how to apply it as well. If they don't recognize it, you gain valuable insight into their line of thinking, and they might find novel ways to represent the problem and apply other theoretical concepts that solve it efficiently (maybe more, maybe less).


I don't think you're wrong in general, but in one specific case I actually can give an example where me not knowing how a hash-map worked was really bad. I was working on a very limited set of memory for a small embedded thing when I was an intern, and was using hashmaps all over the place because they're magic, and kept getting out of memory exceptions.

I learned later that hashmaps trade memory for speed: to keep lookups O(1), their internal arrays reserve a lot of slack and can get super huge if you're not careful (keep them small instead, and lookups degrade toward O(n) or O(log n)).
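For what it's worth, the overhead is easy to see directly in Python (exact sizes are CPython implementation details, so treat the numbers as illustrative only):

```python
import sys

n = 1000
as_list = list(range(n))
as_dict = {i: i for i in range(n)}

# The dict's internal hash table reserves slack space to keep
# collisions (and thus lookups) cheap, so its container overhead
# is noticeably larger than a plain array of the same items.
print(sys.getsizeof(as_list), sys.getsizeof(as_dict))
```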

In this particular case, an ugly nested for-loop was the immediate solution, and eventually I was able to cheat a little and have hard-coded integer indexes and was able to use an array.

Anyway, your point is valid, I just figured I'd give an example where knowing the internals for a hash table would have saved me a lot of time.


Hash maps are O(1)


> On the other hand, I genuinely do feel theory is really important. While not knowing the minutia of a tim-sort doesn't indicate that you'll be a bad engineer, not knowing the runtime efficiency of a sort can lead to some really awful code.

OK, but why don't these interviewers ever ask "internals" questions? How does a CPU work, what are the different levels of the memory hierarchy, what is an interrupt request, what is pipelining, what is SIMD, what is a GPU, etc.? Or how about compiler internals: what are the different stages in a compiler, what are the different grammars and parsers, what is an AST, what is interpretation, compilation, bytecode, JIT, and how does each work? How about database internals? How is a database implemented, what is relational algebra, what is normalization, what is a data model, how do indices work? Similar questions can be asked about operating system internals, networking, floating point arithmetic, etc.

In my experience, no interviewer has ever asked me these questions. And as an interviewer, I have consistently been disappointed with candidates' inability to answer basic and relevant "internals" questions. Ironically, the algorithms and data structures questions that are commonly asked in interviews reflect outdated ways of thinking which do not take the underlying hardware reality into account (memory hierarchy and parallelism).

It could be that a lot of these FANG(-like) "engineers" are themselves not knowledgeable enough to ask such questions. Or they are aware that new comp sci graduates are unlikely to really understand anything at a deep level, whereas all comp sci students are drilled on algorithms and data structures. If you are targeting a very specific age group (0-5 years experience) and want a standardized test, then I guess asking CS201 exam questions in a job interview makes sense. Companies prefer the 0-5yrs experience segment because they're cheap, don't have children, are easy to exploit, and are easier to "mold" into your corporate culture.


> In my experience, no interviewer has ever asked me these questions

Because not everyone writing Java code has had a background that exposes them to compiler internals or grammars and parsers.

But everyone writing Java code should at least be a little bit arsed to figure out what common data structures do - or how to write two for loops that print out "1 2 fizz 4 buzz"...
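For the record, one loop is enough to produce that output; a throwaway version (mine, nothing canonical about it):

```python
def fizzbuzz(n):
    """Return the classic fizzbuzz sequence for 1..n as strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("fizzbuzz")
        elif i % 3 == 0:
            out.append("fizz")
        elif i % 5 == 0:
            out.append("buzz")
        else:
            out.append(str(i))
    return out
```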


Where are the companies who ask such questions, and not for an embedded software engineer role but for someone higher up the stack? I wish I knew.


I think the important thing here is the difference between understanding complexity and implementing a sorting algorithm.

If I were hiring a carpenter I would want to know they understand their tools and when to use them. I don't care if they don't know how to make them. But of course knowing how to make them suggests an intimate appreciation for the craft (looking your way Matthias Wandel)


The flip side would be asking a carpenter to do trigonometry problems or something... Anyway, what's most important is measure twice and cut once. That and following a blueprint and going the extra mile.


> On the other hand, I genuinely do feel theory is really important.

My beef with this in interviewing is that a huge chunk of modern programming work is developing UIs - and user interface theory and skills are treated like some softball thing. You ask UI-related questions to interviewers and half the time you get some shrug "oh we use whatever", etc.


This depends highly on the domain. I'm the only person on my team with any level of frontend work in the past ~2 years. And when I did, I was able to consult with UX experts and designers.

I can't say that theory is more universal than frontend, though anecdotally, for me, it is. But UI isn't universal.


> Not knowing when to use a hash table instead of a nested-for loop

On the flip side, the O(1) look-up nature of hash tables makes them the no-brainer data structure to use, at least for passing programming interviews. Perhaps more interesting test questions would be about when not to use hash tables.


Fully agreed. Most questions I got when I started out as a data scientist interviewing for jobs were pretty simple: "use a hash table to count stuff, join stuff, lookup stuff". However not once did I get asked, "why do databases not exclusively use hash tables if they're so good?" That's a much more interesting question, though perhaps out of scope for a data scientist. I'd add that I've never heard that question being asked of engineers, I'd love to hear of people out there getting it though.


I actually got that exact question once, or pretty close to it. I'm paraphrasing, but the question was "Hash tables are cool, but what's a data structure they might use in a DB besides that?". This was pretty early in my career, so it stumped me, and he pointed out that a binary tree is used semi-often because of ordering.
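To illustrate that ordering point, here's a toy range query using Python's `bisect` over a sorted list as a stand-in for a tree/B-tree index (hypothetical data, my own example):

```python
import bisect

ids = sorted([3, 14, 15, 9, 26, 53, 58, 97])

def count_in_range(sorted_ids, lo, hi):
    """Count keys in [lo, hi] in O(log n) thanks to ordering.
    A hash table would have to scan every key to answer this."""
    left = bisect.bisect_left(sorted_ids, lo)
    right = bisect.bisect_right(sorted_ids, hi)
    return right - left
```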


I am surprised B-trees were not the first data structure in that discussion, as they are "famous" for being used in DBs.

Although these days one should talk about "learned index structures" instead, I guess :)


I'd be curious to see if there is any value in questions that focus on understanding of the basic ideas behind algorithms and data structures as a subject. Time complexity, sure, but what I would really want to know is, given a description of a data structure and some algorithms for manipulating it, can you identify the invariants that should be true of these rules?

That sort of thing might give a better sense of if someone has an instinct for how to verify whether the code is working. And, by extension, if they have a grasp of some concepts that go a long way toward helping a person come up with more reliable and maintainable designs.


> This did not make them better engineers at all.

How do you know?

I understand that FANG have themselves come to the conclusion that brain teasers are not necessarily very predictive for engineering performance.

But CS/programming questions for CS/programming roles? That seems sensible.

You can study lots of vocabulary to achieve better results on the verbal section of the GRE or similar tests. But afterwards, you will, in fact, have better vocabulary, I submit.

I am surprised that you categorically deny that they were better engineers after studying CS/programming questions.


>You can study lots of vocabulary to achieve better results on the verbal section of the GRE or similar tests. But afterwards, you will, in fact, have better vocabulary, I submit.

Yeah, but tests like the GRE claim to predict academic performance, not vocabulary. So unless you think augmenting one's vocabulary alone will make one a substantially better student, there's still a gap there.


Oh, you'll certainly be better at solving programming-puzzle types of problems, but those questions usually don't translate well into the real-life problems you'll encounter at work later. You can't architect a good solution IRL by trying to figure out what the test author wanted you to do, like with programming puzzles. And for implementing some algorithm, keeping every detail of it in your head definitely helps, but it's not a significant advantage over someone who just googled all those details 10 minutes ago. So both types of tests aren't testing real dev capabilities; they test how much the candidate prepared for the test.

IMO, only tasking the candidate to build some real piece of software, or better yet to refactor some real code, with enough time, full access to Google and Stack Overflow, and the ability to ask other people for suggestions and help, will show you a realistic picture of the candidate's future performance as part of the dev team. Put them in a real situation, give them real problems, and you'll get real results. Everything else is like giving sudoku tests hoping to choose the best mathematician.


As someone trying to break into the field, I'm very intimidated by coding challenges. Even the easy ones on leetcode take me an afternoon and some change to get through. Some people seem to take no issue with googling the result and then moving forward, but I thought that wasn't in the spirit of doing these challenges.


> You can study lots of vocabulary to achieve better results on the verbal section of the GRE or similar tests. But afterwards, you will, in fact, have better vocabulary, I submit.

Maybe for a few days after the test anyways. And then...you push out those words you never use or read to make room for actual useful stuff.


It feels very much like that to me, which is why it's strange there isn't a Kaplan equivalent.

There’s interview cake and leetcode, but I think people would pay $2k for a class that focuses on the questions and in person whiteboard practice.

They could collect information about the interviews at the major companies and then use it to create the program. For payment, they could also help candidates negotiate and then take a cut of the signing bonus.

If this isn’t part of lambda school already I think it should be.

I already work at a competitive tech company, but I’d sign up for this in a second to help me stay competitive for interviews.


App Academy was working on something like this, last time I checked. I believe it's more focused on people who've already been through their program and are working on getting the next-step-up job from their first, so I don't know if it's generally advertised/available.


The only way I would consider paying for a course like this is if, upon successfully completing it, I would not have to do the technical portion of the interview at companies I applied to. They would accept this cert and just do the soft skills interview.


One could argue that CS undergraduate degrees are perfectly suited for this purpose. FAANG & Co don't seem to care for that. You've successfully passed a few dozen exams at MIT or Stanford? Well, let's make the reasonable assumption that you have no clue about algorithms and data structures and start with our third whiteboarding session; that should be a way better proxy for your knowledge.


Those incentives are misaligned. Not every law school grad can pass the bar.


They may very well be, but is the quality of signal truly worse than that which an interviewer can extract from a few hours of your writing code on the board?


There are people with degrees who can't code. I'd prefer we don't hire them, so we ask for coding.


That's called college, and experience has shown that the people who have the certificate aren't guaranteed to know how to write a fizzbuzz.


I think that’ll never happen because the incentives of the cert granting institution and the company hiring will never be aligned (or aligned enough to matter).


Sounds like a neat segment for Triplebyte to move to- or for a competitor to capitalize upon.


The major ones seem to be Outco and Interview Kickstart. But agreed, there's far fewer whiteboarding/interview process-specific bootcamps compared to web/mobile bootcamps.


> If this isn’t part of lambda school already I think it should be

It’s part of classes for our existing students, but it isn’t a single offering. One day.


Cool - if you need somebody to help I’d be interested.


Those courses are out there. I saw one for $5k and ran for, IIRC, 8 weeks, that covered the entire interview process.


Now that I'm stepping through the interview process again, I've noticed several groups offering SE-interview-specific training; the landscape has changed since my last position.

It's funny that after doing some searching, many proponents of this widely adopted and poorly researched interviewing methodology appear to be running businesses selling training for these interview processes... surprise, surprise.


This is exactly what Outco.io does, and I highly recommend them for this purpose.


Looks interesting - have you used them?


Yes! I felt I was noticeably better at selling myself in interviews, which was my primary goal for going through them. I didn't feel I had enough of a support network to properly prepare otherwise. There's a lot of technical stuff, that goes by very quickly, and also equal amount of non-technical, this-is-how-recruiting-works content.


If you're FANG you kind of want to hire that guy though -- the one who sees there's a task and a set of things he can do to accomplish that task. Same thing with the ACT/SAT, colleges want good test-takers because they're going to be giving a lot of tests.


What's more, it's funny reading that Google sends out prep packets saying to learn/relearn these algorithms. So they don't expect their recruits to know them, just be able to memorize them for the interview.


I've interviewed tons of engineers who ace the interview, leaving you no choice but to pass them, and whom you later see writing abysmal code. The problem is that design, quality, and maintainability are not valued. As a result, you end up with extremely mediocre "hacky" engineers at larger companies who can spit out tons of valid code that leads to a tangled mess shortly thereafter.

Worse, there's this huge emphasis on "big O" with zero focus on clearly egregious bad practices (tons of copies, outrageous memory usage, casting between strings and numbers all the time). I've rarely seen clearly bad algorithms be deployed but I have seen plenty of unperformant code go out.
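A hypothetical but typical example of what I mean: code that would sail past any big-O question while quietly doing quadratic work through copies.

```python
def join_slow(parts):
    # Each += may copy everything built so far: O(n^2) total work
    # in the worst case. (CPython sometimes special-cases this
    # pattern, but other runtimes and languages won't.)
    s = ""
    for p in parts:
        s += p
    return s

def join_fast(parts):
    # One pass, one final allocation.
    return "".join(parts)
```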


I too have witnessed a similar thing, but with not-so-algorithmic people. I think it all comes down to the company culture and the person actually being able to let go of their ego and improve their programming skills. It is similar to a senior developer who never learned to properly program and, because of their position, is incapable of seeing faults in their code. Once you've gotten too accustomed to shitty code, it's hard to change your style, which is understandable.

Probably biggest influence on this I think is the culture, if nobody tells you that your code is shit (in a kind way) you'll never learn better. But then again, some people are so hard-headed and full of themselves that they get defensive and never actually admit their faults. Though giving and receiving feedback is not easy, can give you an identity crisis once you realize you've been doing something wrong for years.

And then you have a bunch of Ivy League graduates who have spent years learning algorithms and are burning to use them but there's no actual problems that really need them.


Does anyone just give you someone else's code and have you write tests and documentation for it?


I had a coding test once where I was given someone else's code with an introduced bug, and I had to fix the bug. To do so, you were basically documenting the code along the way as you traced your way through possible trouble spots. It felt far more meaningful than the typical "implement some list traversal algorithm that you won't actually ever use at this job."


I've had a similar interview and although I did not solve the bug, I think the interview was much more effective than most algorithm based interviews. Not only did my interviewers get to see how I navigate a codebase and use my IDE/other tools, they also saw how I approached known unknowns and discovered information.


Note that one often-cited counterargument to its seeming "absurdity" is that it evens the playing field in a fairer, more meritocratic way.

No matter your background, what school you went to, or what randomized experience you got in previous jobs, every person has equal opportunity to study and practice the same algorithms on their own (as opposed to being lucky enough to be able to afford a top-tier $$$ CS education, or to being lucky enough to have the connections or chance to get certain previous jobs).

And thus, when applying to jobs, it becomes something more akin to a raw-ability IQ test, which you can argue is "fairer", especially when management realistically knows developers might be shuffled around all the time, and that the extensive SQL experience they were hired for will mean nothing when project requirements switch to a basic key-value store.

On the other hand, if you are interviewing for a highly specialized position that is fairly certain not to undergo change, then it makes sense that specialized experience could rightly count for far more than any kind of generalized intelligence or ability.


> every person has equal opportunity to study and practice the same algorithms on their own

Well, that's just not the case - there are many groups of people who lack the opportunity to study and practice. A couple of examples: people with kids, people working 12-hour shifts, people without access to teaching materials, people without a sufficiently advanced machine to run dev environments, etc.


It's "fairer" if you're a recent college grad, probably single, definitely no kids or other family obligations, fresh out of school, and without much to lose if you spend all your free time studying for a "fair, meritocratic" test.


This is why this whole interview thing is so absurd. The amount of days lost by engineers to relearn obscure algorithms and training on leetcode while we could be coding for things that are actually useful.


obscure algorithms are useful. it isn't a perfect system but it's better than what most people propose as alternatives, which is to just have an ad hoc conversation.

testing whether someone is willing to prepare for a thing is a relevant work skill test too.


Knowing the existence of, and understanding the performance characteristics of a broad range of algorithms is, to my mind, a much more useful set of working knowledge than the details of any individual algorithm. The latter can always be looked up. It is harder to find out the former and much better to say “yes, there is an algorithm such-and-such that may be applicable here, let me spend 5 minutes checking that.” In my experience, anyway.


Broadly speaking, that used to be how design patterns were often treated in interviews: could you recognise, describe, code, and discuss the pros, cons, and applicability of pattern X in a given circumstance? Pretty easy stuff to cram, unfortunately, which may be why they have fallen out of favour in recent times as a proxy measure for brains. But depending on the role in question, that sort of material was often more likely to be useful on the job than a similarly broad knowledge of algorithms.


>testing whether someone is willing to prepare for a thing is a relevant work skill test too.

Is it, though? I could understand if algo questions had some relation to the work you're doing, but your comment on why they're good is independent of the actual material. If we replaced the algorithm question interview with an interview testing obscure presidential facts, would it really be a useful test to have devs take? I guess it is a relevant work skill test in that you get people willing to put the work in/game the system, but it doesn't seem to be much of (if at all) an improvement from ad hoc conversations.


There’s a vast difference between, say, preparing for a school exam and preparing for a job interview. In the former, you know in advance what material will be tested. Job interviews are more like PhD comprehensive exams, or professional licensing exams. I know a lot of working professionals who are good at their jobs, yet admit they wouldn’t be able to pass the license exam today.


Go search for "Knights on a Keypad" (a formerly-common Google interview question). Trying to imagine any situation in which the solution would be useful is harder than the problem itself.


If you're trying to find a situation where that exact solution is going to be useful, you're not thinking about the problem from the right perspective.

Handled correctly, a problem like this answers several questions:

1) Can you correctly break down a problem like this into its component parts?

2) Can you recognize the overall class of problems that this falls into?

3) Can you transform this specific problem into the more general class so that you can solve it in a known fashion?

4) Can you think about and implement the movement?

5) Can you communicate while you're doing the above?

No one cares about solving that particular problem. But the answers to the above really are relevant. Being able to map novel problems onto known solutions is absolutely a skill that any competent software engineer needs to have. "Oh, you want me to do X? That looks a lot like Y, this thing we've already solved; maybe I can just implement it in the same fashion (or re-use our existing system!)"

I'm not saying that this particular problem is a wonderful example, or that I'd use it in my own interviews. But this overall class of problems really does have a place in interviewing when it's handled well by the interviewers, and arguments against it on the basis of the specific problem being irrelevant are really rather missing the point.
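For readers who haven't seen it: the keypad question asks how many distinct numbers a chess knight can dial on a phone keypad in a given number of keypresses. A memoized counting sketch (my own quick version, not any company's reference solution; the matrix-power approach mentioned elsewhere in the thread is faster still):

```python
from functools import lru_cache

# Digits a knight can jump to from each key on a phone keypad.
MOVES = {
    0: (4, 6), 1: (6, 8), 2: (7, 9), 3: (4, 8),
    4: (0, 3, 9), 5: (), 6: (0, 1, 7), 7: (2, 6),
    8: (1, 3), 9: (2, 4),
}

@lru_cache(maxsize=None)
def count_from(key, hops):
    """Number of dialable sequences starting at `key` with `hops` jumps left."""
    if hops == 0:
        return 1
    return sum(count_from(nxt, hops - 1) for nxt in MOVES[key])

def total_numbers(length):
    """Sequences of `length` digits, starting from any key."""
    return sum(count_from(k, length - 1) for k in MOVES)
```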


If it's true that this type of problem is relevant (I doubt it), then why not ask about a specific problem you've actually had to solve?

I don't do many interviews anymore, but I used to, and I had no short supply of problems that I actually had to solve in the course of my work that I could ask about. I don't think whiteboard interviewing is great in general, but if you're going to do it you can at least try to keep it relevant.

As a matter of fact, I did have to do memoization of graph traversals at least once for work (most programmers never have to do this), and I find that problem a lot more interesting (trait matching for Rust). I could easily give a talk about that problem. As for that interview question, though? I don't always do well with time pressure, and so I can't guarantee I'd be able to answer it to your satisfaction.


Setting up the same problem in terms of trait matching for Rust is quite a bit more difficult than this toy problem when you're in a limited-duration interview setting. All of the time that you spend laying out the problem is time the candidate doesn't spend solving it, and time you spend not getting a signal. The simpler the problem setup the better, in my experience.

I've done the "ask a real problem that I've solved before" thing, and I find that it usually gets hung up on details and context around the problem to the detriment of actually solving the problem at hand. That's not to say that such a conversation isn't itself very valuable, but that's a different interview conversation, at least on my team. We find it important to maintain some level of focus just to ensure that we're covering the various signals that we'd like to get from the candidate.


I am not at all confident that I would be able to answer Knights on a Keypad to your satisfaction, due to some combination of nervousness and time pressure. I probably could if I did many hours of interviewing practice, though. Absent that, I think there would be a good chance you would reject me on the grounds that I don't know how to do memoized graph traversals, despite the fact that I'm in a small minority of programmers that have shipped memoized graph traversals to production (and in fact a few teams at Google rely on my implementation of memoized graph traversals, ironically enough).

My view is that Google's interview process seems to work because Google gets so many applicants that they can afford to randomly reject most of them. It's not because it's a good process.


Using an actual problem you solved can also work against the candidate because most problems we encounter in our daily jobs require a lot more time than the standard 45 minutes reserved for an interview (actually only 35 minutes out of 45 because there's 5 minutes of intro/icebreaker and 5 minutes reserved at the end to answer any of the candidate's questions).

So it isn't fair to ask the candidate to solve a real bug or implement a real feature in only 35 minutes unless they've seen something similar before.

This is why big companies like Google are limited to whiteboarding interviews because they need to have an interview process efficient enough to properly vet and filter >1 million applicants Google receives each year.

Personally, I think a better interview process is a pair-programming or work audition for a day. But that is not even close to matching the scale of Google.

Let's say out of a million applications, maybe only 25% are qualified. That is still almost 1000 candidates to interview per day (number of U.S. business days in 2019 is 261; 250,000 candidates / 261 business days = ~957 candidates per business day). Pair-programming or full day work audition will not be able to accommodate 957 candidates every day.


Why can't it be done using a permutations-and-combinations formula? I am bad at dynamic programming, so I can't really work out the solution right now.


You mean that you can't imagine when a memoized graph traversal would be useful?

I find that hard to believe.


I have actually implemented memoized graph traversals for work (unlike most programmers). To begin with, I see no point in asking this when I could ask how to implement trait checking in Rust (what I had to use memoized graph traversals for), which a practitioner will find much more intuitive.

Moreover, memoized graph traversals don't get you full credit on this question. There's a dynamic programming solution, and in fact the ideal solution is one using matrix math, which is ludicrously divorced from anything most programmers would ever see.


So you are saying that companies are hiring people that are willing to prove that they can put "a lot of work" into preparing for interviews that have no relevance to the actual work? I can see that as a valid point, as a way to filter lazy individuals. (In the same way that a college degree is more a way to prove that you can sustain X years of learning things without dropping out).

But it will also filter out everyone that is opinionated enough to not do that stupid preparation work, and you will end up with sheep coders that will always follow the rules. Looking at Google, Facebook etc, this might already be the case. I will even go so far to say that they prefer those type of obedient coders than the ones that ask too many questions and get too creative.


No relevance would be like when I took the GRE for grad school. I studied things that had zero relevance to the CS program I ended up attending.

Reviewing algorithms for interview prep at least has some relevance to programming. While a candidate may not use that exact algorithm in their day to day job, they are creating ad hoc algorithms all day long. With that said, I don't think time pressure, white boarding algorithms is a very good job performance predictor.


There seems to be an assumption in these discussions that everyone has to prepare for these things in equal amount. But that's not the case.

(The counter to that is that if people can relatively-easily (single-digit days) cram for your interview, you're still not going to be effectively screening for at-hand pre-existing familiarity/knowledge.)


What about using a real, job-related task? If they can do that, they can probably do the job.


The elephant in the room is that it's a surrogate for an IQ test, since those are illegal.


It is not illegal to use IQ tests as part of a job screening (in the US). It is illegal to screen in a way that is both discriminatory and not proven to correlate with job performance, but that applies equally to both IQ tests and algorithms questions.


IQ is proven to correlate with performance in virtually everything, so I'm not sure where you're going with that.

If I require all prospective engineering hires to prove explicitly that they have an IQ of at least 135, I will get sued. Do you think this is false?


Obviously I do not think that is false; you can get sued for literally anything. Do I think you would lose? Depends on whether you've done the research showing that 135 is an important cut-off.

To turn this around, can you point at the law that makes IQ a protected class or whatever you're claiming it is? I'm not going to have much luck proving a negative - Russell's teapot and all that.


Exactly. As I said before, in a world where both Google and Knuth exist, they aren't hiring me to be a algorithm reference book.


This resonates strongly with me. I know a few engineers who are bad to mediocre software developers, but excellent interviewees. That skill alone lands them offers at any place they want to work. They can whiteboard algorithms like there is no tomorrow, but they can't manage software complexity.


A false dichotomy.

Managing complexity is a valuable skill that should also be screened for at interview. At most top companies you are there for at least 4-5 hours so there should be plenty of time to evaluate that skill.

I think whiteboard questions are good, I want to know that this person is capable of writing difficult code if we need them to.

I also think we probably ask too many of them.

I have been on the hiring side, and my experience so far has been that the feedback is almost always close to identical across multiple whiteboard questions. The questions are also so abstract that asking multiple to "prevent bias" seems ineffectual. What bias could there be? You either solve the problem or you don't. Bias is more likely to come in on the behavioral interviews. There should be multiple of those, for sure.

Generally someone is either a good enough coder or not and they will display that consistently across all the interviews. You will see the same stuff throughout (good or bad variable naming, good or bad communication etc) thus asking > 1 coding question by default is a waste of everyone's time.


>A false dichotomy

It's not a false dichotomy, because I am not suggesting that the two skills are mutually exclusive. I'm highlighting some personal experiences where I've seen one skill vastly overvalued compared to the other.

>Managing complexity is a valuable skill that should also be screened for at interview

I've never been part of an interview, on either side, where I've seen testing for managing software complexity. Good interfaces, function design, side effects, state management, etc, are all second class citizens to finding the appropriate algorithm to solve the interview question. I've seen it over and over. The only time I've been close to a complexity management question was on a systems design question, but even then, it was only a very high-level systems discussion. I don't think most places know how to screen for it.


This is kind of strange for a non-US resident to grasp. I've been employed as a programmer three times and no one tested my coding abilities whatsoever in the interviews. No samples, nothing. Never heard of it happening to any colleague either.


>This is kind of strange for a non-US resident to grasp.

I've worked for two non-technical companies as a software developer and one highly technical company, and interviewed at a few Silicon Valley companies.

The difference between the interview processes is staggering; my current job's interview was two hours of conversation, no code tests, just a general assessment of "do you know what you're doing" by the hiring manager and a couple other members of the team. The highly technical company had a code assessment then the in-person interviews had zero coding.

The SV companies must have a good reason for this, but golly the amount of coding in those interviews is nuts. I'm a process over code speed kind of coder, and I've failed every SV-level test because of it; my code comes from talking to non-technical users like medical researchers and study operations managers and tossing something together in Python or a cloud service that makes their lives easier. Needless to say, I don't go over algorithm fundamentals on a regular basis, and I generally fall out after the first or second interview.

It's especially odd that interviews are so intensely focused on those couple of hours, since I personally don't see any dev (or any resource, for that matter) contributing in any meaningful way in so short a time, or even within 90 days. I'm not sure how this problem could be solved given the limited time companies can dedicate to interviews, though; maybe rely more on portfolios?


It's quite frustrating. I've submitted my portfolio of open source projects on GitHub for interviews. I specifically told the recruiters, HR personnel, hiring managers and some of the developers that the projects contain a large enough body of work to see examples of my code. These projects are quite comprehensive, and not one person looked at them or mentioned them during the interviews.

Unfortunately, people in general rarely try to first understand what the candidate offers. It's more often ONLY about whether candidates understand the exact way the company uses certain technology.


Knowing what good work looks like is a plus, but I wouldn't be convinced by cherry-picked successes. That doesn't tell me how much you struggled with them, how many others you failed, or how much of the work was actually yours.


I think most of your concerns could be answered fairly quickly in a conversation, though. I can look over a Github repo and check out its history and ask specific questions about changes and why they were made or why design decisions were made.

The cherry-picked successes thing is certainly a problem, though, and not just for coding. Maybe it's just the candidates I've asked, but when asked "tell me about a time you made a mistake in a project" they tend to answer with a strength and try to re-frame it as a weakness. "Oh, I worked too hard on this project and it made me tired" isn't a weakness. "I worked too hard on this project and that made me neglect business requirements since I was too myopic to notice" is a weakness.

Sorry if that's a tangent, it's been a pet peeve of mine since I started interviewing that not many people are humble enough or have thought enough about what their weaknesses actually are, and how that affects the success of their work.


Requiring irrelevant coding tests isn't going to give you those answers either.


This is also my experience interviewing for embedded systems and (hardware) test automation work at electronics manufacturers. I've never been asked to code anything. (Nor have I been asked to draw any schematics.) Just conversation.


> “The SV companies must have a good reason for this”

In fact, no, nobody has a good reason for it.


Go start a company. Hire people without doing any sort of coding interviews. Report back in a year.

The reality is that these are extremely desirable positions with a staggering number of applicants who _do not know how to code_. Not as in “I can’t solve a dynamic programming problem without studying up on it”, but as in literally don’t know what a for loop is.

The process is far from perfect and the frustration is understandable, but it works well enough as a filter from these companies’ point of view.


I imagine technical screening is necessary, but in-person coding assessments are nonsensical. It's not like programming is a spectator sport, so why are we tested on it live? I suppose the process is self-selecting, because I personally have given up responding to SV recruiters or interviewing with those companies.


Well, it started off with Microsoft in the '90s, then Google in the 2000s, and then it just became common for every company in SV to conduct these programming interviews.


I don't see a "good reason" in your comment.


It's actually not as common as it sounds from reading HN. There is definitely a bias based on the fact that the big tech companies do it and startups tend to do it. Which many commenters here have experience with.

A lot of more traditional software development roles do not have much testing. Sometimes they will have a quick online timed test or a simple question on an initial phone screen. That isn't to say none of those companies have testing, but it's definitely the startup and big tech worlds that have the majority of it.


I've been in technical positions in the UK and Hong Kong, and in both cases the interviews were very technical. Hours and hours of technical assessment and questions.


The massive FAANG-types of companies receive so many millions of job applications, they championed these types of interviews to have a more objective way to filter through all the applicants. That practice has waned somewhat at the larger companies (not totally, but it's changing), but has trickled down to smaller companies.


I received a cold call from a Lyft recruiter recently for a senior machine learning position. I asked why they were reaching out to me and the recruiter mentioned that it’s an especially hard time to locate experienced machine learning candidates, they don’t have enough applicants.

I said I was interested in interviewing but that I would only agree to a process that evaluates me based on my previous work history, and not any onsite or takehome coding projects, system design questions or whiteboard coding questions.

The recruiter said she would run it by the manager, but thought it would not be possible, and a few days later I got a rejection email.


I understand takehome and whiteboard coding of course, but why not system design questions?


I dunno.. flip side to this is that you could name your own price basically.


Not really though. If your price is working in a quiet, private office with a door that shuts, or having humane treatment when you’re being interviewed, then it seems nobody would pay it.

Let’s take their word for it that they are desperate for a machine learning engineer. Then it suggests they care more about mandating workspace conditions (since the financial cost of providing even thousands of workers with private offices in dense urban areas is not a realistic excuse not to do it) or trivia during interviews than about business needs.


This is such an important observation. Claims of limited talent pool don't align with reality. If demand were truly inelastic, companies would negotiate.


In some cases I do think there is a limited talent pool. Unwillingness to negotiate in those cases usually means the company would rather simply suffer along with worse business outcomes, or try to invest in a totally different area of work, than to accept they are at a negotiating disadvantage and that it may be the case that painful company culture changes have to be made (like giving some people private offices and not others).

“The market can stay irrational longer than you can stay solvent,” seems apt for this.


True that. And I wonder, based on the little info I have about employment in the US: is it really so expensive to fire someone once you realize they are not a good fit? Even in my country (Argentina), which has huge amounts of worker protections, the first 3 months are considered a trial period, so you can fire someone at no cost beyond the salary already paid.


Most workers in the US are under the "at will" regime: they can be let go at any time in theory. In practice, companies (especially big ones) usually try to "build a case" to justify the move.


What stops someone unqualified from getting through? What other filters do you apply?


Probing someone with very technical conversation about past projects is a much stronger filter against unqualified candidates than CoderPad tests, whiteboard algorithms, etc.

The conditional probability that someone is hopelessly lacking the software skills to do a job, given that they nonetheless passed a TripleByte exam or something similar, is quite high. Overfitting and memorization for the sake of the test are extremely common.

But it’s much, much harder to fake competence when needing to dynamically and verbally explain technical details in a conversational interview about past work experience.
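To make the "conditional probability" point concrete, here's a toy Bayes' rule calculation in Python. All the numbers are invented purely for illustration; the point is only that when memorization inflates the pass rate of weak candidates, a pass carries less signal than it appears to:

```python
def p_lacking_given_passed(p_lacking, p_pass_given_lacking, p_pass_given_skilled):
    """Bayes' rule: P(candidate lacks the skills | candidate passed the test)."""
    p_skilled = 1 - p_lacking
    p_pass = p_pass_given_lacking * p_lacking + p_pass_given_skilled * p_skilled
    return p_pass_given_lacking * p_lacking / p_pass

# Invented illustrative numbers: half the applicant pool is weak,
# but memorization lets 30% of weak candidates pass vs 80% of strong ones.
print(round(p_lacking_given_passed(0.5, 0.3, 0.8), 3))  # 0.273
```

So even under these made-up assumptions, more than a quarter of the people who pass are weak candidates, despite the test looking like a strong filter.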


I can, with quite a high level of competence, speak about projects I had no part in implementing or that don't yet exist.

It's far, far easier to fake expertise when you have more context than the person asking questions. Technical interview questions make sure that the question giver has more context than the recipient, ignoring pathological cases.


I’m sorry but you cannot do what you are claiming. It has nothing to do with whether the candidate has more context or not.

In fact, conversational interviewing like this has very little to do with any of the domain specifics of the project. The point is to recursively keep probing for deeper technical specifics, so they have to explain at finer and finer technical levels what were the tradeoffs, why exactly were certain decisions made or how were certain problems overcome.

It is precisely the situation when someone did not have to dig into the technical weeds of a project for themselves that they will not be able to fake or fast talk their way through this type of interview.

That is the number one, defining characteristic of this way of interviewing.


>I’m sorry but you cannot do what you are claiming.

But I have. Not in the specific context of a job interview, but I have absolutely convinced technical experts that my level of expertise in a field is above my actual level of expertise. Ironically, if this were a technical interview, I'd fail it, not because I lack the skills to convince technical experts of my non-existent abilities, but because you don't find me trustworthy.

>The point is to recursively keep probing for deeper technical specifics, so they have to explain at finer and finer technical levels what were the tradeoffs, why exactly were certain decisions made or how were certain problems overcome.

But without context, you can't effectively do that. I, the candidate, am in control. I can steer the conversation to avoid areas where I don't have expertise by answering all kinds of things: "investigating that was someone else's responsibility", "well we never tried anything else and our current implementation works well enough that we never needed to", etc. You can't know if those are lies or not. There are all kinds of completely valid non-technical reasons for decisions that may be completely outside of a candidate's control.

You're either forced to completely trust the candidate, or attempt to verify their authenticity during the interview, at which point you quickly venture into the land of bias and subjectivity.


> “But without context, you can't effectively do that.”

This is simply false. You don’t need context to understand if the breakdown of a technical problem into constituent trade-offs was appropriate or not — by definition that very breakdown into constituent technical details is the context.

> “investigating that was someone else's responsibility"

This just confirms to me that you are not correct in asserting people can just skate by these discussions. If someone tells me something like that, I’ll ask them what did that other person find when they investigated? If you say anything like, “I don’t know; that was their job not mine,” then you’ve lost credibility because you didn’t put that other person’s conclusions through strong skepticism until you were satisfied you knew the details well enough that you could own or support them if you had to. That is exactly the sort of thing that indicates bullshitting.

> “attempt to verify their authenticity during the interview, at which point you quickly venture into the land of bias and subjectivity.”

This is just wrong. You don’t ever “just trust” the candidate, that’s the opposite of this interview style. Further, you never “attempt” to verify authenticity.. you just do verify it, since it does not require context or domain specialty to analyze reductionist decomposition of any engineering work down into primitive constituent tasks or decisions that are universal.

This approach is far less biased or subjective than appraising “how a candidate thinks” while they solve tricky puzzles in a foreign environment with unrealistic time pressure.


>This is simply false. You don’t need context to understand if the breakdown of a technical problem into constituent trade-offs was appropriate or not — by definition that very breakdown into constituent technical details is the context.

I mean, yes you do. Context tells you which decisions are important. You're asking for the context you need to assess someone's technical competency from the person whose technical competency you're attempting to assess. They have every incentive to lie, mislead, or stretch the truth to make the context they give you highlight their skills more than the real context did. And no, you can't verify that.

>If someone tells me something like that, I’ll ask them what did that other person find when they investigated? If you say anything like, “I don’t know; that was their job not mine,” then you’ve lost credibility because you didn’t put that other person’s conclusions through strong skepticism until you were satisfied you knew the details well enough that you could own or support them if you had to. That is exactly the sort of thing that indicates bullshitting.

But now you're punishing someone for organizational things possibly beyond their control. If my job was to build a thing that made use of some black-box algorithm, and John developed the algorithm, why should I put the algorithm under strong skepticism? Perhaps that's how things work in your workplace, but there's no clear reason mine works the same way. This is just a bias against people whose development practices aren't the same as yours.

>This approach is far less biased or subjective than appraising “how a candidate thinks” while they solve tricky puzzles in a foreign environment with unrealistic time pressure.

Except that, as I've literally just proven, you've failed to verify my authenticity because you've wrongly concluded that I'm not authentic. This comment thread demonstrates exactly how this method can fail to identify a good candidate: if you dislike what they're saying, you'll unconsciously convert that into them being less authentic.

So again, either you take the candidate at face value in a situation where they have every incentive to lie to you, or you attempt to verify their authenticity, at which point that analysis is subject to a multitude of biases that have nothing to do with the candidates skill level.

You've just demonstrated this perfectly.


> “I mean, yes you do. Context tells you which decisions are important.”

No, you definitely don’t need to come into it knowing about this, and even if you don’t know about this ahead of time, it won’t imply “just trusting” the candidate or being overly subjective. You will ask the candidate to explain why various decisions were important, and not stop at the top line answer but recursively probe into it, breaking it down into concepts and trade-offs that are universal in any kind of applied problem solving.

> “But now you're punishing someone for organizational things possibly beyond their control. If my job was to build a thing that made use of some blackbox algorithm, and John developed the algorithm, why should I put the algorithm under strong skepticism, perhaps that's how things work in your workplace, but there's no clear reason that mine work the same way.”

I’m sorry but this also just isn’t true. If you are describing your contributions to projects and all that keeps happening is you hit walls in your explanation where someone else did the work and you did not review that work at a high level of depth, then you’re just being misleading about your contributions at work.

Your job as an engineer in a company is to solve problems for your stakeholders, whether that means building tooling for other engineers, assisting designers with prototypes, designing algorithms for core product functionality, sales engineering for client stakeholders, etc.

It doesn’t matter how your company is structured, it doesn’t matter how the work was divided up. Your job is to know about the stakeholder problem you are solving, at a deep level, and when you represent your work to other people and you fail to offer technical depth about the trade-offs needed to solve stakeholder problems, that’s a clear mark against you as a candidate.

It’s bewildering to me that anyone would think that the way their current employer organizes assignments should reduce their burden of knowing how to represent their projects in significant technical depth. That is an always-on, never mitigated, constant responsibility for all employees anywhere. You’re not holding anything against someone if they can’t provide that in an interview... no, you’re just uncovering what they’ve lied about or embellished on a resume.

> “Except that, as I've literally just proven, you've failed to verify the authenticity of me because you've wrongly concluded that I'm not authentic.”

I see no such proof at all, and the cheeky rhetoric just makes me feel more entrenched that you are bullshitting hugely in this thread.

> “So again, either you take the candidate at face value in a situation where they have every incentive to lie to you, or you attempt to verify their authenticity, at which point that analysis is subject to a multitude of biases that have nothing to do with the candidates skill level.”

You are doing nothing but gainsaying here. You’ve made no argument that would support any of these strong conclusions, especially not any reason why this interview method faces the false dilemma between either just trusting the candidate or else succumbing to biases.

You are just asserting things, but they do not seem to be connected to or bolstered by any of the other things you’ve written.


> You are just asserting things, but they do not seem to be connected to or bolstered by any of the other things you’ve written.

Let me lay it out clearly: I asserted that I can, and have, inflated my abilities to people who have technical know how. This was in response to you stating that "I’m sorry but you cannot do what you are claiming."

So to be clear, at this point, one of two things is true:

1. You are wrong

2. I am a liar

To you, it is clear that point 2 is the true one. To most readers, this is not as obvious. Once you have decided that (2) is true and I am a liar, nothing I say can or will convince you otherwise. But you haven't decided that based on anything factual. In fact, (1) is true here. I am not lying. I can, and have, done the things I claim to have done in this case.

I'm using this to demonstrate that your ideas about such a conversational interview don't work, by pointing out that in the conversational interview we are having right now, you've decided, based on a preconception, that what I say cannot be true! I could be the world's most successful conman, but because your preconceptions lead you to believe that your preferred interview process is effective and less biased than your non-preferred one, you won't accept evidence to the contrary.

>I see no such proof at all, and the cheeky rhetoric just makes me feel more entrenched that you are bullshitting hugely in this thread.

Right, and my point is you're wrong and unwilling to accept that. And that is a demonstration of not being able to effectively figure out whether or not someone is bullshitting from a conversation with them. You've decided that I'm bullshitting because the alternative would require you to do a lot of introspection about how and why you analyze candidates the way you do. So it's easier to just say "you're bullshitting" and then not put in the effort. And that's certainly your prerogative, but it's not at all a good look for your interviewing capabilities.

That you're so prone to cognitive biases that you're willing to completely write off someone's experience because it forces you to rethink something you hold dear is not a selling point of the process you espouse. It demonstrates, like I've said, that the process is prone to cognitive bias and is therefore decidedly not objective.

That is, there are two possibilities:

1. You are wrong, and your refusal to accept that is coloring your perception of our interactions in such a way that you are not able to be objective about my experiences and abilities, as I claim.

2. I'm completely making everything I've said up and haven't ever been able to inflate my abilities to anyone. Your person-analytical skills are infallible and you've caught me.

I subscribe to (1); you continue to wrongly believe (2). This is expected: it's why your process isn't as objective as you claim.

>It doesn’t matter how your company is structured, it doesn’t matter how the work was divided up. Your job is to know about the stakeholder problem you are solving, at a deep level, and when you represent your work to other people and you fail to offer technical depth about the trade-offs needed to solve stakeholder problems, that’s a clear mark against you as a candidate.

This is, again, your opinion of how engineering should be done. Not every engineer has the opportunity to work in a workplace where that's how things work. Are you going to write off everyone whose experience has been in a PM-led environment because they haven't had the opportunity to develop using the process you prefer? If so, that's again your prerogative, but you're probably filtering out a bunch of good engineers.


Are you talking about TripleByte exams from experience, or are you speculating that they are similar to other software interviewing processes? The questions they asked me covered what I think of as a staggeringly wide range of knowledge, including sysadmin stuff, detailed knowledge of four or five programming languages, POSIX semantics, high-level scalable systems architecture, and so on. It seemed to me that it would be very difficult to "memorize for the sake of the test".


Maybe we just see it differently, but I’d consider everything you’ve listed as exactly the sort of stuff that just gets memorized. I don’t even think the breadth of what you listed is that bad actually. Especially for recent grad brogrammer types with no serious life obligation time commitments, memorizing an entire program of study like that is probably only an investment of ~1 month of time, and has practically no correlation to on the job effectiveness once hired.

I’m speaking from ~10 years of experience running my team’s recruiting in a quant finance firm, where many interview requirements / tests / etc., came down from executive managers, so I got to see a wide range of performance on tests of all sorts, riddles, hardcore algo trivia, etc.

The sum total of all that leads me to believe quite strongly that the best signal to noise comes from super careful and tedious resume selection followed by conversational and behavioral interviews that recursively probe into more specific technical details.


I agree that any one of the questions could have been memorized, but it seems to me like it would take a year or two of constant memorization, like more than 8 hours a day, to get all of it, even assuming there was a place that had the relevant information conveniently formatted for memorization — which, as far as I know, there isn't. But maybe I'm just bad at memorizing things?

I note that you didn't answer my question. Have you taken the TripleByte exam or not?


Interviewing is a skill regardless of what field you are in. I agree that algorithms are not optimal for testing engineering skills, but there's a whiff of entitlement when engineers get indignant about having to spend time preparing for an interview. Objective methods for evaluating candidates are hard to come by, and I don't see obviously better alternatives.


Hi.

Where are you learning DP problems from? I am very bad at those and need a few good references so that it sticks in my memory.


I'd suggest Kattis and Codeforces. I use Kattis mostly, but know a number of people who like Codeforces better.

Here's a few for you to try. Some of these are pretty hard, but you should be able to find solution sketches online if you google the contests they are from.

https://open.kattis.com/problems/increasingsubsequence

https://open.kattis.com/problems/maximumsubarrays

https://open.kattis.com/problems/tray

https://open.kattis.com/problems/dinnerbet

https://open.kattis.com/problems/hyperpyramids

I've solved all of these as well, so if you get really stuck feel free to reply here and I'll try to guide you through them.
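Since formulating the recurrence is usually the hard part, here's the textbook formulation for the first one (longest increasing subsequence) as a sketch in Python. The state and recurrence in the docstring are the standard ones; a real Kattis submission would also need I/O handling and, for large inputs, the O(n log n) patience-sorting variant:

```python
def lis_length(a):
    """Length of the longest strictly increasing subsequence of a.

    DP formulation:
      state:      dp[i] = length of the best increasing subsequence
                  that ends exactly at a[i]
      recurrence: dp[i] = 1 + max(dp[j]) over j < i with a[j] < a[i]
      answer:     max over all dp[i]
    O(n^2) time; fine for small inputs.
    """
    dp = []
    for i, x in enumerate(a):
        best = 0
        for j in range(i):
            if a[j] < x:
                best = max(best, dp[j])
        dp.append(best + 1)
    return max(dp, default=0)

print(lis_length([3, 1, 4, 1, 5, 9, 2, 6]))  # 4, e.g. 1, 4, 5, 9
```

The trick that generalizes to most DP problems: define the state so that it ends at (or is rooted at) a specific position, which makes the recurrence a simple scan over earlier states.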


Thank you. I will definitely be coming back. I guess I have difficulty formulating a DP problem properly.


I found another recently that has DP on a tree:

https://open.kattis.com/problems/theescape
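No spoilers for that specific problem, but the general "DP on a tree" pattern is: pick a root, then combine each node's answer from its children's answers in post-order. A minimal stand-in example (maximum independent set on a tree, not the linked problem):

```python
def max_independent_set(adj, root=0):
    """Size of the largest set of tree nodes with no two adjacent.

    Tree-DP pattern: for each node v compute
      take[v] = best answer in v's subtree if v is in the set
      skip[v] = best answer in v's subtree if v is not in the set
    Iterative DFS so deep trees don't hit the recursion limit.
    """
    take = [1] * len(adj)   # taking v counts v itself
    skip = [0] * len(adj)
    parent = {root: None}
    order, stack = [], [root]
    while stack:            # record a DFS order
        v = stack.pop()
        order.append(v)
        for u in adj[v]:
            if u != parent[v]:
                parent[u] = v
                stack.append(u)
    for v in reversed(order):  # process children before parents
        p = parent[v]
        if p is not None:
            take[p] += skip[v]             # p taken => v must be skipped
            skip[p] += max(take[v], skip[v])
    return max(take[root], skip[root])

# Path 0-1-2-3: best is two nodes, e.g. {0, 2}
print(max_independent_set([[1], [0, 2], [1, 3], [2]]))  # 2
```

Once the "values per node, combined from children" framing clicks, most tree-DP contest problems reduce to choosing the right per-node state.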

