Some of the best software engineers I have seen in my life were not trained computer scientists and would've failed many of the questions presented in this article. Not because they weren't highly intelligent, but because terms like "bubble sort" meant nothing to them.
E: The company had around 10 employees. "Interview" processes often went like this:
1) Establish first mail contact (either because they are applying for a job, or because the company found out about an interesting tool they wrote, or met them somewhere, or...)
2) Discuss the candidate's skills, decide whether the skillset fits the company's needs
3) If so, take a programming job fitting their skillset (usually some low-priority ticket) and give it to them as a freelance job
4) If delivered in good quality, pay them and discuss future job options. If delivered in bad quality, pay them and don't discuss job options. If not delivered, ignore.
Which I always thought was a very reasonable approach. Very often, the people weren't even interested in a job, but wanted to continue to work as freelancers.
No libraries meant we had to manage our own data structures.
We had our own UI system, so understanding how to walk trees and manipulate them efficiently was crucial. Removing duplicates from an array in an efficient manner, important.
Linked list stuff? Oh yeah, very important. Linked Lists underlay a lot of basic OS data structures.
We did have real world problems in the interview mix as well. We'd give an underlying low level API and ask how it could be used to construct a more user friendly wrapper API. We'd give examples of in memory layout of objects and ask how that layout could be improved.
At the end of the day, we wanted to hire people who could solve problems, but also sit down and have a good discussion about how to solve problems.
FWIW, with junior employees it was always possible to tell apart the engineers who had gone through CS from those who hadn't.
A good decade of self-study of course bridges the gap, but right out of college? I've seen very smart people, good coders, who didn't realize they were working with relational data and over-complicated their solutions.
I also saw people trained as game programmers write mind blowingly fast code that computer scientists would be hard pressed to understand without a fair bit of study.
I had to spend a fair bit of time explaining isomorphic transforms of code/data structures/concepts though. Game programmers pick up on that one quick: "like a matrix, but instead provable facts about software."
I also wasn't allowed to tell people what job they were interviewing for. That made things extra fun!
Teaching people who had spent their life in tight highly performance optimized C++ how to work in embedded wasn't that hard.
Except for the "so, now we measure ram in Kilobytes!" part.
But, there is a reason the Microsoft Band had real alpha blending, transparency effects, a simple particle engine, and TrueType fonts, all with a 120 MHz CPU and 256 KB of SRAM (and a few more megabytes of much slower RAM hanging off to the side as cache).
That's fine, but understand that they often would not be a fit. If the position is low-level systems programming, for example, not knowing the term "bubble sort" indicates that they will have trouble getting up to speed in a modern, complex environment, even if they are very smart. Same if the position is e.g. to work on a database, i.e. its actual implementation.
Solid knowledge of both the abstract and the specific aspects of computer science and engineering is a prerequisite for a lot of programming jobs, especially where you work on things that make up the environment itself rather than the applications that run on it (though this is sometimes true for those, too).
I've always wondered why this isn't more common. It's win-win to be able to test the waters in this way. What downsides am I not seeing?
Being unable to work 10 hours to prove it shows you are a pauper, all the other excuses are complacency.
Also, if anyone should be able to make another 10 hours in their week to work in tech, why not, say, Steve who works at $BigTechCo? Now I know we just asked Steve for another 10 hours, but let's go ask him for another 10. No, the circumstances shouldn't matter, he's making six figures after all. Let's ask him again. And again. And again. Oh, Steve quit? What a pauper.
So unless you advocate for completely blind interviews that do not rely on a resume and only take up 3 hours of someone's day, your point is moot.
That's not a "people can't do 10 hours," it's a "they don't want to," which is perfectly reasonable. The company that assigns 10 hours of work had better be appealing.
But saying that take-home assignments are bad because they take time people can't afford to give is complete BS.
Applying ten places and getting 100 hours of homework knowing there's a good chance of ten businesses ghosting? I have plenty of empathy for people who have to make that choice.
It is interesting to watch, but not convincing as an argument.
What about "I am not doing that, because there are places with the same salary and more consideration for the people who work there"?
Not spending 10 hours to decide where you might spend the next 20 thousand hours is plain penny-wise, pound-foolish. If a job, ceteris paribus, pays 5% more on a $100,000 income, and doing the work assignment gives you a 50% chance of getting that 5%, it's like getting US$2,500 per year for 10 hours. It pays for itself.
Also, the assumption that forfeiting those 10h will make one financially worse off is likely wrong, especially in the long term.
Lastly, we are not talking about 10h of deciding, we are talking about 10h for a single take-home assignment - there are more companies to talk with, and decision-making time is not included in it.
Everything else regarding your argument has already been covered by other users.
Why would I go through the song and dance? The salary ranges for senior developers/architects were all within about $20K of each other. Unless the company is offering salaries in the 80th percentile, why would I jump through hoops just to work on a company’s software-as-a-service CRUD app?
My last job hunt, I found the job that met all of my criteria (salary, technology, commute distance, size of company) within two weeks of looking.
Here are the questions I'd love to hear asked of me if I asked a candidate to "create a bubble sort algorithm".
1. How many bubbles are there? (collection size and knowledge of collection data structures)
2. What sizes, and how many or percent of each size? (constants, basic math, possibly low-level statistics knowledge)
3. Do new bubbles appear/disappear, and if so do I know what size they appear as, and do bubbles change size? (new inputs to an existing set of data)
4. Since I'm sorting, would you prefer a particular end result? (a 2D or 3D structure, or something similar)
That said, we have one extremely simple exercise (a simpler FizzBuzz) just to avoid going deep with people who lack the basic skills, of whom there are many, and who were clogging our hiring queue.
For instance if someone with kids is already in a job, it might be tricky to find those extra hours.
1) Don't just memorize this list. The fact that this list is publicly available means that we know not to ask these specific questions.
2) This is a very good way to practice the types of problems that are often asked in interviews, so if you can tackle these then you should be good to go.
A lot of people on HN and in real life (I've had people get angry for asking them to write code in an interview) hate these types of interviews. There's a lot of research done on this topic and so far, it's kinda like democracy. A crappy system, except for all the others. For all the complaints about this, I have never seen anybody propose something that satisfies the long list of requirements for interview questions.
Here's the list, pasted from an HN comment I made a few years back on a similar discussion:
Objective - the process avoids human bias (this guy was in my frat, therefore is awesome).
Inclusive - someone doesn't need extensive past knowledge in a particular area of coding to do well in the interview.
Risk Averse - Avoids false positives.
Relevant - it properly tests the abilities needed for a coding job, and not irrelevant ones (like the ability to write an impressive resume).
Scales - this will be used for tens of thousands of interviews.
Easy-to-do - this will need to be done by hundreds/thousands of engineers, who would probably rather be coding.
For example, I (not seriously) propose chess puzzles for interviews then:
Objective - the process avoids human bias
Inclusive - anyone can learn chess, and it's played internationally.
Risk Averse - Avoids false positives since the puzzles are hard
Relevant - it properly tests the abilities needed for a coding job, like focus, dedication, intelligence, and preparation. Edit: I have friends in CS classes who have had to implement knight's tours and queen puzzles for class as well, so it's even CS-relevant!
Really, I think 'relevant' is the category most people contest.
Note: chess really is a great game
The big-cos are also leaving a lot of talent on the table -- which makes it silly when they talk about a "talent-shortage." But, they're also paying enough where people WILL game these interviews in order to get the job... So the best argument I've seen for why these interviews work is that it selects people who work hard at learning this type of problem solving, to get the job... which correlates well with how hard they will work at the job.
Having an "objective" process is near impossible... and in fact probably not something one should strive for. Everything we do as humans has intuition behind it, whether good or bad. I think intuition is very important for finding a maximal team, especially a small one.
"Inclusivity" is kind of weird here.. most often you'll want someone with domain knowledge to get a project going quickly.. It's definitely less important for big-cos, but you still end up interviewing for a web job, an iOS job, etc.. so domain knowledge does come into play.. at least at the big companies I've interviewed for (google, fb, lyft, uber, etc..)
I doubt this process is risk averse... you'll find people smart enough, but you learn little about their character.
Relevancy seems like a joke in your list... unless all you need is people to iterate on a small set of already solved algorithms in slightly different ways. I'd say that applies to zero companies. Except maybe leetcode, which helps people study for these tests.
The process scales well and is easy to do; I think that's its strong suit... it's mostly effective across a ton of people.
So in summary, I agree that it seems like the best known process for these big companies, but I doubt it's as risk-averse as you think, and if you truly have a talent shortage, it's an artificial shortage created by your interview process. You might need to take some risks on people if you need to grow faster...
That being said, having that big-co process is great for startups! There's a lot of exceptional talent out there that will fail at the big-cos interviews, but will be better than their engineers for your team :) (There's also exceptional talent that will succeed at those interviews too, but would rather work at a startup) It might be a matter of using your intuition to find them... it's definitely a hard game.
I think this recent data-driven trend that is spreading across all realms of companies is poisonous in some areas where it does not belong... Human intuition still wins in a lot of areas, if it's good.
> Having an "objective" process is near impossible
You're completely right here, however there are plenty of biases that can be avoided by attempting some level of objectivity.
> I doubt this process is risk averse
The risk-aversion is about avoiding hiring people that can't write code. You're right that it doesn't filter for character, which you will discover very quickly when working at larger companies! However a lot of the processes such as code review and design reviews help mitigate some of the bad character quirks that impact software. These processes are often not present at smaller companies, so the impact of character flaws is much larger there.
> Relevancy seems like a joke in your list
I'd say it's one of the most important ones on the list, especially for startups. When I was hiring for startups, I would get loads of people who could talk the talk but couldn't write a line of code. "Relevancy" means that your interview confirms that the candidate can actually code rather than just talk about it.
I haven’t really met anyone yet who could talk the talk but couldn’t write code... I’ve definitely met people who have tried... maybe I have a good BS detector, or maybe I’ve been lucky? All the people I’ve hired without doing the BST/linked list/DP problems ended up working out great. They probably still can’t do those problems, but they can code a damn good app. Guess it goes back to not caring about false negatives.
It proposes to find the missing number in a range 1, 2, 3, ..., N by computing the sum of 1+2+3+...+N and then subtracting the sum of the given numbers. It tries to use the old Gauss formula N(N+1)/2, but gets the implementation wrong:
int expectedSum = totalCount * ((totalCount + 1) / 2);
It happens to work for their sample program because they used N=5 with the array [1, 2, 4]. Had they picked N=4 with [1, 2, 4], this program will say that the missing number is 1.
I didn't get any further in their list of questions.
It's almost always presented as having the tortoise and the hare start on the same node, with the tortoise advancing one node per iteration and the hare advancing two nodes per iteration. Let's call this a (1,2) stepping version.
That suggests the first question to explore: for what positive integers n, m does an (n, m) stepping always work, when the tortoise and hare start from the same node?
A lot of people from what I've seen think that n and m have to be relatively prime, but that is not correct.
The surprising answer is that an (n, m) stepping works for any positive n and m as long as n != m.
This is surprising, because you might expect that if you were using something like a (2, 4) stepping, and the cycle length was a multiple of 2, you could get the tortoise and hare stuck in non-intersecting loops. E.g., if the cycle length is 8 and we number the nodes in the cycle 0, 1, ..., 7, and the nodes before we reach the cycle are -1, -2, -3, ..., couldn't we end up where the tortoise gets stuck looping over 0, 2, 4, 6, and the hare gets stuck out of phase with the tortoise, looping over 1, 5?
It turns out that there is no way for the hare to enter the loop at 1 without the tortoise also entering the loop at 1, which puts the tortoise on the odd nodes 1, 3, 5, 7, where it will meet the hare. This is a consequence of the tortoise and hare starting from the same node.
That suggests the second question to explore: what is required for an (n, m) stepping to guarantee it works regardless of the starting nodes of the tortoise and hare?
The answer there is the difference in speed, m-n, has to be relatively prime to the cycle length.
Note that (1, 2) is a stepping that works universally. If the tortoise and hare start at the same node, it works because 1 != 2. If they start on different nodes, it works because 2-1 = 1, and 1 is relatively prime to every possible cycle size.
But (1,2) is not the only stepping that works universally. (n, n+1) works for the same reason.
That suggests a third question to explore: are there circumstances where it is better to use an (n, n+1) stepping with n > 1 than to use (1, 2)?
You may have discovered why: it's not actually as easy as it seems when you know the answer.
There's a blog post about this somewhere, how it's not actually a great question to ask.
If I wanted to become a better programmer, what would be the best way of doing that?
In other words, if I practice this style of problem solving day in and day out, will that make me 10x more efficient at my day job? Honest question... I don't know the answer, although, I doubt it will.
First question: "How would you find the missing number in an array?"
Easy answer: If there's only one missing number, XOR every number in the array together and XOR that with your expected result, from XORing every number you expected to have been there. That's the missing number. It even works with nonconsecutive numbers, which your n(n+1)/2 solutions won't get. And for consecutive numbers, to get the XOR of the whole list from 1 to N, switch on N % 4, returning 0: N, 1: 1, 2: N+1, 3: 0.
Hard answer: I can't think of any real-world situation in which the developer couldn't solve this problem better by ensuring it never needs solving at all. Keep track of numbers that were never added, or later removed. What are the missing numbers? Return the list. Fix the problem upstream, and you won't have to recover missing information later.
Of course giving this answer WHICH SOLVES THE FUCKING PROBLEM gets me instantly disqualified for not being cumbersome enough for the liking of the rookies running the industry.
My point was that solving the problem as posed does not solve the problem inherent in posing the problem. Finding the one missing number in an array of otherwise consecutive numbers means that you were already storing N-1 numbers instead of the one number you wanted. If you stored the missing number instead, it's trivial to construct an array from 1 to N that skips over it. Save space, save time.
And if the profiler says that this problem is on a critical path in an inner loop, optimizing the heck out of it still won't help as much as unmaking the problem entirely, pulling all the cumbersome calculations out into less-used code. If the profiler says that it barely ever gets hit, then there should be zero algorithmic tricks, and the code should be optimized for human readability instead.
As an interview question, it might as well be a cryptic crossword clue, because the solutions that would be put in place by perfectly good developers are not the ones in the back of the textbook.
So now that we've set the scene, let's scale for a million numbers, rounding up to 2^20. The existing solution needs at least an extra megabyte (since you are not setting individual bits, it will be at least a byte per number), and a second pass of scanning through that entire megabyte. Fine in a web application, maybe not so much in an embedded system kernel. How can we improve?
And so we got a discussion started.
I personally would not ask that question, since it's really a bit too generic, but you get the gist: Start out with something that sets the scene (and catch unqualified people really early), and opens up discussion for more and more interesting discussions.
That's assuming a lot already. The array might have come in externally, with no control over the cadence and format it arrives in, so no way to "keep track" until you are handed the final state.
At the core, what you should do if you want to hire properly is take a break approximately 10-15 minutes into the interview. Go get a bottle of water or something. During the next 1-5 minutes, jot down on a piece of paper your initial impressions of the candidate, your suspicions and biases.
Some studies show that based on folks viewing the first five minutes of an interview, they can usually predict the outcome. One can interpret that in several ways, but the more cynical is to just say that most interviews are exercises in confirmation bias: you get some suspicion of the candidate, you spend the rest of the interview looking for evidence of your suspicions so you can feel good about your eventual predetermined decision.
If you want to do better, it's very simple: do what a scientist does and flip this on its head. Write down your suspicions and then try to disprove them. You think that this person is too egocentric for your company? Then ask them to talk about how they have worked with others, ask what they'd do if a coworker was struggling with a personal issue, try to gather evidence that they can actually be compassionate and empathetic. You think that this person cannot program their way out of a bag? Throw them a softball programming question like FizzBuzz. You think that they have encyclopedic knowledge of algorithms? OK, then ask them to implement Dijkstra's algorithm and, if they don't know what that is, a breadth-first traversal.
What people are really sick of is that people use these tools as crystal balls into their own psyche: You as an interviewer tend towards "he got QuickSort wrong!" if you subjectively didn't want him in your company, "she got QuickSort almost right, except for some finagling around the pivot!" if you subjectively did want her. What's missing is deliberateness: choose what you want to know deliberately, choose the question deliberately to satisfy some goal: either saying "yeah my hunch was wrong about that, they did pretty well" or else "no, I really tried to give them every opportunity to show me that they can code and even trying to give them the benefit of the doubt I am still unconvinced."
I have interviewed dozens of people, possibly hundreds of people.
I do get an impression within the first three questions. Some people I ask three questions covering different areas, and they knock all three questions out of the park. Some I ask three questions of and they kind of fumble around and give a very limited answer for two of the questions, and can't answer the third at all.
In my experience, I can't recall ever seeing a divergence after this. If they fumble through the first three questions, they will fumble through the rest of the interview. If they can answer the first three questions coherently and in-depth, they will be able to answer virtually all of the subsequent questions coherently and in-depth.
I'm not really confirming biases because I generally ask everyone the same questions. The only time I mix it up is if they're answering all the easy questions correctly, I start asking slightly more difficult questions which I don't ask much. I don't ask them much because if someone is stumbling over the easier questions, there's no reason to ask them tougher questions.
Also, my first technical question is almost always an incredibly easy, standard question. I just ask it so they can answer one quickly and correctly and relax a little. But maybe 20% of people botch even this.
But what I'm confused about, that I'd like to hear your opinion on, is why if you have this confidence that your first three questions contain 90% of the signal, that you spend the rest of the interview doing anything vaguely related to that? Like it sounds to me that you just told me, "I get a 90% accurate measurement in the first, say, 30% of the interview, and then I waste my time for the remaining 70% trying to up it from 90 to 98%," and that just doesn't seem like a great use of your time after that point. Can you help me understand -- why don't you pivot to something like trying to evaluate their conscientiousness or openness to learning or emotional intelligence or ownership of their previous projects and mistakes and such? Or do you mean that you ask these programming-type questions and you indirectly learn about those other dimensions of success?
Also, I'd be curious to know what you do about teachability for some of the skills -- I know that sometimes it's just "we're hiring for a senior role and we cannot accept a junior/intern at this time," but very often it seems like it's just "we're growing in general" and then you are open to saying "hey we don't have an opening for you as a mid-level developer but we would be willing to hire you on as a junior-level dev and if you are as good as you say you are we'd also consider promoting you after your 90-day-review when we've seen you really shine." Are you trying to assess the level of the role that you would offer the candidate with your questions, in other words? Or are you just judging fitness for a fixed role?
When interviewing people maybe one in six answer questions correctly and in-depth, one in six do very poorly, and the other two thirds do so-so. The two thirds are all kind of interchangeable - they can answer most simple questions in a basic manner, missing a few here and there, but can't go into much depth on things.
I would say it is kind of a strict culling thing. The purpose is to find the one in six who can answer the questions well. There's not much reason to change it up much with the twelfth interview who is doing so-so, because if I interviewed twelve people I probably just talked to two people who are a standard deviation above the person doing so-so. So I want to talk more with them, not the one who doesn't know anything in-depth.
On the interviews where they're really stumbling, sometimes I have felt like standing up after only a few questions and going back to my work. Initially I am never focused on one particularly area, I am jumping around in case the last question was in some area they have a gap in, and perhaps they have a strength in some other area. I use their resume, and ask them directly about the areas of strength, to guide this. If they tell me they are strong in an area I will certainly ask them some questions in that area. If I know early I will say no, but keep asking questions any how, I suppose I stay the minimum amount of time I can without seeming rude.
The one in six who do well, those are the ones that I "pivot to something like trying to evaluate their conscientiousness or openness to learning or emotional intelligence or ownership of their previous projects and mistakes and such". Or more broadly, I get a sense of their personality, how they will fit on the team, what their expectations are, etc. It is almost the same pattern as the technical questions: one in six will have some obvious personality quirk (they really don't listen to what we're saying at all, or they are very obnoxious, or they seem uptight beyond being nervous at an interview - or they take a pen out and start writing on different objects in the office, and then the boss comes by later, asks who wrote over everything, and asks why you didn't stop them from writing on desks, walls, machines, etc. with a pen - yes, this happened), two thirds seem normal, and one in six is very friendly and/or seems like a real go-getter with a lot of accomplishments under their belt. I'm pretty lenient at this stage, although others are less so (also, I wouldn't mind having a teammate who would gripe about features and projects they thought were BS, but maybe my team lead or manager would mind - so a pass by me for this type might be a block by them).
So in short, I don't even try to fathom their "emotional intelligence" if they're stumbling on easy questions.
Insofar as teachability, we're usually hiring for a role. They're also being interviewed by multiple people - my technical assessments of the one in six who are good usually matches my coworkers. I tend to be more lenient on personality, emotional intelligence etc., they will say no to people I say yes to. So even if I was more lax, they are not, and they won't get in. Usually there is not much leeway, we are hiring for set roles. The main leeway people get is if they know someone in the group already. If their friend works in our IT group, and they know the subject but not in-depth, we often OK them on the basis of the recommendation that this person is the type that will get up to speed.
Those are two very different questions - the second is so abstract it's essentially meaningless.
The first however - is not a bad question at all.
Note that it is not a 'brain teaser' with a specific answer a-la Microsoft's 'river crossing' type.
A similar question used in consulting is 'if Quebec separated from Canada, how would they divide the debt' although that is less mechanical.
The 'Portland mechanic' question is actually quite good, and I enjoy asking them.
The objective is simply to see how people might think creatively at arriving at an answer.
I once asked the infamous 'how many tennis balls on a 747' question - and the interviewee started calculating how many balls would fit in the area of various body parts; it was ridiculous.
You can tell a lot by how they answer such questions, and get a good grip on their common sense, however, that type of question might be more suited for business analysts.
They are good questions to just get a rough idea of how someone might think about ambiguous problems that need solving - of course, they could never be used in isolation.
FYI - these arbitrary data questions are quite valid for software. We don't expect people to write a perfect 'bubble sort' or know specifically how the various sorts work - rather - the questions just provide an opportunity for someone to work through a simple problem - and then code the solution.
A crazy number of people have trouble with basic things, it's important to weed them out. Smart, coherent people will shine in these scenarios and it will generally be clear.
I guess the implication here is that this was a bad approach? This is why I would be skeptical of this sort of question, it's really not clear to me why you think that answer is any more ridiculous than the question.
In fact, to the extent that it's novel to the candidate and they haven't heard it before - it's an excellent question.
It requires someone to actually think a little bit, to reason about an unknown scenario, and to try to apply intelligence to an otherwise undiscovered set of variables.
Do they know the formula for the volume of a sphere? Will they consider how balls might fit in a lattice? Can they estimate how many balls fit in a cubic meter? Can they reason about how large a plane actually is? Can they even do basic math?
Can they make rough estimates, piece everything together and come up with a rough estimate? Can they parameterize the estimate?
There are any number of tangents to move along and consider.
Watching someone answer this question can tell you quite a lot about how they think, reason and communicate.
There are hardly any circumstances in business that have clear data points, clear outcomes, or perfectly clear objectives, and unfortunately our current interviewing paradigms generally don't account for that.
The best question I was ever asked was during a McKinsey interview: "it's 1965, Sweden is going to switch from driving on the left, to the right side of the road, you have 2 years until this day, how do you plan it?" ... which leads to some amazing discussions. It's definitely a good consulting question.
But I came to know a lot of professional mathematicians in that time, and they are people with very big minds, extraordinarily creative -- but several of them would potentially struggle on these sorts of questions (indeed some of them would struggle with arithmetic, having long since just unloaded that burden onto calculators and computers to free their brain for more interesting thoughts) simply because they're not used to reasoning about uncertainty. They are likely to simultaneously be very good hires, and to only give you the barest of lower bounds. "I mean, it's at least 30 because I can fit 30 in a carry-on. Oh, but there could be a hundred seats and a hundred carry-ons, so I guess it's at least 3000." "It's, uh, it's way more than that." "Right, I said 'at least'. I mean we could probably stuff at least two carry-ons in each seat, so that gets us towards a better lower bound of 9000 or 10,000?" "Yeah, uh, still much much more."
I know other folks who would make great junior or mid-level developers who never had any formal education and wouldn't be comfortable approaching this either from the "let me look up or estimate volumes and recall packing densities" or the "I can give you lower bounds based on the things I am absolutely sure about" approaches. They wouldn't necessarily know, for example, that one might just say "okay let me look up the size of a 747" in an interview, that such a thing is probably "within the rules" of what the question expects -- unless prompted, and even then.
Let me put this a different way: do you think responses to that question would improve if, before the interview, you walked the candidate through a similar problem for ten minutes? I would imagine they would, but maybe you ask it in a way that already provides for that. The problem is that if people can do substantially better with a ten-minute nudge, then you are really just testing for who has already had the sort of life experience that gives them that ten-minute nudge, and that doesn't seem like an "excellent question" to me.
I think those sites you mentioned only get you to the point of actually having your skills vetted. It's not a solution to the proposed problem of interviewing.
It's those who genuinely believe they're good who do well in an interview, because they have been working as a "developer" and can talk the talk. But as soon as they have to write a piece of code beyond a basic CRUD application, you're basically code reviewing a bowl of spaghetti.
Google one of their variable names, and guess what site pops up.
And then there's the ability to debug something - they'll either pester other people, stare blankly at the screen, or start asking on forums.
I know this may sound bad, and some will dislike this, but: no matter how much "experience" or training, some people just don't have what it takes.
edit, to add:
Yes, I've worked with, recommended, and hired a few of these. That is why I now insist that I see some code first. Take it home, during interview, or github - whatever's best.
I'll also give them a piece of sample code to understand and improve. No brain-teasers or gotchas, just something I believe someone with their experience should easily handle, accounting for interview pressure.
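A hypothetical example of such an exercise — both the snippet and the expected improvement are my own illustration, not the actual code used. The candidate is shown a deliberately quadratic duplicate-removal and asked to explain it and speed it up:

```python
# The snippet handed to the candidate: correct, but O(n^2),
# because `x not in result` scans a list on every iteration.
def dedupe_slow(items):
    result = []
    for x in items:
        if x not in result:
            result.append(x)
    return result

# One improvement a candidate might offer: track seen values in a
# set for O(1) membership tests, keeping first-seen order. O(n) overall.
def dedupe_fast(items):
    seen = set()
    result = []
    for x in items:
        if x not in seen:
            seen.add(x)
            result.append(x)
    return result

print(dedupe_fast([3, 1, 3, 2, 1]))  # → [3, 1, 2]
```

The value of an exercise like this is less the final answer than whether the candidate can articulate *why* the original is slow.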
If I scare off some prima-donnas, then that's a bonus.
We've interviewed developers who are amazing on paper, but can't answer simple questions that are mainly around enumerating a collection of integers.
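For concreteness, here is a hypothetical question of the sort described — the function name and the exact task are my own illustration, not the interviewer's actual question: walk a collection of integers once and report the sum of the evens and the count of the odds.

```python
# A simple "enumerate a collection of integers" exercise:
# one pass, two accumulators.
def summarize(values):
    even_sum = 0
    odd_count = 0
    for v in values:
        if v % 2 == 0:
            even_sum += v
        else:
            odd_count += 1
    return even_sum, odd_count

print(summarize([1, 2, 3, 4, 5]))  # → (6, 3)
```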
I'd say communication is something we look for.
Can you articulate why solution x is better than solution y?
What are the compromises? Do you need to ask for more information?
So many candidates don't know an interview is an invitation to a conversation. You need to be able to engage with the people who are interviewing you. Ask questions to clarify if required.
The key advice I'd give developers is to be humble. I've met and worked with developers who are very talented but also difficult to work with because of their personalities, so we filter for this as well.
I've been interviewing software engineer/dev candidates for 10+ years, and I would be extremely concerned if these kinds of questions made or broke a candidate on their own. Obviously if someone couldn't figure out any of these, they'd be a bad hire - but I've found simply discussing work/projects people have done before, peeling the onion layer by layer, to be a far more effective way of determining skill/fit/critical thinking skills.
The point of the interview is to determine if they are in fact a 'solid candidate'!
Also, sometimes very experienced engineers have difficulty thinking, communicating and writing code coherently even if they can 'discuss' project issues in detail.
Here's a famous and interesting article on that 
I’m not going to hire someone who couldn’t solve one of these problems if they absolutely needed to: these are pretty basic computer science problems at the end of the day. I’m looking for problem-solving capability, not rote memorization.
But at the same time I’m not going to ask them, because they are condescending; it’s like asking someone with a math degree who is interviewing for a teaching job to explain basic algebra, which would be absurd. And if they really couldn’t do basic algebra, it would come out through other lines of questioning.
It's something that virtually everyone knows, and it tests whether they have a very solid command of the basics as a foundation to build more complex things on.
That's not how you look for developers with passion: people who care about you as a customer and want to solve your problems the best way possible.
That's how you hire code monkeys who produce subpar code and don't care about their job.
Actually, no. Though problems like these do (occasionally) occur IRL, the conditions under which you're expected to solve them in interviews ("on the spot, while I stare and grin at you, and occasionally interrupt you - and your result better be near-perfect, or GTFO!") have basically no relation to real-life development work.
By asking rote questions like this, you're much more likely to drive away people who are good team builders, as the lack of any focus on team cohesion will turn a lot of them off.