Coding Problems from Programming Job Interviews (java67.com)
178 points by javinpaul 29 days ago | 85 comments



I personally sat next to my boss in some of our few "classic" job interviews. Questions like the ones mentioned in this article were never brought up, because it would've felt extremely unnatural, even ridiculous. We usually took day-to-day real-world problems that weren't abstract and asked the candidates how they would approach them.

Some of the best software engineers I have seen in my life were not trained computer scientists and would've failed many of the questions presented in this article. Not because they weren't highly intelligent, but because terms like "bubble sort" meant nothing to them.

E: The company had around 10 employees. "Interview" processes often went like this:

1) Establish first mail contact (either because they are applying for a job, or because the company found out about an interesting tool they wrote, or met them somewhere, or...)

2) Discuss the candidate's skills, decide whether the skillset fits the company's needs

3) If so, take a programming job fitting their skillset (usually some low-priority ticket) and give it to them as a freelance job

4) If delivered in good quality, pay them and discuss future job options. If delivered in bad quality, pay them and don't discuss job options. If not delivered, ignore.

Which I always thought was a very reasonable approach. Very often, the people weren't even interested in a job, but wanted to continue to work as freelancers.


While working on an embedded team, the basic CS questions were incredibly relevant.

No libraries meant we had to manage our own data structures. We had our own UI system, so understanding how to walk trees and manipulate them efficiently was crucial. Removing duplicates from an array in an efficient manner, important.

Linked list stuff? Oh yeah, very important. Linked lists underlie a lot of basic OS data structures.

We did have real world problems in the interview mix as well. We'd give an underlying low level API and ask how it could be used to construct a more user friendly wrapper API. We'd give examples of in memory layout of objects and ask how that layout could be improved.

At the end of the day, we wanted to hire people who could solve problems, but also sit down and have a good discussion about how to solve problems.

FWIW, among junior employees it was always possible to tell the engineers who had gone through CS from those who hadn't.

A good decade of self-study of course bridges the gap, but right out of college? I've seen very smart people, good coders, who didn't realize they were working with relational data and over-complicated their solutions.

I also saw people trained as game programmers write mind blowingly fast code that computer scientists would be hard pressed to understand without a fair bit of study.

I had to spend a fair bit of time explaining isomorphic transforms of code/data structures/concepts, though. Game programmers pick up on that one quick: "like a matrix, but instead provable facts about software."


What was the percentage of non-CS grads applying for an embedded systems job, anyway? Maybe I'm wrong, but if you're applying for embedded as a non-CS grad, you either know what you're applying for and know your stuff, or are absolutely clueless in general.


Hiring for embedded UI jobs is hard. I went out and recruited from schools that teach video game programming. The skill sets overlap a lot.

I also wasn't allowed to tell people what job they were interviewing for. That made things extra fun!

Teaching people who had spent their life in tight highly performance optimized C++ how to work in embedded wasn't that hard.

Except for the "so, now we measure RAM in kilobytes!" part.

But, there is a reason the Microsoft Band had real alpha blending, transparency effects, a simple particle engine, and TrueType fonts, all with a 120 MHz CPU and 256 KB of SRAM (and a few more megs of super slow RAM sitting off to the side for cache).


> Not because they weren't highly intelligent, but because terms like "bubble sort" meant nothing to them.

That's fine, but understand that they often would not be a fit. If the position is low-level systems programming, for example, not knowing the term "bubble sort" indicates that they will have trouble getting up to speed in a modern, complex environment, even if they are very smart. The same goes if the position is, e.g., to work on a database, i.e. its actual implementation.

Solid knowledge of both the abstract and the specific aspects of computer science and engineering is a prerequisite for a lot of programming jobs, especially where you work on things that make up the environment itself rather than the applications that run on it (though this is sometimes true for those, too).


I think most highly intelligent people can learn concepts that are part of undergrad degrees pretty easily. Let's face it, plenty of regular people don't have much issue either. Especially if this lack of minutiae was combined with a CV that got them into the interview room for a role that required it in the first place. Gaps in knowledge are about the easiest thing to fix.


> give it to them as a freelance job

I've always wondered why this isn't more common. It's win-win to be able to test the waters in this way. What downsides am I not seeing?


Difficult to hire people who already have a job and aren't interested in moonlighting.


In addition to what the other commenters wrote, take-home assignments are also negative for diversity as they pre-select against candidates with less free time, which likely includes single parents and people from low-income backgrounds.


Gotta be honest about this. You are applying to a job of 40 hours a week and a six-figure salary, and you can't make time for 10 hours of work?


You might currently be working a job that is 50 hours a week -- or 60! Also with the example of a single parent they often can't make room for yet another 10 hours of a work in a week because they need to spend time with their children/take them to activities/cook dinner etc. Manageable when you are working 40 hours a week but it quickly becomes unreasonable. To me a job that wants me to work for them before I have an actual job offer is a red flag. I don't even work for them yet and they don't respect my time. What's it going to be like when I do work for them?


Sorry, this triggers no empathy in me. This is allegedly a workforce that requires years of study, training, and practice.

Being unable to work 10 hours to prove it shows you are a pauper; all the other excuses are complacency.


Quitting your job or dropping some other responsibilities might be reasonable for some people who can live off of their parents' money or their own savings, but for many people that means losing their home, their car, or even their children. Indigo945 isn't making excuses for them or himself; he's telling you that for them, making that 10 hours would cost far more than the new job might give.

Also, if anyone should be able to make another 10 hours in their week to work in tech, why not, say, Steve who works at $BigTechCo? Now I know we just asked Steve for another 10 hours, but let's go ask him for another 10. No, the circumstances shouldn't matter; he's making six figures after all. Let's ask him again. And again. And again. Oh, Steve quit? What a pauper.


The same can be said for requiring candidates to have a GitHub or using a college degree as a filter. Not everyone has the luxury to take out tens of thousands in debt and spend 4+ years of their life learning instead of earning.

So unless you advocate for completely blind interviews that do not rely on a resume and only take up 3 hours of someone's day, your point is moot.


It's a part of hiring someone to have them send in a resume and verify that they have the skills needed to do the job. I contend that it is not fair to disqualify candidates because they cannot meet arbitrary and unreasonable demands for their time. A phone interview is almost always reasonable. A 10-hour work sample due this week would often be unreasonable. It depends a lot on context, and conanbatt discarded that.


> Indigo945 isn't making excuses for them or himself; he's telling you that for them to make that 10 hours would cost them far more than the new job might give.

That's not a "people can't do 10 hours", it's a "they don't want to", which is perfectly reasonable. The company that gives 10 hours of work had better be appealing.

But saying that take-home assignments are bad because they take time people can't afford to give is complete BS.


I have trouble believing you aren't being facetious, trolling, or willfully ignorant. If you've spent any sort of time in this industry at all, you know that people apply to many places at once. Sure, one 10-hour assignment isn't much, but although you seem to live in a fantasy world where people only apply to (and receive) one "job application homework assignment" at a time, that's not actually how the real world works. Companies that tend to pull such nonsense also tend to have poor work cultures. I suppose it self-selects for people who don't have a family or out-of-work commitments, but maybe that's what they're going for.


I could put aside ten hours for an opportunity and not consider it an issue.

Applying ten places and getting 100 hours of homework knowing there's a good chance of ten businesses ghosting? I have plenty of empathy for people who have to make that choice.


I like that rhetorical tactic where people who don't bend to your wishes due to having better options are surely complacent paupers.

It is interesting to watch, but not convincing as an argument.

What about: I am not doing that, because there are places with the same salary and more consideration for the people who work there.


There is no bending; there is an ask, and a response of "the ask is impossible to fulfill", which it isn't. It's dishonest, but worse, it's dumb, because it's obviously false and counterproductive.

Not spending 10 hours to decide where you might spend the next 20 thousand is plain penny wise, pound foolish. If a job, ceteris paribus, pays 5% more on a $100,000 income, and doing the work assignment gives you a 50% chance of getting that 5%, it's like getting US$2,500 per year for 10 hours. It pays for itself.


The kind of place that treats 10 hours this way is the kind of place that will treat you badly as an employer. The same attitude will be applied in all aspects. The same very real considerations will be treated as dishonest excuses later on. Expect regular, long crunch and no control over overtime.

Also, the assumption that forfeiting those 10 hours will make one financially worse off is likely wrong, especially in the long term.

Lastly, we are not talking about 10 hours of deciding; we are talking about 10 hours on a single take-home assignment - there are more companies to talk with, and decision-making time is not included in it.


Then it works as great signaling; it's just not a match.


Yeah, well, except for the "complacent paupers" part, and the dishonest part where very real considerations and decision-making are cast as something shameful that makes you less capable. That likely serves to tilt the culture toward not speaking up about these things openly.


Or you know your worth, and as soon as you change your status to “actively looking” on LinkedIn, you’ll have job opportunities flooding your mailbox.


If you are applying to only one job, you are either in a position where you don't have to do silly homework exercises (i.e. you already have a foot in the door) or you're not doing job searches right. For most people, it is far more reasonable to assume that they might apply to 3-5 jobs at once, and going by your estimate of 10 hours of work per job prospect, they would need to take an entire week off just to get to the point where they're offered an interview.

Everything else regarding your argument has already been covered by other users.


When I was applying for a job about 3 years ago, I scheduled six phone screens from different companies in one day. They all led to invitations for in person interviews, and all four in person interviews I ended up doing led to “40 hour a week, six figure jobs”. I’m no special snowflake - except for my network of local recruiters.

Why would I go through the song and dance? The salary ranges for senior developers/architects were all within $20K of each other. Unless the company is offering salaries in the 80th percentile, why would I jump through hoops just to work on a company's software-as-a-service CRUD app?

My last job hunt, I found the job that met all of my criteria (salary, technology, commute distance, size of company) within two weeks of looking.


Lots of good arguments against here. But assuming this assignment is in lieu of the usual 6+ hour whiteboard exam, then I say bring it on -- it's a lot less intrusive to my schedule.


You're basically limiting your skill pool to those without a job. Or a family. Certainly not both.


If asking about a bubble sort during an interview I'm giving, I'd much prefer to hear questions come back about sorting bubbles than to have some person who's memorized popular algorithms blindly give me back a recreation of cookie-cutter code that can be looked up from any source.

Here are the questions I'd love to hear asked of me if I asked the candidate to "create a bubble sort algorithm":

1. How many bubbles are there? (collection size and knowledge of collection data structures)

2. What sizes, and how many or percent of each size? (constants, basic math, possibly low-level statistics knowledge)

3. Do new bubbles appear/disappear, and if so do I know what size they appear as, and do bubbles change size? (new inputs to an existing set of data)

4. Since I'm sorting, would you prefer a particular end result? (2D or 3D structure, or similar)


This just converts the problem from useless bullshit of the Cracking the Coding Interview variety into useless bullshit of the “how many ping pong balls can fit in a 747” hedge fund variety.


Same; one really needs to engage the interviewee to see how they solve day-to-day problems, not just get a packaged answer to closed questions.

That said, we have one extremely simple exercise (a simpler FizzBuzz) just to avoid going in deep with people who lack the basic skills, of whom there are many, and who were clogging our hiring queue.


That's a very sane approach, it's one of the only ways I've seen to get an actual good work sample out of someone that doesn't exploit either party - good on you guys!


Makes sense but it selects against people with high opportunity costs, who are often the best ones.

For instance if someone with kids is already in a job, it might be tricky to find those extra hours.


A take from someone who asks these types of questions in interviews on a regular basis:

1) Don't just memorize this list. The fact that this list is publicly available means that we know not to ask these specific questions.

2) This is a very good way to practice the types of problems that are often asked in interviews, so if you can tackle these then you should be good to go.

A lot of people on HN and in real life (I've had people get angry for asking them to write code in an interview) hate these types of interviews. There's a lot of research done on this topic and so far, it's kinda like democracy. A crappy system, except for all the others. For all the complaints about this, I have never seen anybody propose something that satisfies the long list of requirements for interview questions.

Here's the list, pasted from an HN comment I made a few years back on a similar discussion:

Objective - the process avoids human bias (this guy was in my frat, therefore is awesome).

Inclusive - someone doesn't need extensive past knowledge in a particular area of coding to do well in the interview.

Risk Averse - Avoids false positives.

Relevant - it properly tests the abilities needed for a coding job, and not irrelevant ones (like the ability to write an impressive resume).

Scales - this will be used for tens of thousands of interviews.

Easy-to-do - this will need to be done by hundreds/thousands of engineers, who would probably rather be coding.


While those are good traits in general, they don't narrow down on good interview questions very well.

For example, I (non seriously) propose chess puzzles for interviews then:

Objective - the process avoids human bias

Inclusive - anyone can learn chess, and it's played internationally.

Risk Averse - Avoids false positives since the puzzles are hard

Relevant - it properly tests the abilities needed for a coding job, like focus, dedication, intelligence, and preparation. Edit: I have friends in CS classes who have had to implement knight's tours and queens puzzles for class as well, so it's even CS-relevant!

Scales - this will be used for tens of thousands of interviews.

Easy-to-do - this will need to be done by hundreds/thousands of engineers, who would probably rather be coding.

Really, I think 'relevant' is the category most people contest.

Note: chess really is a great game


Leadership ability, communication ability, team-work ability are probably more important factors than algorithmic ability in getting a project done, depending on the team. It seems the big-cos have figured that stuff out enough in order to make the development process as much of a meat-market as possible. However, these interviews would probably not select a maximal team for a startup.

The big-cos are also leaving a lot of talent on the table -- which makes it silly when they talk about a "talent-shortage." But, they're also paying enough where people WILL game these interviews in order to get the job... So the best argument I've seen for why these interviews work is that it selects people who work hard at learning this type of problem solving, to get the job... which correlates well with how hard they will work at the job.

Having an "objective" process is near impossible... and in fact probably not something one should strive for. Everything we do as humans has intuition behind it, whether good or bad. I think intuition is very important for finding a maximal team, especially a small one.

"Inclusivity" is kind of weird here... most often you'll want someone with domain knowledge to get a project going quickly. It's definitely less important for big-cos, but you still end up interviewing for a web job, an iOS job, etc., so domain knowledge does come into play... at least at the big companies I've interviewed for (Google, FB, Lyft, Uber, etc.).

I doubt this process is risk averse... you'll find people smart enough, but you learn little about their character.

Relevancy seems like a joke in your list... unless all you need is people to iterate on a small set of already solved algorithms in slightly different ways. I'd say that applies to zero companies. Except maybe leetcode, which helps people study for these tests.

The process scales well and is easy to do; I think that's its strong suit... it's mostly effective across a ton of people.

So in summary, I agree that it seems like the best known process for these big companies. but I doubt it's as risk-averse as you think, and if you truly have a talent shortage, it's an artificial shortage created by your interview process. You might need to take some risks on people if you need to grow faster...

That being said, having that big-co process is great for startups! There's a lot of exceptional talent out there that will fail at the big-cos interviews, but will be better than their engineers for your team :) (There's also exceptional talent that will succeed at those interviews too, but would rather work at a startup) It might be a matter of using your intuition to find them... it's definitely a hard game.

I think this recent data-driven trend that is spreading across all realms of companies is poisonous in some areas where it does not belong... Human intuition still wins in a lot of areas, if it's good.


You're right, a lot of these metrics aren't as useful for startups and small companies as they are for bigcos.

> Having an "objective" process is near impossible

You're completely right here, however there are plenty of biases that can be avoided by attempting some level of objectivity.

> I doubt this process is risk averse

The risk-aversion is about avoiding hiring people that can't write code. You're right that it doesn't filter for character, which you will discover very quickly when working at larger companies! However a lot of the processes such as code review and design reviews help mitigate some of the bad character quirks that impact software. These processes are often not present at smaller companies, so the impact of character flaws is much larger there.

> Relevancy seems like a joke in your list

I'd say it's one of the most important ones on the list, especially for startups. When I was hiring for startups, I would get loads of people who could talk the talk but couldn't write a line of code. "Relevancy" means that your interview confirms that the candidate can actually code rather than just talk about it.


Interesting on the relevancy standpoint. As an iOS developer, I had to learn a whole new set of skills for the interviews... I can count on one hand the number of times I needed to use techniques from the algorithm questions. I feel like the skills around making large-scale apps are more about being able to reason about large projects and keeping everything clean and tidy.

I haven't really met anyone yet who could talk the talk but couldn't write code... I've definitely met people who have tried... maybe I have a good BS detector, or maybe I've been lucky? All the people I've hired without doing the BST/linked list/DP problems ended up working out great. They probably can't do those problems still, but they can code a damn good app. Guess it goes back to not caring about false negatives.


The given answer to #1 is incorrect.

It proposes to find the missing number in a range 1, 2, 3, ..., N by computing the sum of 1+2+3+...+N and then subtracting the sum of the given numbers. It tries to use the old Gauss formula Nx(N+1)/2, but gets the implementation wrong:

    int expectedSum = totalCount * ((totalCount + 1) / 2);
In Java, of course, "/" on integers throws away the remainder, so if you happen to use this for an even number (like 100, as in the original problem), you'll get 5000 rather than 5050.

It happens to work for their sample program because they used N=5 with the array [1, 2, 4]. Had they picked N=4 with [1, 2, 4], this program will say that the missing number is 1.
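A corrected sketch (keeping the snippet's variable names; the surrounding class and the rest of the method are assumed from context) multiplies before dividing, so the truncation never happens: one of totalCount and totalCount + 1 is always even, making the division exact.

```java
public class MissingNumber {
    // Find the single missing number in 1..totalCount given the others.
    static int findMissing(int[] numbers, int totalCount) {
        // Multiply before dividing: one of totalCount and totalCount + 1
        // is even, so the division is exact. Widen to long against overflow.
        long expectedSum = (long) totalCount * (totalCount + 1) / 2;
        long actualSum = 0;
        for (int n : numbers) {
            actualSum += n;
        }
        return (int) (expectedSum - actualSum);
    }

    public static void main(String[] args) {
        // The parent comment's failing case: N=4 with [1, 2, 4].
        System.out.println(findMissing(new int[]{1, 2, 4}, 4)); // prints 3
    }
}
```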

I didn't get any further in their list of questions.


It includes the cycle detection in a linked list problem, and even has an illustration of the classic algorithm of Floyd using the tortoise and hare. It's kind of fun and interesting to look at what is actually required for that algorithm to work.

It's almost always presented as having the tortoise and the hare start on the same node, with the tortoise advancing one node per iteration and the hare advancing two nodes per iteration. Let's call this a (1,2) stepping version.

That suggests the first question to explore: for what positive integers n, m does an (n, m) stepping always work, when the tortoise and hare start from the same node?

A lot of people from what I've seen think that n and m have to be relatively prime, but that is not correct.

The surprising answer is that an (n, m) stepping works for any positive n and m as long as n != m.

This is surprising, because you might expect that if you were using something like a (2, 4) stepping, and the cycle length was a multiple of 2, you could get the tortoise and hare stuck in non-intersecting loops. E.g., if the cycle length is 8 and we number the nodes in the cycle 0, 1, ..., 7, and the nodes before we reach the cycle are -1, -2, -3, ..., couldn't we end up where the tortoise gets stuck looping over 0, 2, 4, 6, and the hare gets stuck out of phase with the tortoise, looping over 1, 5?

It turns out that there is no way for the hare to enter the loop at 1 without the tortoise also entering the loop at 1, which puts the tortoise on the 1, 3, 5, 7, and it will meet the hare. This is a consequence of the tortoise and hare starting from the same node.

That suggests the second question to explore: what is required for an (n, m) stepping to guarantee it works regardless of the starting nodes of the tortoise and hare?

The answer there is the difference in speed, m-n, has to be relatively prime to the cycle length.

Note that (1, 2) is a stepping that works universally. If the tortoise and hare start at the same node, it works because 1 != 2. If they start on different nodes, it works because 2-1 = 1, and 1 is relatively prime to every possible cycle size.

But (1,2) is not the only stepping that works universally. (n, n+1) works for the same reason.

That suggests a third question to explore: are there circumstances where it is better to use an (n, n+1) stepping with n > 1 than to use (1, 2)?
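For anyone who wants to experiment with these questions, here is a minimal sketch of the generalized (n, m) stepping on a bare-bones singly linked list (the Node class and method names are made up for illustration; both pointers start at the head, which per the argument above is what makes any n != m sufficient):

```java
public class CycleDetect {
    static class Node {
        Node next;
    }

    // Advance a pointer k steps, or return null if the list ends first.
    static Node advance(Node p, int k) {
        for (int i = 0; i < k && p != null; i++) {
            p = p.next;
        }
        return p;
    }

    // Generalized Floyd detection with an (n, m) stepping.
    // Requires n != m; equal speeds from the same start never diverge.
    static boolean hasCycle(Node head, int n, int m) {
        if (n == m) {
            throw new IllegalArgumentException("speeds must differ");
        }
        Node tortoise = head, hare = head;
        while (true) {
            tortoise = advance(tortoise, n);
            hare = advance(hare, m);
            if (tortoise == null || hare == null) {
                return false; // fell off the end: no cycle
            }
            if (tortoise == hare) {
                return true;
            }
        }
    }
}
```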


The linked list cycle problem was an open question for many years in the mid 20th century.

You may have discovered why: it's not actually as easy as it seems when you know the answer.

There's a blog post about this somewhere, how it's not actually a great question to ask.


It's an interesting thought to approach these types of coding questions from a different angle -

If I wanted to become a better programmer, what would be the best way of doing that?

In other words, if I practice this style of problem solving day in and day out, will that make me 10x more efficient at my day job? Honest question... I don't know the answer, although, I doubt it will.


A company that asks these types of questions has likely never critically examined its own interview process. For every last one of them, my first thought was "why was this data structure chosen, and could this specific problem be solved architecturally rather than algorithmically?"

First question: "How would you find the missing number in an array?"

Easy answer: If there's only one missing number, XOR every number in the array together, and XOR that with your expected result from XORing every number you expected to be there. That's the missing number. It even works with nonconsecutive numbers, which your n(n+1)/2 solutions won't handle. And for consecutive numbers, to get the XOR of the whole list from 1 to N, switch on N % 4, returning: 0 → N, 1 → 1, 2 → N+1, 3 → 0.

Hard answer: I can't think of any real-world situation in which the developer couldn't solve this problem better by ensuring it never needs solving at all. Keep track of numbers that were never added, or later removed. What are the missing numbers? Return the list. Fix the problem upstream, and you won't have to recover missing information later.
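The XOR approach described above can be sketched as follows (assuming exactly one number from 1..n is missing; the names are illustrative):

```java
public class XorMissing {
    // XOR of 1..n in O(1), using the period-4 pattern the comment cites.
    static int xorUpTo(int n) {
        switch (n % 4) {
            case 0: return n;
            case 1: return 1;
            case 2: return n + 1;
            default: return 0; // n % 4 == 3
        }
    }

    // XOR everything present against the XOR of the full range;
    // values present in both cancel, leaving the one missing number.
    static int findMissing(int[] present, int n) {
        int x = xorUpTo(n);
        for (int v : present) {
            x ^= v;
        }
        return x;
    }
}
```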


Fucking Christ, both the OP's answer and logfromblammo's one are all that's wrong with this industry: really zero field experience and requiring the academic, PhD-thesis level clever fucking answer. A seasoned professional would use a fucking map and get the job done. No XOR tricks, no clever arithmetic, no niche BitSet crap. FOR i=1 to 100 occurences_map[x[i]]++. FOR i=1 to 100 if occurences_map[i] == 0, found the fucking number.

Of course giving this answer WHICH SOLVES THE FUCKING PROBLEM gets me instantly disqualified for not being cumbersome enough for the liking of the rookies running the industry.
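Tone aside, the occurrence-map approach described here is quick to write down; a sketch using a plain boolean array, since the value range 1..n is known (the names are illustrative):

```java
public class SeenArray {
    // The "just use a map" approach: mark what's present, then scan
    // for the gap. Two linear passes, O(n) extra space, no tricks.
    static int findMissing(int[] numbers, int n) {
        boolean[] seen = new boolean[n + 1];
        for (int v : numbers) {
            seen[v] = true;
        }
        for (int i = 1; i <= n; i++) {
            if (!seen[i]) {
                return i;
            }
        }
        return -1; // nothing missing
    }
}
```

Unlike the sum and XOR tricks, this version also generalizes to several missing numbers with no changes beyond collecting every gap instead of returning the first.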


I thought about including the answer that doesn't require branching, but didn't, because it sounded too fucking clever even just proofreading it. The N(N+1)/2 and summation is the answer they probably expect. Gimmicky XOR tricks set you slightly ahead of the pack. Magical bit-twiddling answers and branch elimination make you look pretentious, unless you're literally Carmack.

My point was that solving the problem as posed does not solve the problem inherent in posing the problem. Finding the one missing number in an array of otherwise consecutive numbers means that you were already storing N-1 numbers instead of the one number you wanted. If you stored the missing number instead, it's trivial to construct an array from 1 to N that skips over it. Save space, save time.

And if the profiler says that this problem is on a critical path in an inner loop, optimizing the heck out of it still won't help as much as unmaking the problem entirely, pulling all the cumbersome calculations out into less-used code. If the profiler says that it barely ever gets hit, then there should be zero algorithmic tricks, and the code should be optimized for human readability instead.

As an interview question, it might as well be a cryptic crossword clue, because the solutions that would be put in place by perfectly good developers are not the ones in the back of the textbook.


No, it does not get you "instantly disqualified". I think if an interviewer posed that question to you exactly as stated, namely with a 100 numbers, they would tell you that you did a good job and solved the problem in a perfectly reasonable way.

So now that we've set the scene, let's scale for a million numbers, rounding up to 2^20. The existing solution needs at least an extra megabyte (since you are not setting individual bits, it will be at least a byte per number), and a second pass of scanning through that entire megabyte. Fine in a web application, maybe not so much in an embedded system kernel. How can we improve?
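One way to tighten that second pass is sketched below with `java.util.BitSet`, so presence costs one bit per value (roughly 128 KB for 2^20 numbers instead of a megabyte); this is an illustration, not necessarily the answer an interviewer expects.

```java
import java.util.BitSet;

public class BitSetMissing {
    // Mark presence one bit per number, then ask for the first gap.
    static int findMissing(int[] numbers, int n) {
        BitSet seen = new BitSet(n + 1);
        for (int v : numbers) {
            seen.set(v);
        }
        int gap = seen.nextClearBit(1); // first unset index >= 1
        return gap <= n ? gap : -1;     // -1: nothing missing
    }
}
```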

And so we got a discussion started.

I personally would not ask that question, since it's really a bit too generic, but you get the gist: Start out with something that sets the scene (and catch unqualified people really early), and opens up discussion for more and more interesting discussions.


> I can't think of any real-world situation in which the developer couldn't solve this problem better by ensuring it never needs solving at all.

That's assuming a lot already. The array might have come in externally, with no control over the cadence and format it arrives in, so no way to "keep track" until you are handed the final state.


Your hard answer answers the question by asking it again, or by assuming that something is wrong with needing a solution.


So it's almost HN cliche to complain about these sorts of questions, and that's a fairly earned failure, but I just want to reiterate that these sorts of questions are one or two orders of magnitude more meaningful than the nebulous questions you sometimes see, "estimate how many auto mechanics serve Portland" or "I shrink you to the size of a pea and throw you in a blender, how will you survive?" or the like.

At the core, what you should be doing if you want to hire properly, is to take a break approximately 10-15 minutes into the interview. Go get a bottle of water or something. During the next 1-5 minutes, jot down on a piece of paper your initial impressions of the candidate, your suspicions and biases.

Some studies show that based on folks viewing the first five minutes of an interview, they can usually predict the outcome. One can interpret that in several ways, but the more cynical is to just say that most interviews are exercises in confirmation bias: you get some suspicion of the candidate, you spend the rest of the interview looking for evidence of your suspicions so you can feel good about your eventual predetermined decision.

If you want to do better, it's very simple: do what a scientist does and flip this on its head. Write down your suspicions and then try to disprove them. You think that this person is too egocentric for your company? Then ask them to talk about how they have worked with others, ask what they'd do if a coworker was struggling with a personal issue, try to gather evidence that they can actually be compassionate and empathic. You think that this person cannot program their way out of a bag? Throw them a softball programming question like FizzBuzz. You think that they have encyclopedic knowledge of algorithms? OK, then ask them to implement Dijkstra's algorithm, and if they don't know what that is, a breadth-first traversal.

What people are really sick of is that people use these tools as crystal balls into their own psyche: You as an interviewer tend towards "he got QuickSort wrong!" if you subjectively didn't want him in your company, "she got QuickSort almost right, except for some finagling around the pivot!" if you subjectively did want her. What's missing is deliberateness: choose what you want to know deliberately, choose the question deliberately to satisfy some goal: either saying "yeah my hunch was wrong about that, they did pretty well" or else "no, I really tried to give them every opportunity to show me that they can code and even trying to give them the benefit of the doubt I am still unconvinced."


> Some studies show that based on folks viewing the first five minutes of an interview, they can usually predict the outcome. One can interpret that in several ways

I have interviewed dozens of people, possibly hundreds of people.

I do get an impression within the first three questions. Some people I ask three questions covering different areas, and they knock all three questions out of the park. Some I ask three questions of and they kind of fumble around and give a very limited answer for two of the questions, and can't answer the third at all.

In my experience, I can't recall ever seeing a divergence after this. If they fumble through the first three questions, they will fumble through the rest of the interview. If they can answer the first three questions coherently and in-depth, they will be able to answer virtually all of the subsequent questions coherently and in-depth.

I'm not really confirming biases because I generally ask everyone the same questions. The only time I mix it up is if they're answering all the easy questions correctly, in which case I start asking slightly more difficult questions that I don't get to use often - because if someone is stumbling over the easier questions, there's no reason to ask them tougher ones.

Also, my first technical question is almost always an incredibly easy, standard question. I just ask it so they can answer one quickly and correctly and relax a little. But maybe 20% of people botch even this.


Now that's very interesting. I think you're right that within those first questions you do get some sort of perception about technical capacity in some raw form. Maybe it's even worth not trying to disprove that hypothesis by asking a separate question.

But what I'm confused about, that I'd like to hear your opinion on, is why if you have this confidence that your first three questions contain 90% of the signal, that you spend the rest of the interview doing anything vaguely related to that? Like it sounds to me that you just told me, "I get a 90% accurate measurement in the first, say, 30% of the interview, and then I waste my time for the remaining 70% trying to up it from 90 to 98%," and that just doesn't seem like a great use of your time after that point. Can you help me understand -- why don't you pivot to something like trying to evaluate their conscientiousness or openness to learning or emotional intelligence or ownership of their previous projects and mistakes and such? Or do you mean that you ask these programming-type questions and you indirectly learn about those other dimensions of success?

Also, I'd be curious to know what you do about teachability for some of the skills -- I know that sometimes it's just "we're hiring for a senior role and we cannot accept a junior/intern at this time," but very often it seems like it's just "we're growing in general" and then you are open to saying "hey we don't have an opening for you as a mid-level developer but we would be willing to hire you on as a junior-level dev and if you are as good as you say you are we'd also consider promoting you after your 90-day-review when we've seen you really shine." Are you trying to assess the level of the role that you would offer the candidate with your questions, in other words? Or are you just judging fitness for a fixed role?


> why if you have this confidence that your first three questions contain 90% of the signal, that you spend the rest of the interview doing anything vaguely related to that?

When interviewing people maybe one in six answer questions correctly and in-depth, one in six do very poorly, and the other two thirds do so-so. The two thirds are all kind of interchangeable - they can answer most simple questions in a basic manner, missing a few here and there, but can't go into much depth on things.

I would say it is kind of a strict culling thing. The purpose is to find the one in six who can answer the questions well. There's not much reason to change it up with the twelfth interviewee who is doing so-so, because if I interviewed twelve people I probably just talked to two people who are a standard deviation above the person doing so-so. So I want to talk more with them, not the one who doesn't know anything in-depth.

On the interviews where they're really stumbling, sometimes I have felt like standing up after only a few questions and going back to my work. Initially I am never focused on one particular area; I am jumping around in case the last question was in some area they have a gap in, and perhaps they have a strength in some other area. I use their resume, and ask them directly about their areas of strength, to guide this. If they tell me they are strong in an area I will certainly ask them some questions in that area. If I know early that I will say no, I keep asking questions anyhow; I suppose I stay the minimum amount of time I can without seeming rude.

The one in six who do well, those are the ones that I "pivot to something like trying to evaluate their conscientiousness or openness to learning or emotional intelligence or ownership of their previous projects and mistakes and such". Or more broadly, get a sense of their personality, how they will fit on the team, what their expectations are, etc. I suppose it is almost the same pattern as the technical questions - one in six will have some obvious personality quirk (they really don't listen to what we're saying at all, or they are very obnoxious, or they seem uptight beyond being nervous at an interview - or they take a pen out and start writing on different objects in the office, and the boss comes by later and asks who wrote over everything, and you say the interview subject, and the boss asks why you didn't stop them from writing on desks, walls, machines etc. with a pen - yes, this happened), two-thirds seem normal, and one in six is very friendly and/or seems like a real go-getter with a lot of accomplishments under their belt. I'm pretty lenient at this stage, although others are less so (also, I wouldn't mind having a teammate who would gripe about features and projects they thought were BS, but maybe my team lead or manager would mind - so a pass by me for this type might be a block by them).

So in short, I don't even try to fathom their "emotional intelligence" if they're stumbling on easy questions.

Insofar as teachability, we're usually hiring for a role. They're also being interviewed by multiple people - my technical assessments of the one in six who are good usually match my coworkers'. I tend to be more lenient on personality, emotional intelligence etc.; they will say no to people I say yes to. So even if I were more lax, they are not, and the candidate won't get in. Usually there is not much leeway; we are hiring for set roles. The main leeway people get is if they know someone in the group already. If their friend works in our IT group, and they know the subject but not in-depth, we often OK them on the basis of the recommendation that this person is the type that will get up to speed.


If you blew the second question in an interview, how good would your game be for the rest of the interview?


""estimate how many auto mechanics serve Portland" or "I shrink you to the size of a pea and throw you in a blender, how will you survive?""

Those are two very different questions - the second is so abstract it's essentially meaningless.

The first however - is not a bad question at all.

Note that it is not a 'brain teaser' with a specific answer a-la Microsoft's 'river crossing' type.

A similar question used in consulting is 'if Quebec separated from Canada, how would they divide the debt' although that is less mechanical.

The 'Portland mechanic' question is actually quite good, and I enjoy asking them.

The objective is simply to see how people might think creatively in arriving at an answer.

I once asked the infamous 'how many tennis balls on a 747' question - and the interviewee started calculating how many balls would fit in the area of various body parts, it was ridiculous.

You can tell a lot by how they answer such questions, and get a good grip on their common sense; however, that type of question might be more suited for business analysts.

They are good questions to just get a rough idea of how someone might think about ambiguous problems that need solving - of course, they could never be used in isolation.

FYI - these arbitrary data questions are quite valid for software. We don't expect people to write a perfect 'bubble sort' or know specifically how the various sorts work - rather - the questions just provide an opportunity for someone to work through a simple problem - and then code the solution.

A crazy number of people have trouble with basic things, it's important to weed them out. Smart, coherent people will shine in these scenarios and it will generally be clear.
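(For reference, the 'bubble sort' being dismissed above really is just a handful of lines - a Python sketch of the textbook algorithm: repeatedly swap adjacent out-of-order elements until a full pass makes no swaps:)

```python
# Bubble sort: each pass bubbles the largest remaining element to the
# end; stop early once a pass makes no swaps (the list is sorted).
def bubble_sort(xs):
    xs = list(xs)  # work on a copy
    n = len(xs)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
                swapped = True
        if not swapped:
            break
    return xs
```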


> I once asked the infamous 'how many tennis balls on a 747' question - and the interviewee started calculating how many balls would fit in the area of various body parts, it was ridiculous.

I guess the implication here is that this was a bad approach? This is why I would be skeptical of this sort of question, it's really not clear to me why you think that answer is any more ridiculous than the question.


Because the question is entirely reasonable.

In fact, to the extent that it's novel to the candidate and they haven't heard it before - it's an excellent question.

It requires someone to actually think a little bit, to reason about an unknown scenario, and to try to apply intelligence to an otherwise undiscovered set of variables.

Do they know the formula for the volume of a sphere? Will they consider how balls might fit in a lattice? Can they estimate how many balls fit in a cubic meter? Can they reason about how large a plane actually is? Can they even do basic math?

Can they make rough estimates, piece everything together, and come up with an overall figure? Can they parameterize the estimate?
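(The whole Fermi estimate fits in a few lines. A sketch in Python - every figure here is a back-of-the-envelope assumption, not real aircraft data:)

```python
# Rough Fermi estimate: tennis balls in a 747.
# Assumptions: ~1000 m^3 of usable interior volume, tennis ball
# radius ~3.3 cm, random close packing of spheres ~0.64.
import math

interior_volume_m3 = 1000.0
ball_radius_m = 0.033
ball_volume_m3 = (4.0 / 3.0) * math.pi * ball_radius_m ** 3
packing_density = 0.64

estimate = interior_volume_m3 * packing_density / ball_volume_m3
print(f"roughly {estimate:.0e} balls")  # on the order of a few million
```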

There are any number of tangents to move along and consider.

Watching someone answer this question can tell you quite a lot about how they think, reason and communicate.

There are hardly any circumstances in business that have clear data points, clear outcomes, or perfectly clear objectives, and unfortunately, our current interviewing paradigms generally don't account for that.

The best question I was ever asked was during a McKinsey interview: "it's 1965, Sweden is going to switch from driving on the left, to the right side of the road, you have 2 years until this day, how do you plan it?" ... which leads to some amazing discussions. It's definitely a good consulting question.


I guess I'd reply after some consideration that what I think is unreasonable about this question is just that I as a physicist know exactly how to answer it because I have received years of training in reasoning about things that I don't know and years of understanding error bars and orders of magnitude, so that I can say "well within a half-order it's this."

But I came to know a lot of professional mathematicians in that time, and they are people with very big minds, extraordinarily creative -- but several of them would potentially struggle on these sorts of questions (indeed some of them would struggle with arithmetic, having long since just unloaded that burden onto calculators and computers to free their brain for more interesting thoughts) simply because they're not used to reasoning about uncertainty. They are likely to simultaneously be very good hires, and to only give you the barest of lower bounds. "I mean, it's at least 30 because I can fit 30 in a carry-on. Oh, but there could be a hundred seats and a hundred carry-ons, so I guess it's at least 3000." "It's, uh, it's way more than that." "Right, I said 'at least'. I mean we could probably stuff at least two carry-ons in each seat, so that gets us towards a better lower bound of 9000 or 10,000?" "Yeah, uh, still much much more."

I know other folks who would make great junior or mid-level developers who never had any formal education and wouldn't be comfortable approaching this either from the "let me look up or estimate volumes and recall packing densities" or the "I can give you lower bounds based on the things I am absolutely sure about" approaches. They wouldn't necessarily know, for example, that one might just say "okay let me look up the size of a 747" in an interview, that such a thing is probably "within the rules" of what the question expects -- unless prompted, and even then.

Let me put this a different way, do you think that responses to that question could be improved if before the interview you walked a candidate through a similar problem for ten minutes? Because I would imagine it could be, but maybe you ask it in a way that already provides for that. The problem is that if people can do substantially better with a ten-minute nudge then you are actually just testing for who has already had the sort of life experience to give them that ten-minute nudge, and that doesn't seem like an "excellent question" to me.


Sometimes I think that interviews just steal business time. They don't show exactly what a person can do; your real qualification shows in test work and trial days, so you have to show how you perform on the job. Interviews create a stressful environment, and the questions in this article only show that you can memorize answers and recite them on the spot. Very soon AI will check our skills and match us with the most relevant projects; I see that as the future. For example, two startups, https://talent.works and https://periodix.net, use AI algorithms to check a candidate's CV and match them with open positions, and it really works. The first is for searching for full-time jobs; the second is for remote and freelance workers. I believe AI will do away with interviews in the blink of an eye.


How do you account for people inflating or lying about their work on their CV?

I think those sites you mentioned only get you to the point of actually having your skills vetted. It's not a solution to the proposed problem of interviewing.


It's not just the liars, they're usually easy to spot.

It's those who genuinely believe they're good... they do well in an interview, because they have been working as a "developer" and can talk the talk. But as soon as they have to write a piece of code that's beyond a basic CRUD application you're basically code reviewing a bowl of spaghetti.

Google one of their variable names, and guess what site pops up.

And then there's the ability to debug something - they'll either pester other people, stare blankly at the screen, or start asking on forums.

I know this may sound bad, and some will dislike this, but: no matter how much "experience" or training, some people just don't have what it takes.

edit, to add:

Yes, I've worked with, recommended, and hired a few of these. That is why I now insist that I see some code first. Take it home, during interview, or github - whatever's best.

I'll also give them a piece of sample code to understand and improve. No brain-teasers or gotchas, just something I believe someone with their experience should easily do, accounting for interview pressures.

If I scare off some prima-donnas, then that's a bonus.


How do you catch a lying candidate in an interview?


Ask them a few questions. It's pretty easy to figure out that that AWS "expert" has just clicked around on the console for a while and doesn't really understand things by simply talking to them about it. Or if someone says they've been programming in python for 10 years, make them talk about the projects they've done, how the code was structured, etc.


In my role I've frequently had to interview software developers for the company I work for. We ask similar questions to this as the type of software we write needs an understanding of what's going on at a basic level.

We've interviewed developers who are amazing on paper, but can't answer simple questions that are mainly around enumerating a collection of integers.

I'd say communication is something we look for. Can you articulate why solution x is better than solution y? What are the compromises? Do you need to ask for more information?

So many candidates don't know an interview is an invitation to a conversation. You need to be able to engage with the people who are interviewing you. Ask questions to clarify if required.

The key advice I'd give developers is to be humble. I've met and worked with developers who are very talented, but also difficult to work with because of their personalities, we filter for this as a result.


There's literally no question in that list (or questions like those) I would bother asking during an interview to any candidate at any career level. My expectation simply is that a solid candidate would be able to figure out anything on that list given a reasonable amount of time commensurate to their career level.

I've been interviewing software engineer/dev candidates for 10+ years, and I would be extremely concerned if these kinds of questions either sunk or swam a candidate on their own. Obviously if someone couldn't figure out any of these, they'd be a bad hire - but I've found simply discussing work/projects people have done before, peeling the onion away layer by layer, to be a far more effective way of determining skill/fit/critical thinking skills.


"My expectation simply is that a solid candidate would be able to figure out anything on that list given a reasonable amount of time commensurate to their career level."

The point of the interview is to determine if they are in fact a 'solid candidate'!

Also, sometimes very experienced engineers have difficulty thinking, communicating and writing code coherently even if they can 'discuss' project issues in detail.

Here's a famous and interesting article on that [1]

[1] https://blog.codinghorror.com/why-cant-programmers-program/


Oh, I had an edit to my original post that included something about coding challenges (either homework or pair programming) - didn’t want to give the impression that should be ignored - forgot to submit that change to my post.


> Obviously if someone couldn't figure out any of these, they'd be a bad hire

Patently false.


No, I think it’s quite true. My deal is: I’m not going to ask you condescending, useless questions like these, but you gotta know your stuff. Not unreasonable, IMO.


You can think it's true all you want. That doesn't make it true. Plenty of people don't know the answers to all those questions and they do great work. Your biases reflect the way you think the world should work, not the way it actually does.


I think you’re actually arguing my side. All I’m saying is the candidates should be smart and capable enough to figure out those problems, which is more than enough for me.

I’m not going to hire someone who can’t solve one of these problems if they absolutely needed to: these are pretty basic computer science problems at the end of the day. I’m looking for problem solving capability, not rote memorization.

But at the same time I’m not going to ask them because they are condescending; it’s like asking someone with a math degree interviewing for a teaching job to explain basic algebra in an interview, which would be absurd. And if they really couldn’t do basic algebra, it’d come out through other lines of questioning.


Just the first two questions' solutions are wrong. I did not bother reading past that. I assume whoever wrote this article either a) wants to make sure other people fail these types of interviews, or b) is trying to hire people and wants anybody who googles this to get the answers wrong.


If you were trying to hire Jimi Hendrix to play guitar for you, would you make him audition by playing Mary Had a Little Lamb? Don't dictate what they should play, let them show you what they got.


This is sort of like how if you want to hire a chef for a Michelin star restaurant, you ask them to make an omelette, not come up with an innovative dish.

https://www.reddit.com/r/AskCulinary/comments/2qw1rp/executi...

It's something that virtually everyone knows, and is a test that they have a very solid command of the basics as a foundation to build more complex things on.


Jimi Hendrix would absolutely crush Mary Had a Little Lamb.


Stevie Ray Vaughan played a pretty funky Mary Had a Little Lamb. :-)


What about graph problems? Admittedly, I only did a 10 second scan, but nothing jumped out.


As DHH said: "I would fail to write bubble sort on a whiteboard. I look code up on the internet all the time. I don't do riddles." I tend to agree.


I would immediately put an X mark on a company which asks interview questions as stupid as those...

That's not how you look for developers with passion - people who care about you as a customer and want to solve your problems the best way possible.

That's how you hire code monkeys who produce subpar code and don't care about their job.


No Fizz Buzz?


[flagged]


> then perhaps you'll struggle if they hired you.

Actually, no. Though problems like these do (occasionally) occur IRL, the conditions under which you're expected to solve these problems in interviews ("on the spot, while I stare and grin at you, and occasionally interrupt you - and your result better be near-perfect, or GTFO!") have basically no relation to real-life development work.


I agree - doing these under "interview conditions" is a farce. It tests for other attributes which are probably not relevant to the job (and not a good thing if they are).


I agree with you. I have interviewed people who clearly memorized some of these solutions: they just write the algorithm straight down, almost without thinking, but completely fail with other, simpler problems. They will struggle in any real-life situation.


Then stop asking these questions. During an interview you should be determining whether a candidate is a good fit for the job they're going to have: talk to them to suss out their fit with the existing team, and maybe bring up a few issues you recently hit in your day-to-day work (or that were recently hit by people doing what this hire will be doing). There's a beautiful hidden point in this: assuming the task wasn't solved entirely in isolation by one person, discussing what the company actually did after hearing their thoughts will demonstrate the sort of communication the candidate could expect if they accepted the position.

By asking rote questions like this you're much more likely to decrease your chances of getting people who are good team builders as the lack of a focus on team cohesion will turn a lot of people off.


I do agree. For this reason I try to come up with my own problems. These problems are good because they are easy to explain, but they stop being useful once people have memorized them.


Whose failing would that be, though? The interview should gauge capability and motivation, which should accrue towards fit for the team and role. There should be some expectation of learning in the role, not just hitting the ground running.



