Data structures and algorithms interview questions and their solutions (quora.com)
397 points by adamnemecek on April 16, 2017 | 103 comments



When I had less than 5yrs experience I used to ask these types of questions, thinking they showed the ability of a developer to reason out critical software issues. Then I met a guy who had practically memorized everything I could ask. He spit out answers so fast it was appallingly clear he was regurgitating answers from a book. This proved nothing about him and everything about me.

Today, I recognize that good software development owes nothing to data structure knowledge or obscure algorithms. This knowledge can't hurt, but very few jobs or languages require you to build a skip list from scratch. Instead, writing good software requires the ability to maintain a system, to debug problems, and to give constructive code review. That's it.


For functional purposes, I consider myself to know something if I could page in the details well enough to implement it in 5 minutes with Wikipedia or less.

How this plays in interviews is interesting for just how inconsistent it is. Sometimes I win the algorithm lottery and have had reason to implement the obvious solution myself at least once, and the interviewer learns my memory works. Sometimes I don't know what they're going for but come up with a reasonable answer anyway and the interviewer is thrilled I managed to have an independent thought. Sometimes I lose and the interviewer is clearly trying to guide me to the answer they want to see and I have no idea what they're getting at.

Though not quite as bad as a friend failing an interview for not using a hash table, even though the answer he described was a hash table—just with the identity function as the hash.
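
To make that concrete, here's roughly the structure he described, as a Python sketch of my own (assuming integer keys), not his actual code:

    class IdentityHashTable:
        # A bucketed hash table whose "hash function" is the identity.
        def __init__(self, nbuckets=1024):
            self.buckets = [[] for _ in range(nbuckets)]

        def _bucket(self, key):
            return self.buckets[key % len(self.buckets)]  # hash(k) = k

        def put(self, key, value):
            bucket = self._bucket(key)
            for i, (k, _) in enumerate(bucket):
                if k == key:
                    bucket[i] = (key, value)
                    return
            bucket.append((key, value))

        def get(self, key):
            for k, v in self._bucket(key):
                if k == key:
                    return v
            raise KeyError(key)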

(Though maybe about as absurd as one interviewer digging for any formal training I might have had in C++. Like you can teach that language in a semester—please. There are maybe 4 proper experts on the planet.)


I think there might be a happy medium. Knowing how to implement a specific algorithm is not necessary; knowing of algorithms and picking the right one is important. For example, a junior developer apparently forgot about hash sets and used a list for lookup. It worked fine in his local tests, then failed miserably when we had to load millions of records. A more seasoned programmer should be able to explain why a hash would work well here.
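
A quick sketch of the failure mode (Python; the numbers are made up, but the shape is right - at millions of records the list version effectively never finishes):

    import time

    n = 10_000
    items = list(range(n))
    as_list, as_set = items, set(items)

    t0 = time.perf_counter()
    sum(1 for x in items if x in as_list)  # O(n) scan per lookup -> O(n^2)
    t1 = time.perf_counter()
    sum(1 for x in items if x in as_set)   # O(1) average per lookup -> O(n)
    t2 = time.perf_counter()

    print(f"list: {t1 - t0:.3f}s  set: {t2 - t1:.3f}s")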


Accidentally quadratic is good for a chuckle too

https://accidentallyquadratic.tumblr.com


I used to do interviews in one company I worked for and we had this online test, which consisted of four tasks very similar to those in the list.

There was this one candidate who, by mistake, was given the test twice. Apparently he took offence at that. He found that his friend had been given the exact same tasks, so he copied the answers.

The designers of that test tout that their system is able to detect plagiarism, but that was not the case here.

Anyway, eventually we hired him, because even though his first attempt wasn't particularly impressive and he cheated, a face-to-face interview revealed that he was a skillful programmer.

One thing I learned from this experience is that most of the time these tests simply confirm that the candidate went to college - sometimes a particular one.


I agree 100%. I'm hoping that disseminating these practice materials will eventually topple the whiteboard interview by exposing its futility.


>Today, I recognize that good software development owes nothing to data structure knowledge or obscure algorithms. This knowledge can't hurt, but very few jobs or languages require you to build a skip list from scratch. Instead, writing good software requires the ability to maintain a system, to debug problems, and to give constructive code review. That's it.

While I don't think that the current state of interviewing is perfect, this statement is pretty out there. Good software development owes NOTHING to data structure knowledge? Choosing the right data structure is critical for software that performs well (or performs at all once you get to a certain scale), and demonstrating knowledge of various data structures shows at least that a candidate knows a bit about which tools they have available to them on the job.

Of course not many people are implementing skip lists from scratch in their everyday job; that's why none of the 500 questions in the list is "Implement a skip list". Data structure questions rarely ask you to implement some data structure from scratch; they ask you to look at a problem and figure out which data structure best lends itself to the solution.

I won't argue that asking questions about obscure algorithms isn't a problem, but good interviewers aren't asking questions that need obscure algorithms; they are asking questions that need algorithms like DFS, basic variations of sorting, and basic DP.


You're focusing too much on a particular phrase and not enough on the overall meaning. Read this again:

> Instead, writing good software requires the ability to maintain a system, to debug problems, and to give constructive code review. That's it.

Code review is the best time to talk about data structures and algorithms, as it is a chance to apply theoretical knowledge to real-world problems. Good, constructive code reviews are about education and maintaining a quality code base.

Interview questions about data structures and algorithms reward memorization and classroom knowledge. I've worked with way too many mediocre developers who are great at whiteboard interview questions but who couldn't build a maintainable system if their lives depended on it. Consequently, I don't give a damn about a candidate's performance on these types of questions. Instead, I'd rather learn about some projects they've built, problems they've experienced, and their experience with code reviews.


Yeah, the "data structures and algorithms are either everything or nothing" polarization in opinions on tech interviewing is strange.


Could you explain your reasoning? Someone doing really well doesn't seem like a reason to stop asking other people algorithms questions.


Because he understood that this data point falsified the general theory that answering questions about algorithms and data structures implies an understanding of algorithms and data structures.


Check for understanding by not asking book questions and expecting book answers. Ask variations on book questions.

For example, in junior math the prof went carefully through deriving the Fourier Transform. On the final exam, the question was to derive the Hyperbolic Transform. This was straightforward if you understood the FT, but not if all you could do was parrot it.


So, that said, how do you vet talent? Ask them to tell you stories about maintaining systems, debugging, etc.?


Personally, yes - I like asking about high-level run-downs of a project on their resume. I then choose points to stop them (either randomly or if I know a lot about the subtopic) and dig deeper and see where their comfort level lies.

There are a lot of benefits to this approach:

1. Keeps the topic on something they are presumably comfortable with.

2. Gauges their seniority level: more senior folks can talk deeply about a wider variety of topics.

3. Tests their communication and explanation skills, again about a topic they should be comfortable with.

Some potential pitfalls:

1. Makes it harder to objectively compare candidates. I think this is impossible anyway, so I don't consider it a big deal.

2. Some candidates, especially junior ones, simply don't have many projects on their resume to talk about. In this case I'll often ask them to explain a skill or technology they are experienced with.

3. It's harder to conduct these interviews. Giving a problem and passively watching and helping as needed is a lot easier than keeping up an active dynamic conversation.


I've seen (from both sides) two things that seem to work.

Firstly, write actual code to solve a simple but real, or at least realistic, problem. That could be a take-home exercise, or it could be on-site pair programming, or some other format. Memorable examples: a stock exchange (submit orders, get trades); a Tetris engine (submit blocks, get rows filled); an auction house (submit bids, get winners); an uploader for the Azure blobstore (submit blocks, get files); an online metric summarizer (submit samples, get summaries). This tests someone's ability to actually write code.
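
(To give a concrete flavor: the online metric summarizer boils down to something like this sketch of mine, done incrementally so you never store the samples:)

    class MetricSummarizer:
        # Online summary: O(1) state and O(1) work per submitted sample.
        def __init__(self):
            self.count = 0
            self.total = 0.0
            self.lo = float("inf")
            self.hi = float("-inf")

        def submit(self, x):
            self.count += 1
            self.total += x
            self.lo = min(self.lo, x)
            self.hi = max(self.hi, x)

        def summary(self):
            return {"count": self.count, "mean": self.total / self.count,
                    "min": self.lo, "max": self.hi}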

Secondly, suggest an algorithm for a real, or realistic, problem that doesn't have a well-known good solution. Memorable examples: subset-sum, but on an ordered set, where the solution must be contiguous; placing primers for PCR [1]; online k-anonymisation [2]. This tests someone's ability to think on their feet, and synthesize facts they've learnt into something new.

I'm a bit dubious about the "tell me about something you worked on" question. Firstly, because it requires the candidate to have worked on something chewy, and there are lots of capable people out there who are unfortunate enough to have had crummy jobs where they don't get to do that. Secondly, because it attempts to evaluate the judgement of an individual by looking at the work of a team. Too often, as a candidate, i've had to answer "why did you do it that way?" with "because the client specifically asked for it" or "because my colleagues who were working on it at the time decided so", which isn't useful for me or the interviewer.

[1] http://wiki.c2.com/?StridingAcrossSteppingStones

[2] https://en.wikipedia.org/wiki/K-anonymity


But an easy comeback from the interviewer would have been "What other ways could you have done it? What are the advantages and/or disadvantages compared to the way the client or your colleagues mandated?"


True. But now you're talking about ideas, rather than about experience, so you're testing something a bit different.


The best way to memorise hundreds of different algorithms (or anything else, for that matter) is to understand the few underlying principles beneath them. I would be surprised if he knew them all and couldn't come up with intelligent ways of mixing them to achieve a new task.


The best way, sure, but not the only or even the most popular way.


> Instead, writing good software requires the ability to maintain a system, to debug problems, and to give constructive code review. That's it.

What do you mean by "maintain a system"?


I guess I need to elaborate, since I am getting downvoted.

I have only heard of maintaining systems in contexts where it means keeping a system running. I don't see what that has to do with the ability to WRITE software (good or otherwise). Hence, my question.


Bit late to the party, but this is a valid question. There's a lot of focus on writing the code, but the system includes the non-code components. Can the developer keep the code relevant to the organisation?


Does your company have current openings? I think I can prove myself better by your metrics.


We do; check out MediaMath. We'll ask you design, debugging, and similar questions. We'll also do a take-home problem.


Speak the truth. I'd hire you again, man. <3


Yeah and you'd be the first guy I'd think of in an early stage start up. You're a gets shit done kind of guy.


Thank you!


- Programmers are increasingly "API plumbers", not "pathfinders". The change is that you more often create an "easy to maintain" glue layer between lots of existing code (most of which your company wrote years ago, or which lives in an external framework or library) instead of solving a problem from scratch using CS basics. Interview questions haven't shifted to "this library has such-and-such function calls, how would you expect it to report an error?" or "describe two distinct UI frameworks you have used and explain their differences". They're still very much in the era of "OK, we have two rectangles, how do we find the overlapping pixels?"

- Software is increasingly built in teams. We don't ask about "how you handled a difficult person on your code review" or "how would you break this change in to two logical commits" or even "what's a good name for this function", all of which are more important than knowing the big-o for some obscure interval tree.

- CS 101 algorithms have very little to do with how they are implemented. E.g., in a recent discussion with a co-worker I discovered that libstdc++'s hashtable iteration order depends not only on hash codes but also on insertion order. Following how libstdc++ and MSVC implement hashtables was difficult and exhausting, with many, many more details than the typical CS 101 approach of just hashing the key and directly looking up the bucket! Turns out writing code that is really fast needs more than knowing the big-O.

- Big O is such a garbage simplification of all the "fundamental" knowledge. Why not just ask some fundamental graph or combinatorial counting questions? Why nothing about what is computable and what is not, or which languages are regular? What about information theory? How come there are no information retrieval or database questions? Or, these days with all the ML, why not some basic linear algebra?


Here's my gripe with this interview process. It encourages engineers to spend months and years learning and remembering optimal solutions to textbook problems.

A REAL engineer would spend the same time doing something productive, things like:

- Writing a SQL optimization for their current company that saves resources

- Creating shared memory interfaces for faster processing of logs

- Modifying filesystems or the kernel to create their own log processing

- Modifying the JVM to optimize critical code

- Rewriting parts of a server to set up security alerts

- Fast encryption and decryption of in-memory cache records

- Writing an interpreter for faster generation of low level code for high speed processing

- And on and on

My point is, today's interview process treats engineers with such complicated experience as shit because they spent time working on complex projects and couldn't memorize these questions. But then these same companies will hire some mediocre engineer who cannot think out of the box but is an expert at memorization.

Said mediocre engineers then hire other mediocre engineers using the same process, because these mediocre engineers do not even have the ability to grasp what kind of engineering would lead to above points.

To top it off, these same mediocre engineers sweep this problem under the rug claiming they are trying to reduce false negatives. Lol!

Sad state of the industry!


Google style interviews don't screen for the best engineers, but they screen strongly for subservience. The fact that a candidate has to go out of their way to prepare for these interviews indicates that the candidate is likely willing to jump through whatever hoops they're told to jump through.

It's similar to how investment banking jobs place such a high premium on Ivy League degrees with high GPAs despite the actual job itself being so simple a high school student could do it. Someone who's willing to jump through all those hoops to get into a top school and graduate with a high GPA has a strong track record of following orders and is less likely to complain about 100-hour workweeks.


>> Google style interviews don't screen for the best engineers, but they screen strongly for subservience. The fact that a candidate has to go out of their way to prepare for these interviews indicates that the candidate is likely willing to jump through whatever hoops they're told to jump through.

Try saying this obvious observation to some Google engineers in real life. I've never seen such butthurt adult-kids!


I mean, that's all true, but what matters is whether you have some personal goal which requires you to go through time at Google.


If this materially impacts the bottom lines of companies practicing this approach, then eventually companies that are not using this process will prevail.


There are other equilibria, likely because the employment market does not have the characteristics necessary for analogies of the Efficient Market Hypothesis to apply. (Most glaringly: people are not fungible commodities, and the agents involved do not have rational expectations.)


>> If this materially impacts the bottom lines of companies practicing this approach, then eventually companies that are not using this process will prevail.

It would, if companies realized that they can save millions by improving their interview processes and not having to acqui-hire so much.

But this insight requires smart people (who they eliminated with this interview process) to be already hired in those companies.


While I highly dislike this trend, it's hard for me to call, say, Sundar Pichai not smart, for example.


What do you think about Marissa Mayer? For every outlier winner you point out, the industry is littered with losers. Perhaps stop correlating wrong things together?


Well, if your theory holds, how did Rob Pike, Ken Thompson, Guido van Rossum, Dan Bornstein, and so on end up at Google?


>> Well, if your theory holds, how did Rob Pike, Ken Thompson, Guido van Rossum, Dan Bornstein, and so on end up at Google?

You realize they got hired because of their fame and expertise? They didn't have to go through intense whiteboarding. And even if they did, they would have been hired even if they didn't solve a stupid N-Queens problem.


I realise they have 0 problems with CS fundamentals while at the same time being top experts in their respective fields.


>> I realise they have 0 problems with CS fundamentals while at the same time being top experts in their respective fields.

I suspect your definition of CS fundamentals is only Algos and Datastruct problems. Said experts are very good at what they do. That does not mean they will whip out an algorithmic puzzle solution on a whiteboard under pressure.

I also suspect that those experts have better things to do than get a job at Google. They'd rather not go through this process. Google wants them to be there not the other way around.


There are several problems with your and the parent commenter's approach here:

1) You've not defined intelligence in an agreeable way, such that all parties are discussing the same thing (you seem to have interpreted his point about Sundar Pichai as successful due to being CEO of Google, but that may not be the case).

2) You've not defined intelligence in a rigorous way, such that all parties know it's meaningful even if they agree on it (is being CEO of a publicly traded company a robust proxy for intelligence? Can we positively correlate corporate success with intellectual formidability?)

3) You have not established any empirical way to assess intelligence in these people aside from what you'd consider tautological (CEOs are intelligent, Pichai is a CEO, Pichai is intelligent) and therefore self-evident. But you can't discriminate between the relative intelligences of members of this group, especially remotely on a forum, which makes this an issue even if 1) and 2) are solved.

I know, taking dialectics this seriously on an internet messageboard is a major pain in the ass. But this brings us full circle to software engineering interviews. One person mentioned a potential proxy for intelligence they consider self-evident, someone else argued against it without full and transparent clarification, and the cycle continues. This is how whiteboard interviews were born, this is how they became controversial, and this cycle continues even on discussions of whiteboard interviews.

People who want to filter for "intelligent" in their interviewing process can't define the term. People who come up with methodologies for assessing it, or identify proxies to it, can't prove they work. People who argue against these assessments or proxies can't prove they don't work because no one can even agree on what should be assessed. It's a Wittgensteinian clusterfuck all the way down, but the same human habits that gave rise to whiteboard interviews are perfectly manifested in the counterculture reaction to them and the resulting controversy.

I propose instead that you resist the temptation to use the word "smart", even in situations you'd consider the quality self-evident. "Intelligence" is like AI - once it's rigorously defined, no one agrees it's AI and it's so narrow in scope that no one cares about it because it's not weaselly and magical anymore (example - the IQ test). When you appease everyone's inclinations and talk about it without defining it, madness ensues.

Rigorously define a problem space, correctly map a set of empirically measurable capabilities to it, and hire people who demonstrate those capabilities. But don't use descriptions like "smart" which mean everything to everyone.


>> 1) You've not defined intelligence in an agreeable way, such that all parties are discussing the same thing (you seem to have interpreted his point about Sundar Pichai as successful due to being CEO of Google, but that may not be the case).

My last line was, "Perhaps stop correlating wrong things together?". My intention was to indicate that the person who wrote about Sundar Pichai was trying to correlate something about the Google CEO with my original comment about acqui-hires.

It was not an exercise in defining intelligence. I simply had to point out the naivety of any correlations the parent commenter was trying to make.

>> 2) You've not defined intelligence in a rigorous way, such that all parties know it's meaningful even if they agree on it (is being CEO of a publicly traded company a robust proxy for intelligence? Can we positively correlate corporate success with intellectual formidability?)

Again, there was no exercise to define intelligence. You are misreading the context.

>> 3) You have not established any empirical way to assess intelligence in these people aside from what you'd consider tautological (CEOs are intelligent, Pichai is a CEO, Pichai is intelligent) and therefore self-evident. But you can't discriminate between the relative intelligences of members of this group, especially remotely on a forum, which makes this an issue even if 1) and 2) are solved.

Same response as above.

>> People who want to filter for "intelligent" in their interviewing process can't define the term. People who come up with methodologies for assessing it, or identify proxies to it, can't prove they work. People who argue against these assessments or proxies can't prove they don't work because no one can even agree on what should be assessed. It's a Wittgensteinian clusterfuck all the way down, but the same human habits that gave rise to whiteboard interviews are perfectly manifested in the counterculture reaction to them and the resulting controversy.

Interesting you bring this up. Let us play a thought experiment. Who would you rather work with?

1. A person who wrote a VM for handling their cross-platform message queue but couldn't solve N-Queens problem

2. A person who can solve N-Queens problem on the whiteboard but doesn't have the engineering appetite/desire to learn something like VMs in detail

Now that you know who you want to work with (hoping you chose 1), let me ask you this: why is data king? Have test scores ever been representative of smartness? Is a person with a 3.5 GPA worse than one with a 3.75?

If your answer to the above question is no, then why is Google's hiring committee rating people with its own notion of GPA and hiring people with a certain Google GPA?

Because in that GPA competition, candidate 2 will win. But you didn't want to work with candidate 2, did you?


Companies that hire the best engineers already use better processes.


How can we check this to be true?


You probably have to become a really good engineer, or work among them, to see it.


How does this prove or disprove anything about the process they use?


So what approach do you suggest for the interview process? What has your experience been hiring under this process?


Without spoilers, here's an outline of an interview problem I did last Friday:

1) Here is a very small VM (fewer registers and fewer instructions than you can count on your hands). Write an interpreter for it.

2) Add an I/O instruction.

3) Having your interpreter and your I/O instruction, you are given at least one example of a program with a specific property. Write a program analysis which will detect the property soundly and completely.

Managing to finish (3) relies on knowing certain fundamental concepts from automata theory and program analysis. The company does actually use this knowledge regularly, so it makes sense that they'll ask. However, even parts (1) and (2) actually manage to test basic programming in a reasonable way.
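
To give a flavor of parts (1) and (2) without spoiling anything, here's a toy machine of my own invention (not the company's actual VM):

    # Made-up ISA: 4 registers, 5 instructions; programs are lists of tuples.
    def run(program):
        regs, out, pc = [0, 0, 0, 0], [], 0
        while pc < len(program):
            op, *args = program[pc]
            pc += 1
            if op == "li":       # li rd, imm : load immediate
                regs[args[0]] = args[1]
            elif op == "add":    # add rd, ra, rb
                regs[args[0]] = regs[args[1]] + regs[args[2]]
            elif op == "jnz":    # jnz r, target : branch if nonzero
                if regs[args[0]] != 0:
                    pc = args[1]
            elif op == "out":    # the added I/O instruction of part (2)
                out.append(regs[args[0]])
            else:                # halt
                break
        return regs, out

    # Count down from 3, emitting each value through "out":
    prog = [("li", 0, 3), ("li", 1, -1),
            ("out", 0), ("add", 0, 0, 1), ("jnz", 0, 2),
            ("halt",)]
    print(run(prog)[1])  # [3, 2, 1]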


Registers? You were lucky! Last time i did an interview problem with a weird machine, i didn't have any registers at all! And only three instructions, none of which were arithmetic!


By the end, the interviewer was telling me it was all right for me to have mistakenly thought the registers were 32-bit words, and thus concluded the machine wasn't quite Turing-complete, and he was noting that our analysis should still work for the Turing-complete version with bignum registers.


What do you work on? Can I interview with you?


I was the one interviewing for a job! If I get it, then maybe I can invite you. For now, precious is mine!


Haha! Good luck. Sounds like an awesome opportunity if they deal with VMs and registers!


Step 1: Read the fucking resume

Step 2: Find somebody with relevant expertise to interview that candidate. Don't send the generic engineer who doesn't know anything beyond using hashmaps for all Java programs.


> Here's my gripe with this interview process. It encourages engineers to spend months and years learning and remembering optimal solutions to textbook problems.

Wrong. Questions aren't designed to be memorized. It shouldn't take months, or years, to prepare for this stuff if you already have a CS background unless you're really slow.


> Wrong. Questions aren't designed to be memorized.

Do you know that 'detect a cycle in a linked list' question? I can't imagine many people being able to answer that without having seen the question previously.
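
For reference, the textbook answer (Floyd's tortoise and hare; one of several valid approaches, assuming nodes with a .next field):

    def has_cycle(head):
        # Advance one pointer by 1 and one by 2; if the list loops,
        # the fast pointer eventually laps the slow one. O(1) space.
        slow = fast = head
        while fast is not None and fast.next is not None:
            slow = slow.next
            fast = fast.next.next
            if slow is fast:
                return True
        return False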


Recent surprise: exponentiation by squaring. Not by name, but by being the correct answer to the question. (This is in the list of 500, by the way.)

It's trivial to explain:

    x^n = x(x^2)^((n-1)/2), if n is odd
        = (x^2)^(n/2)     , if n is even
Knowing that (or having figured that out yourself if an interview was your first exposure to this problem), it's even more trivial to implement, making it a math quiz far more than a programming exercise. (Though I later turned it into a programming exercise entirely to exploit an arbitrary eye-bleed execution bug in the human brain: https://gist.github.com/LnxPrgr3/7154873d3eb8b1e5960851628c7...)
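
In code, the recurrence becomes the usual O(log n) loop (my sketch, not necessarily what any interviewer wants to see):

    def power(x, n):
        # Exponentiation by squaring: O(log n) multiplications (n >= 0).
        result = 1
        while n > 0:
            if n & 1:        # n odd: peel off one factor of x
                result *= x
            x *= x           # square the base
            n >>= 1          # halve the exponent
        return result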

Maybe that's what they intended. It's certainly legit if the job will have you actually doing, rather than applying, math. Except that it concludes you can math if you're smart enough to pretend to derive the answer you memorized after encountering it before. You might accidentally hire a bunch of crypto nerds.


For what it's worth I learned that explicitly in a data structures & algorithms class. I've seen it come up in a bunch of other introductory material, so it seems like a relatively standard part of "the curriculum." I think somebody might have a chance of coming up with this shortcut on their own, if, for example, they wanted to compute the Nth Fibonacci number efficiently (using matrix exponentiation -- the bignum arithmetic makes it O(n^2) if you just do n multiplications) or if they wanted to compute a polynomial more efficiently by memoizing the values x^(2^n), or if you kidnapped them and threatened to kill their family if they couldn't figure out how to compute x^n using O(log_2(n)) multiplications...
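
For instance, the Fibonacci trick is square-and-multiply on a 2x2 matrix, something like this sketch (using the identity that [[1,1],[1,0]]^n holds F(n+1), F(n), and F(n-1) in its entries):

    def fib(n):
        # O(log n) 2x2 matrix multiplications instead of n additions.
        def mul(a, b):
            return [[a[0][0]*b[0][0] + a[0][1]*b[1][0],
                     a[0][0]*b[0][1] + a[0][1]*b[1][1]],
                    [a[1][0]*b[0][0] + a[1][1]*b[1][0],
                     a[1][0]*b[0][1] + a[1][1]*b[1][1]]]
        result = [[1, 0], [0, 1]]  # identity
        base = [[1, 1], [1, 0]]
        while n > 0:
            if n & 1:
                result = mul(result, base)
            base = mul(base, base)
            n >>= 1
        return result[0][1]  # F(n)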

The difficult thing about that shortcut isn't that you couldn't come up with it, it's that you wouldn't think to try coming up with it unless you are in a particularly painful situation.

I don't view this exponentiation speedup stuff as math-specific, it's something you can come across when efficiently updating string checksums under random access modifications and concatenations, or when solving some graph problem where you exponentiate an adjacency matrix.


I'm glad your algorithms class covered this. Mine did not. It surely covered some other obscure material that you don't know. You are missing the point: if we had a standard CS curriculum, you could fault people for missing stuff they should know. I think part of the problem is that CS is too big a discipline at this point. A systems person couldn't pass a graphics person's interview, etc.


But I wouldn't require somebody to know of faster exponentiation in an interview. It's still too obscure.


One data point, but I was asked that as a Microsoft interview question when I was a freshman. I figured it out on the fly pretty quickly. I'm no genius.


I figured that problem out on my own the first time I heard of it. It wasn't Floyd's algorithm, there are many solutions.


Who do you want to work with? A person who can regurgitate CS trivia or a person who has interest and track record in solving interesting problems?


Like I said, it's not regurgitation. There's a reason companies try to keep their questions secret.


It's all a variation of the 500 questions listed. Nobody memorizes code line by line. But there are enough hires at Google and FB who memorized algorithms and solutions and got in.

Do you want to work with people who memorize solutions or do you want people who are innovative, genuine problem solvers who've solved interesting problems?


It's hard to imagine an innovative problem solver being unable to tackle Google's interview questions. You gave the example of optimizing an SQL query. That's a data structures problem itself.

Google apparently has an NDA for the interview questions they ask, so they're trying to avoid quizzing people on stuff that they've memorized. I know any place I worked at always tried to ask "fresh" questions to people. There's just no point if they already know the answer.


I had a phone interview with Google a few years ago. I was asked to implement a Sudoku solver. I didn't manage it during the interview, but half an hour later a solution came to me while walking down the road. Should I have been a hire or a no-hire?


What is your point? They've got limited time and don't want to waste candidates' time. There is some variance in the process, which they try to minimize by asking lots of questions.


Point is that it is a very poor way of assessing someone.


I'm someone who cycles over my thoughts for long periods of time. Interviews really don't capture this at all, but it's one of my strongest intellectual assets.


>> It's hard to imagine an innovative problem solver being unable to tackle Google's interview questions. You gave the example of optimizing an SQL query. That's a data structures problem itself.

Dude, is your definition of problem solving limited only to Algos and datastructs?

Because I can imagine SQL optimization in the following ways:

1. Compiler modification to use different instructions

2. Hinting branch predictors

3. Modifying the database environment to use different operations under the hood

4. Using different hardware

5. Using a different language

6. Allocating different memory configs

7. Using SSDs

Do you even realize that CS is not algos and datastructs only?


The big optimization would be avoiding full table scans or full table joins. That would be at the query level.


>> The big optimization would be avoiding full table scans or full table joins. That would be at the query level.

An insight that can only come from experience, and only to someone who is actually interested in this.

I want to work with people like you, not with the person who can solve some stupid dynamic programming puzzle but can't do any real interesting stuff in the real world.


Well, I meant that the biggest optimization you can do is algorithmic in nature.


It's a tragedy of our industry that this super FUN list of basic algo problems is connected in everyone's mind to lame-ass whiteboard interviews. These problems should be like playing scales: engineers should just work through them, practicing until they're no-brainers. Not to land that sweet entry-level SWE job at Google, but just to grease your own wheels in day-to-day coding and problem-solving.


I've never been to an interview that would ask this kind of crap, and would be very surprised if a future interview did. So yeah, I'm just looking at it as a list of possibly interesting problems, which makes it a way better link than "interview help".


I agree -- in addition, such problems can also be seen as a springboard into research - tweak any of them or ask if you can prove that the solution is optimal, and you are quickly in the realm of open questions.


As a fun warm-up question, I always ask the candidate what editor and keyboard they use. The candidate who geeks out about their plugins and Cherry switches inevitably passes all 5 rounds; the candidate who uses Eclipse with maybe a vim plugin will go on to fail about 75% of the time. It's a good filter too; I'd say 1 out of every 20 candidates expresses strong convictions.

I'm seriously considering using it as a dog whistle and skipping the experience deep-dive (which tests if you are a good story teller), the systems design questions (which tests if you can sync to the interviewer's sense of judgement), and the algorithm hazing (which tests social anxiety). But I'm worried there is some kind of discriminatory correlation baked into that metric.


I worked at a company that tried that for a while and eventually decided to completely abandon it, because it was correlating so strongly with a particular set of experiences shared by the (mostly male, CS-degreed) interviewers. Particularly with more junior engineers, having a sexy setup is more likely to correlate with who you hung out with in college than with whether you actually coded enough to find a need for it yourself. So if you're going to play with that as a signal, I'd advise you to be very careful.


I have exactly the opposite bias. I think people who get really into keyswitches, alternative keyboard layouts, and .vimrcs are prone to getting distracted by unimportant details. Of course, we all want to use the most comfortable and productive tools, but a skilled person will attach more weight to things that have more impact.

Still, i'm glad i know people like that, because it's really easy to get a knowledgeable recommendation when i need a new keyboard!


As someone who totally geeks out about my editor (vim, what else!?), I can understand that.

However, I'm pretty sure that out of the many, many developers I've worked with, including some brilliant ones, I'm one of barely a handful who give much serious thought to editors. This could also reflect a bias of location - HN'ers are more likely to geek out over editors, and I think people in Silicon Valley in general tend to care more about this than people in Israel, where I'm from.

I do sometimes ask similar questions about more technical topics that the candidate should be familiar with, e.g. which DB do you use (backend engineer), which JS framework you use, etc. This is something that the strong candidates must have brushed up against, and should have a strong opinion about.


Just out of curiosity to see how I do, how would you rate using stock Intellij IDEA and a Macbook? Additionally am interested because some top algorithm competitors (of which I am not one) have this very setup.


I often ask people about their editor setup too - I tend to find that the level of enthusiasm they have about their editor is a moderately good indicator of their enthusiasm in general (though not, necessarily, their competence - I think that's stretching it too far). If they give me a look as if to say, "Why on earth is he asking this?" or they reply with something like, "Oh, whatever to get the job done... what do you use?" then it doesn't suggest to me that this is a person who gets much pleasure out of programming.

The other nice thing I find about asking this is that it isn't a question they've prepared for, so you tend to get an honest answer.


nano and a $10 Logitech k120. What's your verdict?


I really wanted to play this game, but after thinking about it, it really depends on why you use nano and the k120. The k120, in particular, is a very stiff keyboard with a mushy feel. It requires a fair amount of strength. I couldn't use it all day -- especially not with nano.

My initial thought was that you are a very careful programmer that tries not to make too many mistakes. I also thought that you probably try to write code in one pass and that you avoid TDD and other techniques that cause you to go back and revisit code again and again.

But, I cheated a bit and noticed in your profile that you have sysadmin experience, so now I wonder if the cheap keyboard is simply the keyboard that happens to be there. Nano is because you can't be bothered to spend your life learning vi and nano is the other editor that's guaranteed to be on whatever machine you happen to be logged in to. So that would make you pragmatic (as any good sysadmin is).

What's kind of interesting about this process for me is that I now have this utterly fictitious view of you based on ridiculous stereotypes. We could probably have a good conversation where you poke a million holes in my view, but without really challenging anything that I hold dear. Not really sure if it's a good idea, but it is definitely interesting to think about.


Sounds like someone who at least knows how to use a terminal and maybe some Linux command-line tools like gdb.

I think the person we're replying to is looking for someone who wrote a few popular vim/Sublime plugins and uses an ergodox they built themselves.

I don't use an ergodox because I wanted an RGB keyboard, and I use sublime with a handful of simple plugins because that's what I like. I used to hate vim, but some of my coworkers love it, so I respect that and have tried to use it more often. :) The good news is the coworkers that live out of vim like sublime, too. They're just used to vim, nothing wrong with that.

It's not a terrible place to start though.


I'm probably just reading this into your comment, but the challenging tone makes it seem like you think OP is pretty misguided... like there's no information to be had here.

But isn't an interest in tools pretty common in most crafts/trades? If I were hiring a woodworker for my cabinet shop, I'd probably ask about favourite tools. I wouldn't actually care if they talked about how a bandsaw beats a tablesaw every time, or launched into a sermon about shopmade wooden handplanes versus metal monstrosities. It's the presence of an opinion at all that matters.

edit: I don't actually have a cabinet shop. :(


Tools, yes. But seeing editors and keyboards as the tools indicates that someone is operating at a pretty low level of abstraction. If you asked a civil engineer what their favourite tool for designing bridges over deep valleys was, and they said "a Rotring isograph" [1] rather than "parabolic arches", would you be impressed?

[1] http://www.cultpens.com/i/q/RT04273/rotring-isograph-technic...


I love the office keyboard. It's so fun!


I have a friend who memorized all of these types of questions and answers. It worked because he got offers from Google, Facebook and Uber, amongst others. The key is to pretend like you're figuring it out on the spot and not just regurgitating it from memory. I plan on doing the exact same thing when I look for my next job. It's unfortunate, but the reality of the state of our interviews.


I have two kinds of questions: (1) tell me about a problem you've solved, and (2) let me tell you about a problem I'm facing. I'll never ask candidates about made-up problems. Why would I waste our time talking about fake stuff when there are so many actual, real problems in front of us that we could dive deep on?


The best, most fair interviews I've been through follow this pattern. I remember one of them told me about a current problem they had, and said, "You can google anything you want; you have 30 minutes. After that, you should tell me how you would solve this problem, including any tradeoffs you've made."

This is by far the most realistic scenario employees of most companies face. It doesn't matter so much that one doesn't remember the specific details of tree traversal or how linked lists are implemented. It matters much more that one is able to find it out quickly, understand it and apply it. Those are the skills you're looking for.


If they solve it but you have other reasons not to hire them, would you implement their solution?


In my opinion, it is great to have people trying to "memorize" those questions. If they understand them, that's like one half of formal computer science studies. What is a shame is people graduating from Computer Science university courses without remembering a clue about algorithms, because they "memorized" just to pass the exam. BTW, some of the interviews may be too hard, requiring you to be a Donald Knuth with an "intelligence" index of 160 and a nice team player (if you target very smart people you'll deal with crazy jerks most of the time - until those jerks become tamed, somehow). Super-smart people should target working for NASA or similar, not writing APIs in Python 50 hours a week with just 3 weeks of holiday per year.

Anyway, algorithms matter.


Nice list, but a side note: all this focus on interview preparation may give the false impression that algorithms and data structures are a closed body of work, when in fact the field is an active and lively area of research with many basic questions not yet understood.

Someone could go through these 500 questions and for (almost) all of them formulate variants/extensions that would be open research questions. Or alternatively, ask for each of them: is this the best possible solution? Can I prove it? What if I restrict what the algorithm can do? What if the data comes online? What if the input is noisy? etc. etc. etc.


Dear HN,

Please stop telling companies not to use this method of interview.

Sincerely, Michael.

I use it to filter myself out before wasting anyone's time.

What I see here is an AND gate, and their use of this process itself triggers the false in my own filters-for-job-positions system.

I see it as a uniform filter. The company is filtering me out, or in. And, if I understand the reality of their options for filtering methods, I can use that information to filter them back. Interestingly enough, this one is pretty uniform for me: if they use out-of-a-book problems in either an on-the-whiteboard or pair-programming session, they have solved the problem of whether I want to work in that position, without my having to trouble myself with memorizing problems or waste their time screening me.

The filter I am applying here is against the out-of-the-book part of this. If a company wanted to provide a previously solved bug in a section of their own code, with enough information to debug it but perhaps not enough testing written into the system, that would not filter the position out for me. As long as the job would be similar in setup to the methods used during the interview, this would be fine by me. Difficult, since the level of internal chaos in any interviewee is very high, and most positions would not be nearly as chaotic as the level this hiring process induces.

I think asking on-the-board questions, or running some pair programming, is a filter for perspective, and attempts to identify how an applicant handles chaos.

To put it another way: the company doing out-of-a-book interviews wants people who, when they hear an alarm, will file out of the building in an orderly fashion. I personally walk towards alarms, not away, and am plenty happy not to be stuck in a position that isn't supposed to have someone like me in it.

My system is tragically flawed, of course. A company could easily use an interview process they consider arbitrary to filter the lowest tier of applicants (non-tech people fishing for a few months paid internet browsing), and I could, not recognizing the insignificance of it, preselect them false when they would have stopped that interview after ten minutes and hired me on a handshake.


I would love to see a "this can be solved in O(n)" annotation on each, separate from the solution itself.


If you are just asking for the algorithm, and maybe pseudocode, these questions might help. But if you ask candidates to write proper code with boundary conditions etc., that's difficult even if you know the algorithm. Of course, I am assuming that no one memorizes the code ;).


I used to give people a description of how an algorithm was implemented and ask them to write it out in code. From there, we'd specifically take the conversation to boundary conditions, failure conditions, testing, etc. I was reasonably happy with the outcome.


The solutions here make the common mistake of ignoring the cost of stack space, and thereby incorrectly claiming that recursive solutions have lower space complexity than iterative solutions.
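
A quick Python illustration of the point (the same holds in any language without guaranteed tail-call elimination):

    def depth(n):
        # Looks like O(1) extra space, but every call is a live stack
        # frame, so it's really O(n) space, and it hits CPython's
        # default recursion limit (~1000) long before n gets large.
        return 0 if n == 0 else 1 + depth(n - 1)

    def depth_iter(n):
        # The iterative version genuinely uses O(1) extra space.
        count = 0
        while n > 0:
            n -= 1
            count += 1
        return count

    print(depth_iter(100_000))  # fine
    # print(depth(100_000))     # RecursionError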


Not if you point out that whatever platform you're coding on performs tail-call optimization, I guess? But yeah, the intermediate results will still have to be stored somewhere.


Here is an article by a Google VP about their approach to hiring: https://www.wired.com/2015/04/hire-like-google/



