"What happens when I'm asked a question like X? I spout a mutated form of some canned response I cribbed from a list I found on HN a while back. Because I hear that's how you're supposed to 'rock your coding interview', these days."
3. What are the things you should consider if you were writing your own database server?
"Look, you know as well as I that this job has nothing -- as in, nothing whatsoever -- to do with actually building production-grade database servers. Or anything even remotely analogous to it. Debugging that tangled mess of poorly conceived, never-reviewed JSON APIs left behind by the other developer (while you were pestering them about 'disruption' and 'just needing to get this thing to market') is more like it."
"But hey, since you're playing that game, I can play too: here's a bunch of catchphrases like 'non-blocking I/O, sharding, blah blah.' Because I hear that's the killer answer to give to questions like these. And BTW, if you really think that people like Michael Widenius or Salvatore Sanfilippo would be interviewing for this job, then your problems are way bigger than I could ever hope to help you with."
I'm sorry man, but no one cares that you and everyone who upvoted this comment are Super God programmers with 30 offers every time you say you're looking, who can afford to tell every company where they can stuff their interview.
Us mere mortals are willing to do whatever it takes to get our dream gigs.
These guides are largely targeted at getting into some of the best, and thus pickiest and most selective, companies in the world.
And if getting in means being a dsalgo monkey in front of a whiteboard for a few hours, so be it.
Let us share interview tips in peace please.
This "hurr durr technical interviews suck" refrain -- from people who have no real alternative, have never built a company, and have never seen one flourish at the top without these interviews -- is monotonous.
Put your money where your mouth is and start and only support companies that skip these kinds of interviews.
Like these fine people
But the snarky comments are beyond annoying.
My own opinion is let’s ignore companies that make us jump through hoops, as it only encourages them to become increasingly ridiculous.
Unfortunately, not everyone is in a position to do that... "beggars can't be choosers".
I’m of the opinion that if you are a company having trouble hiring people (everyone other than google/amazon) you should not copy their interview process.
There are great developers out there that do not do well in these interview environments, who are not worse than those who do well in them.
If you are going for a high-powered attorney partner job at a big firm, they do not ask you to recite obscure case law. If you are a master carpenter, they do not ask you to build a cabinet using a hand lathe.
IMO provable previous work experience if a candidate has it should trump conceptual knowledge spewing, and interviews that ignore experience and give the same theoretical knowledge quiz as they do a new grad are shooting themselves in the foot. And I think everyone knows it, but most engineering orgs are terrified of letting a bad hire through so they blindly do whatever the “top” companies do.
Yup, that seems to be it.
As if there's some "secret sauce" to interviewing that all the other shops know except their own.
But the bigger point is that - even assuming for the sake of argument that these questions actually do "work", on some level - they've become such a huge fad that people are just adopting them solely on the basis of authority, without intrinsically understanding why these questions work, or more importantly, how to interpret candidates' responses to them.
In other words, they're just following a script. I wouldn't quite say it's on "blind faith", per se (because the companies do have some reputation). But it's not much better than blind faith. Such that this precious quote of yours, from just a few days ago, seems quite apt here:
This practice of blindly adopting ideas just because they work for Google needs to stop.
So I guess the problem with objective-seeming (and often just gratuitously hard) "filter" questions like those suggested here is that their real purpose seems to be to give the process an appearance of objectivity - "We had a guy in here, his code samples looked great, and he talked a great game and all -- but would ya believe, he totally froze on that Lego robot puzzle question! What a poser!" -- when of course it's never going to be anything of the sort.
Why is that weird? The "beggars" I was referring to are people looking for jobs, not employers, so those actually seem like things that would go hand in hand: if all companies only want the best of the best, there will be many people without a job. Thankfully, it seems like companies all have a different idea of what the "best" is, so I am still employed.
How do you know it is your "dream gig" until you have actually worked there? Have you never been excited to accept an offer from a company, only to get there and realize it's just the same old bag of petty politics, egos, kool-aid drinking and grunt work?
Google, Facebook and the rest of the big SV companies that "only hire the best" are no different than any other big corporation. Also not everyone that works at these big SV companies gets to work on the really interesting and sexy projects.
There seems to be this myth that if you can just pass the Google or FB white board interview you will have it made and your "dream job" awaits. The propagation of this myth has given life to a whole cottage industry of prep courses and books on how to pass them.
I also think you have misunderstood the OP's position; I don't think they were saying that they were a programming god who can dictate where they work - quite the opposite.
"What happens when you type a URL into your browser address bar?" isn't a question that requires a canned response to be parroted back – it's a starting point for a conversation about your knowledge of the web technology stack. It's not as if there is a 'correct' answer; it's about demonstrating that you have a vague idea of how the parts work, can identify gaps in your own knowledge, and can make reasonable choices about how you might fill them.
Maybe "what are the things you should consider if you were writing your own database server?" is somewhat less directly applicable to most jobs, but I'd argue much the same sort of thing applies – it's a kicking-off point for a conversation about systems that you should have at least rudimentary knowledge about.
I'm curious as to what you would prefer in a technical interview - more directly relevant questions about the specific systems you would be working on?
The one time I have actually been asked this question it sure felt like the interviewer was expecting a certain explanation rather than trying to start a conversation. Hard to tell, since that was the only technical question I got asked in that phone interview and they rejected me after that.
If somebody is going to ask those questions in an interview, then at the bare minimum this person must know the subject deeply, so they can recognize a correct answer and judge its quality. Instead, from how people talk about this (I've personally stopped interviewing for a while), it looks like everybody asking those has no idea what a correct answer looks like, and just expects candidates to follow a script.
There are two things to note. First, it is incredibly unlikely that your interviewer knows deeply about both the HTTP protocol and database implementation. Thus, anybody asking both is already full of bullshit. Second, it may be valuable to get a hold of the interviewer's script and study it. People full of bullshit are not normally very creative, so the scripted answer should be easy to find on Google.
If I have to study things unrelated to my job for some shibboleth interview, that's time I could have spent on my actual skills instead. Besides, any interview that does not demonstrate my ability to write a unit test and encapsulate data and functionality means I know what horrors I'm likely to encounter when I get to the code.
At least that's what I try to do when I end up interviewing someone. For example, can he suggest what sensors would be good for a Lego robot that would knock a tennis ball off the ping pong table without itself falling off? Can he describe the logic? How would he test this? How long, on average, would it take? What position of robot and ball would give the worst case performance (success or time)? His job will not involve Lego robots, but IME the person who can give sane answers on those would do a lot better than a coder who knows one toolkit well and nothing else.
Right off the bat, this is a very muddled question to ask.
For one, "sensors"? I think you meant to say "actuators". Because a sensor by itself can't knock anything off a table.
More fundamentally, "Legos", for those of a certain age, were entirely non-mechanical components in their day. So (even though yes, we've also vaguely seen and heard about, and rolled our eyes at, the newer kind), this just sounds like asking them to (seriously) consider the question of how to build a logic circuit out of tinker toys. Yes, it can be done - but on the face of it, that kind of question comes off as an invitation to join a mental masturbation session. Not the kind of question you'd ask of someone you respect, and whose time you know to be as invaluably precious as your own.
The person who can give sane answers on those would do a lot better than a coder who knows one toolkit well and nothing else.
This is of course a false dichotomy. Especially given that it was -- not sure how else to put this -- a silly question to ask in the first place.
What I personally am looking for is an engineer -- someone who can look at problems formulated in a fuzzy fashion (this is critical), figure out what technical solutions exist, probe the customer/user to flesh out the initially vague requirements, and come up with a few different ways to implement them.
You are looking for coders: someone who can take well-posed, unambiguous problem and algorithm descriptions and turn those into working code. That is probably a better fit for the "coding interview" referred to in the title. It is just my experience that I get much better code and value for $ by hiring engineers, even though I have to pay them higher salaries. That's just me though, YMMV.
No I'm not - I'm very much looking for engineers in the mindset you describe. I just don't think your Lego robot question is a good tool for sniffing out that mindset. (For the simple reason that "muddled interview question" != "fuzzy real-world problem" - in fact there's a world of difference between the two).
But then again, we have a disconnect. Which is of course totally fine.
There's no need to make it any "harder" - the point is that it just wasn't well-articulated to begin with. (Noting again that "well-articulated" is very different from fuzzy/ambiguous, in the engineering sense).
And given that the people you want to hire, by definition, are generally very busy (and have many, many important things to do with their time) -- even when they aren't currently working (despite the urban legend that "they always have jobs") -- it just isn't a good use of their time.
The database question seems better, because any programmer should have slight experience with it.
Which inevitably leads to outcomes like:
"Oh I see you're from Kazakhstan, and got top grades at one of the top schools there. And your GitHub account just blew our team away, the minute we saw it. And we can tell, from talking to you for 5 minutes, that you obviously know FP and SQL/NoSQL and database internals and caching and consensus protocols inside and out."
"But wait -- what's that look on your face when we ask you about solving this little Lego problem? Do you not like puzzles? Or wait - you mean you've never heard of Legos, for chrissake? Well, what can I say. Sorry to break this to you, but we're looking for engineers here, not mere 'coders'. And how do you tell if someone has the soul of an engineer? It's because they dig shit like Legos and robot kits and stuff. They totally geek out and go to town on problems like this, in fact!"
"But then again, that kind of person obviously isn't you. Feel free to apply again in 6 months (after you've passed another 3-hour HackerRank test and our mandatory Skype session on culture fit, of course)."
I was asked a similar type of question at Microsoft once, about operating systems. I don't think the interviewer expected me to know the answer to what he was asking, but rather to explain what I did know, and then, based off my limited understanding, work backwards to see what might be done and what tradeoffs were being considered.
At least that's what the interviewer said to me when I responded with "Oh God I have no idea I have never done anything like that before." For what it's worth I did get the offer despite not knowing anything.
That being said, I did interview at another company where I was asked something that I guess the interviewer thought was very basic and I responded with something similar. His advice to me was to never say I don't know something, which I don't really agree with. I didn't pass that one.
Sounds like they're the ones who messed that one up - not you.
I hear it as:
"Tell me how detailed your understanding is of database internals in a conversational format; here is a talking prompt to help..."
regarding typing in the url question:
"Tell me the level in which you understand networking protocols"
In the real world I would hope these questions would be more of a discussion; "ah, you mention ACID, what kind of pattern can you employ to help implement that", "MVCC" etc..
This format of interview is hard to pass on simply regurgitation alone and does a pretty decent job of vetting technical understanding in a natural, conversational way.
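To make the ACID/MVCC talking point concrete, here's a toy sketch (entirely my own illustration, nothing like a production engine) of MVCC's core idea: writers append new versions rather than overwriting, and a reader at an older snapshot keeps seeing the old value.

```python
# Toy MVCC sketch: each write appends a (txn_id, value) version; a reader
# at snapshot S sees the newest version with txn_id <= S. Real engines
# (e.g. PostgreSQL's xmin/xmax machinery) are far more involved -- this
# only illustrates why readers never block writers.
class MVCCStore:
    def __init__(self):
        self.versions = {}   # key -> append-only list of (txn_id, value)
        self.next_txn = 1

    def write(self, key, value):
        txn = self.next_txn
        self.next_txn += 1
        self.versions.setdefault(key, []).append((txn, value))
        return txn

    def read(self, key, snapshot):
        """Return the value visible at the given snapshot, or None."""
        for txn, value in reversed(self.versions.get(key, [])):
            if txn <= snapshot:
                return value
        return None

store = MVCCStore()
t1 = store.write("x", "old")
t2 = store.write("x", "new")
assert store.read("x", snapshot=t1) == "old"   # old snapshot still sees "old"
assert store.read("x", snapshot=t2) == "new"
```

In an interview setting, even a sketch at this level of detail is usually enough to move the conversation on to vacuuming old versions, write conflicts, and isolation levels.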
1) Everyone swears up and down that it's an absolutely vital, fair question.
2) Everyone justifies 1) by giving conflicting (to others' answers) criteria they're judging it by, such that no answer could possibly satisfy them all, except by saying something smart that impresses the interviewer and might not correspond to rigorous judgment criteria.
Many of these questions are very vaguely scoped which gives the confident and familiar plenty of room to show how clever they are, but also makes it more intimidating for those who haven't seen them before.
Or, to put it another way, the linked site looks like a great way to make yourself look like you have a junior-to-mid-level understanding of a lot of different areas. But the process it's optimizing for starts to break down the more senior a developer you either (a) are or (b) are trying to hire.
If you're interviewing for a front end position, you can make a nod to interrupts and continue on with BGP, ARP, DNS, TCP, IP, routers, etc. If you want to show your knowledge of security you could mention heartbleed, poodle, or even say why this isn't vulnerable to shellshock. An overview will take a minute, and the interviewer will follow up with more targeted questions if needed. It's a chance for the candidate to shine.
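The "overview in a minute" answer has a natural skeleton. Here is a rough sketch of the first few stages as code -- URL parsing and DNS resolution are real, while the handshake/request/render stages are placeholders, and 'localhost' is resolved instead of the actual host so the sketch runs without network access:

```python
# A rough outline of the "what happens when you type a URL" answer.
from urllib.parse import urlsplit
import socket

def outline_request(url):
    """Walk through the first few stages a browser performs, as strings."""
    parts = urlsplit(url)  # 1. parse the URL into scheme/host/path
    steps = [f"parse: scheme={parts.scheme} host={parts.hostname} path={parts.path}"]
    # 2. DNS: resolve a hostname to an IP address (A/AAAA lookup).
    #    'localhost' is used here so the sketch works offline.
    infos = socket.getaddrinfo("localhost", parts.port or 443,
                               proto=socket.IPPROTO_TCP)
    steps.append(f"dns: resolved to {infos[0][4][0]}")
    # 3-6. Each of these is a conversation topic of its own.
    steps += ["tcp: three-way handshake",
              "tls: negotiate cipher suite, verify certificate",
              "http: send GET, receive response",
              "render: parse HTML, fetch subresources"]
    return steps

for step in outline_request("https://example.com/index.html"):
    print(step)
```

Any one of these bullet points can be drilled into for the rest of the hour, which is exactly what makes the question useful as a conversation starter.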
I'm always ready when asking something very open ended to follow up after 15 seconds or so to say "let's talk about [area X] first." For URLs, maybe this is "let's talk about what the actual network request, if you sniffed the traffic, looks like."
Well if someone claims to be "full stack" - yes.
"Every step?" Not at all true, in my experience.
What you need is a reasonable sense for the breadth and depth of the topic -- which quickly reveals itself to be far broader and deeper than what most engineers (including many very capable people I know) can hope to "absolutely understand every step of". And the ability to drill down (and go down rabbit holes) when needed. Combined with the experience of actually having gone down more than a few (and having emerged relatively unscathed, eventually).
In fact, if anything, your observation isn't just overstated - it actually misses the point of what makes some of the most effective people as effective as they are: their ability to manage complexity, and in particular, their ability to (reasonably) "connect the dots" in a complex system despite not "absolutely" knowing, at every level of detail, how every component in that system works.
Mind you, for the right value of "reasonably". It's knowing when their level of understanding, while imperfect, is still good enough to solve the problem at hand - and when it isn't - that makes these people so effective.
I don't see that question as being any different really. They're both just ways of guiding a conversation around the area we work in to gauge what candidates know / are interested in.
Personally, I've poked around in Postgres a bit because I find it interesting. And having deeper knowledge means that when you're about to add a new not-null column with a default value to a massive table, you can stop and reason through how you're about to shoot yourself in the face :-)
You need to have a bit of an idea, sure.
But in order to give any kind of a meaningful answer to the question, you need to have not just a "bit of" an idea but, in fact, a deeply solid and intuitive idea. Either that, or crib the answer from websites like these, knowing that it's one of the top 30 questions you're likely to be asked.
Which, very sadly, is exactly what the modern interview process (greatly) incentivizes people to do, these days.
Do they talk about tables? What about indexes? How about SQL? Do they enquire about the requirements / data model?
Interesting, because as soon as I got over the "why the hell do you want to create a new DBMS?" question, my first thought was about filesystems. I haven't dug into any DBMS to look, but all the things you point to look so easy when compared to atomic data writing...
Questions are not inherently bad. It's how the interviewer interprets the answer, and whether it's used as a gateway to a discussion or a one way q&a session.
If your default answer is "Just use PostgreSQL until something screams at us that we shouldn't" then you are correct, you don't need to know about database systems.
Yet, these will be the same people screaming to use <flavoroftheweekQL> instead of just using PostgreSQL.
You'd be surprised how many senior developers freeze when they are asked to reverse a list in any language of their choice.
The better question is why you think that's a good thing.
Asking easy and qualifying questions upfront saves money for the company.
That's a really nasty attitude to take when someone is trying to evaluate paying you a large sum of money for many years of work.
Considering that I've run a ton of technical interviews, there's no good way to evaluate someone for competency without these kind of hypothetical questions. (This is how I screen out the morons who don't know how to use a database.) It's a two-way street, though. You can learn a lot about an organization by the kinds of questions they ask and the direction they send you down.
In the first case, I got a nice offer. In the second case, I cut short the interview.
The questions were virtually the same. In the good interview, some silly fizzbuzz stuff, followed by "how do you do X", and me doing it. In the bad interview, do this thing, in bash. Had to read a man page on it, asked the interviewer if they had man pages, found out they didn't, asked if I could use the interwebs, at which point he became more flustered and overtly hostile. I then proceeded to solve the problem, while typing in vim, while talking him through the whole process. He was visibly angry and shaking after this. Tried a few networking questions, some of which I've known, some of which are far outside my domain, have never had a reason to study/investigate, and were completely unrelated to the task at hand.
The good interview asked me similar things, and when I indicated some of the boundaries of my knowledge, proceeded to work with me on how quickly I could learn something and become useful.
There is a huge difference in these organizations. One of them is thoughtful, has great people, and would be a terrific place to work at.
The other ... not so much. Later confirmed by talking to people who worked there.
Many people are justifying "hard" problems using various reasoning. When I ran my own company and hired, I never used this. What I wanted to see was how someone thought through something they'd not seen/done before. This is what I cared about. Could they learn, and apply this learning, in situ, at high speed, and grow as a worker?
It seems my viewpoint is in the minority here, with a number of people preferring to test random knowledge fragments rather than the actual ability to learn and apply what you learn. This is a shame. It generates groups who are really good at recall, but possibly less so at applying it (as the "applying" part is not tested/measured).
I've seen the focus on velocity and accuracy of regurgitation in grad school. Test scores from people who focused on that were high. Capabilities as researchers ... quite low, with very rare exceptions.
The same effect is likely to occur in this scenario.
Technical interviews are hard. Probing depth of knowledge is challenging. Getting a sense of thought processes requires creating things that people can learn from, and apply.
I've had precious few interviews like this. Happy my current position is one of them.
"How did I get to this situation? Why am I here? Why am I writing my own database system instead of using an existing one? Where should I leave my resignation letter? Should I give them two weeks notice or just quit right now?"
Not only is this a perfectly reasonable question to ask - one is almost tempted to fault someone for not coming back with this as the very first question to ask in response to the question that was initially asked.
Given that 99.99% of the time you're tempted to go that route (as with, similarly: your own ORM, "scripting language", whatever) - you're going to wind up with, at the very least, way more of a challenge than you initially bargained for.
My last job was to work on a proprietary, in house database, and while the case for it wasn't quite as strong as when it was originally written, it still did things that no external offering did that were considered important to our business. I probably wouldn't want to repeat the experience, but I learnt a lot there, and that company is wildly successful, not least because they make truly strategic decisions about buy or build.
You'd be surprised how many people freeze when they are asked to reverse a list of elements.
Another reason to ask fundamental questions is to ensure that an otherwise fit candidate understands the tradeoffs involved.
When I interview, I ask her to write some code, yes. But I make it very simple: fizzbuzz, or reverse a string. I then inform the candidate that I am looking for style, craft and personality. Somewhat akin to a chef being asked to fry an egg.
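For reference, those two warm-ups are about as small as coding exercises get. Here's one plausible shape for each (my own phrasing, not the commenter's):

```python
def fizzbuzz(n):
    """Classic FizzBuzz: return the sequence for 1..n as strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

def reverse_string(s):
    """Reverse a string without the s[::-1] shortcut, two-pointer style."""
    chars = list(s)
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]
        i += 1
        j -= 1
    return "".join(chars)

assert fizzbuzz(5) == ["1", "2", "Fizz", "4", "Buzz"]
assert reverse_string("egg") == "gge"
```

The point of questions this small is exactly what the comment says: the answer is assumed, and what gets evaluated is how the candidate names things, structures the code, and talks through it.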
You're looking at it backwards. That type of question catches people who misstated their qualifications. It's a buy-in to sit at the table.
People can debate whether they should have to answer questions, but in the real world one or two small qualifying questions of this nature save the employer a lot of money and time in the hiring process.
I ask about quantification because I see it as the only way to reduce/eliminate bias.
Answer: I would consider why I'm writing my own database server, as opposed to using something off the shelf.
Maybe what I'm asking is tips on how to steer an interview. One of the reasons I wouldn't even consider a tech giant interview is that I feel like I'm just one of many cattle being processed through a long gauntlet of nonsense that distils me down to quantifiable traits rather than who I am.
I don't even ask softball questions. Mine are more like tee-ball questions. Too often, the tee pitches a perfect game.
The salient point is about halfway through when they put up the big chart of a couple thousand interviews and the stddev vs. mean of technical score, and say:
As you can see, roughly 25% of interviewees are consistent in their performance, but the rest are all over the place. And over a third of people with a high mean (>=3) technical performance bombed at least one interview.
Now, what this points to is a very clear failure of these standardized interviews with their allegedly "tee-ball" questions, because the results shown would be astoundingly unlikely in a world in which the interview process had any kind of consistency in identifying technical skill.
So, no. You're not doing what you think you're doing, and if you've got a high failure rate the problem is almost assuredly not with the candidates. It's with your interview process, or with you.
Also, forked repos are fairly obvious, and there's a dropdown to just show source repos, and if they are manually forking to prevent it from showing as a fork on there, that's a pretty shady indicator that you may want to follow up on...
An interview is not like the SAT. And even the SAT is not an unbiased test.
I honestly don't want to have to test if someone can handle a simple algorithm. (I'm not talking about anything crazy here, mind you.) Can they read code? Can they think about how to test?
I've seen faked code samples, and our test has a half dozen answers on github.
I want to just assume that anybody with five years of programming experience can code, but I've worked alongside enough counterexamples that I know I can't assume that.
That said, I'd really rather not have to deal with it as a candidate or as an interviewer.
If someone asked me a problem that basically reduced to a textbook case of something in the middle, difficulty wise, like reservoir sampling, and I said "oh, nifty, reservoir sampling, what do you use that for in your stack?" and they said "nothing, I just want to make sure you can code," then depending on how my experience to that point in the process had been going, I might challenge them on that to some extent, a la "do you find this more useful than questions that are more representative of the work you're expecting me to do?" If the loop had been going particularly robotically, this might take stronger forms - I've walked out on a mismanaged interview before, though it was way worse than this (and way worse in a very different way - they just weren't at all prepared, and the tooling they wanted me to use wasn't actually functional).
Or they might answer something more like "we use it to select test cases from our logs to ensure that we're adequately testing against real behavior with proper weights" (or whatever—it's been a while since I actually did this in a past job :) ) in which case I'd still try to have what I consider a more "real" conversation ("what primary problem are we trying to solve, and what else might be options to consider"), but in that case at least they had a good answer to the question and I'd be happy to write the code too.
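For anyone who hasn't met it, reservoir sampling itself is short. This is a sketch of the textbook Algorithm R, which uniformly samples k items from a stream of unknown length in one pass:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Algorithm R: uniformly sample k items from a stream in one pass."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)     # fill the reservoir first
        else:
            # Item i survives with probability k/(i+1); if it does,
            # it evicts a uniformly chosen occupant.
            j = rng.randrange(i + 1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(10_000), 5, rng=random.Random(42))
assert len(sample) == 5
assert all(0 <= x < 10_000 for x in sample)
```

The log-sampling use case described in the comment maps onto this directly: the "stream" is the log lines, and the reservoir is the set of test cases kept.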
But lastly, frankly, as a hiring manager I absolutely want to hire people who are willing to "not focus on the task at hand" and question things. I don't want people who just blindly do what I tell them to, I want people I can trust to disagree with me and be able to make a case for why they disagree. Being a manager doesn't mean my instincts are 100% and theirs are 0%.
Yes, because tasks representative of the work we're expecting you to do are too obvious and simple for an interview 95% of the time, and are too hard and take too long for an interview 5% of the time.
It's kind of like asking someone interviewing to be a cashier at a fast food restaurant multivariable calculus problems, because 95% of the math they're going to have to do at their job is "too obvious and simple".
If these high-priority and hard 5% were easily separable from the tedious 95%, this logic would be correct.
This seems a little too general to me. Many things are not clearly thought out, and I'd rather work with people who can recognize when that is the case and communicate well enough to correct the course. A big part of successful development in my experience is knowing when not to implement things, or having foresight to implement things in a better way. Working with people who just do what they're told is often a huge waste of time.
And those people are what we call “assholes”.
Everyone makes mistakes. If you make a habit of questioning people’s competence because they make mistakes, and didn’t ace your whiteboard coding tests beforehand, you are 100% the source of the problem in this industry.
I only take issue with this. Many things in organizations that have been through explosive growth (and sometimes others) are the way they are for no reason in particular, and amount to bandaids upon bandaids, because nobody's ever had a chance to actually look at the process. It very well might suck. Questioning it is the first step in making it not suck.
That said, as an interviewee is probably not the right time to do so. You're an outsider, you have very little rapport or relatability compared to the people that already work there. Even if you can point to something and unambiguously prove that They're Doing It Wrong, your advice isn't in a position to be heeded.
Moving the bit about the scope of the role (jr vs sr vs lead) up to preface the rest of the comment would've toned it down a bit, but even there I disagree with the "taking a risk" part -
a senior candidate shouldn't feel like it's risky to offer alternative perspectives. If I got the impression that the hiring manager thought asking questions would likely be a sign of "cannot focus on the task at hand," that's going to set off my own flags a bit.
If you're married to your interview process I'm worried you'd be even more married to your day-to-day process, and I haven't met a perfect company yet, so I consider being overly adherent to process instead of results a warning sign.
Also, as an interviewee, you stand to learn a lot more from that interaction than by rejecting it. Hell, I welcome coding challenges as an opportunity to size up the interviewer's chops!
If the challenge is conducted in a reasonable way, it should feel like any other design/pair programming session you've had at the office, and it can be quite revealing of the tendencies of your potential colleagues.
> Also, as an interviewee, you stand to learn a lot more
> from that interaction than by rejecting it.
> question things that have clearly been thought out
> ahead of time.
Call me old fashioned, but I thought that was my job as an engineer. To question things and find ways to fix/improve them.
Am I wrong?
If you're focusing on the trivia, you're probably too focused on the code and not on solving the problem -- my experience tends to be that there's at least one or two interesting edge cases you encounter if you try the naive approach, and those are generally what the interviewer wants to discuss. The reason is that discovering edge cases in requirements and solving them is what a lot of the job is, and so how you approach that matters (and to an extent, more than if your code on the board is "correct").
Another approach is to extend the problem by asking clarifying questions -- if there's something unspecified that might influence your design, ask about it and tell the interviewer why it matters. Again, this generally is more what they care about than the details of the coding.
Finally, if those skills make you an asset -- why not apply them to the question? Even tangentially by talking about how you would (if you had more time/were doing a real assignment).
Coding on the white board is more about the journey than the coding, assuming you can write vaguely correct code. (Though writing on a whiteboard is definitely a skill you have to practice.)
"That's a cool problem. I'm curious though - I didn't see that on the job description. I'm not expert on that, and I didn't bone up on that last night. (ha ha everyone laughs)
I can step through it if you want, but it would help me if I knew why you asked about this. If you're looking to understand how I solve problems, that's okay, I'll muddle my way through it and ask if I need any clarifications."
They'll tell you. Interviewers often hold similar views to the candidates, but may be asking for a different reason than you think.
After the interview was done, I asked the interviewer why there was another coding test. He said it was to test different areas of programming. I later messaged the VP/CTO who had written the blog post about not having live coding tests. The person apologized for the experience and said that the team I interviewed with did not fall under the engineering org. It was a frustrating experience, but at least the person responded positively.
Real-life example: an MIT PhD in distributed computing walks into the room, looking extremely tired from interviewing people like me. I can tell this is going to be rough unless I'm controlling the conversation and we have a good rapport.
He's wearing a Fender shirt. I have played guitar a bit. After we introduce ourselves, I open with, "Oh, nice shirt. I assume you play some guitar?" and there go 20 minutes of the interview talking about guitars.
So long as I don't completely bomb the technical aspect I've left this person with a positive and relatable experience.
I have been on both sides of the equation.
1) Brain teaser questions sometimes aren't as easy or as straightforward as everybody thinks--don't ask them as an interviewer unless you genuinely KNOW the question.
Digital VLSI design loops almost always have a question that makes you use a state machine--this is actually a decent question for a whiteboard as long as you don't do something stupid like use J-K flip flops in CMOS. There are a host of basic questions you can use ... but beware of traps.
I was once asked the following question: "You have a serial bitstream that represents a binary number. Keep track of the value of that number modulo 3." This is a decent question ...
Or a really hard question ... depending upon whether the number is being clocked in MSB first or LSB first.
Unfortunately, when I wrote the numbers down on the whiteboard, I chose the "hard" direction. And had to really think ... and the interviewer kept getting more and more agitated and finally complained "Look, this is easy, just do 3 states like this". To which my response was "But clock in any of the numbers that I wrote on the board. Your machine fails at the second clock."
His machine, of course, works just fine for one direction. But that wasn't the direction I picked and documented picking by writing it on the whiteboard. And his machine didn't work with the bitstreams I chose. And 20 minutes wasn't enough for me to see what he got wrong and demonstrate it back. And he didn't understand his question well enough to flag the reverse order.
Exercise for the reader: There are two answers to the question depending upon MSB-first or LSB-first. One answer only requires 3 states. The other answer requires 6 states, but it is NOT clear that 6 states are enough without some serious thought (you can prove 6 are enough inductively if you are clever).
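For the curious, both directions can be sketched in a few lines of software. This is a hypothetical rendering of the two state machines, not the interviewer's original circuit; the function names are mine:

```python
def mod3_msb_first(bits):
    # MSB first: each new bit doubles the running value and adds the bit,
    # so the residue updates as r -> (2r + b) mod 3. Only 3 states needed.
    r = 0
    for b in bits:
        r = (2 * r + b) % 3
    return r

def mod3_lsb_first(bits):
    # LSB first: bit k carries weight 2^k, and 2^k mod 3 alternates 1, 2, 1, 2, ...
    # so the state is the pair (residue, current weight): 3 x 2 = 6 states.
    r, weight = 0, 1
    for b in bits:
        r = (r + b * weight) % 3
        weight = 3 - weight  # toggle between 1 and 2
    return r
```

For example, 13 is `1101` in binary, so `mod3_msb_first([1, 1, 0, 1])` and `mod3_lsb_first([1, 0, 1, 1])` both return `1`, which is 13 mod 3.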
2) I used to always bring to electrical engineering interviews 2 printouts with the same program--one in Python and one in Perl. This meant that when I was asked "Do you know Perl?" I could force the interviewer into MY subset of Perl--and I could explain why I avoided Perl nowadays with a compare and contrast. This simply steamrolled all but one interviewer.
3) I was interviewing with an RF company who was needing VLSI digital designers. A poor slob of an analog engineer drew the short straw for handling the digital questions and was so completely out of his depth that I felt sorry for him. After about half the interview of glazing his eyes with advanced digital architecture, I threw him a bone. I made an off-handed comment about digital really not being digital and how the old audio guys used to use the 4000-series CMOS inverters as audio amplifiers. Baited and hooked. We spent the rest of the interview discussing that as an audio amp (of course, I knew the answers to that question cold--that's why I mentioned it).
4) As an interviewer, I used to run a really difficult section of the VLSI design loop which used to intimidate candidates horribly. I had very stern words with HR that my section was supposed to always be at the end (we almost lost a REALLY excellent candidate because he hit me first, did excellently but THOUGHT he bombed, and then flubbed everybody else afterward). I only had a couple of questions, but they were bottomless. If I had a PhD in front of me, I could make him explain complicated things. If I had a newbie in front of me, I would take him one step further than he was comfortable with and see how he reacted. The candidate steered the interview, but within the bounds I set.
5) My favorite interview was when an interviewer asked me my own difficult question back. It took a moment until I realized exactly what he did, and then I launched 30 minutes into the bottomless pit until I was sure that everybody had glazed eyes and then some. We all had a good laugh afterwards.
At places like Facebook, I heard that you're given several interviews with 2 coding questions to answer perfectly in 45 mins. It's basically luck of the draw whether or not you know the answer, and the best way to game it is by memorizing as many answers as possible. I know this is possible because I have friends that have gotten offers from Facebook and Google by doing exactly this.
All this does, of course, is create an arms race where interviewers ask harder and harder questions. It's frankly ridiculous unless we start taking the algorithmic portion away from the interview.
So these increasingly ridiculous interview techniques have a more sinister purpose. They are an extension of frat hazing tests: a way to become part of an 'exclusive club'. It's just another branding trick to get you to invest so much time and energy in an application that, if you actually get picked, you'll work for less money and worse conditions.
You might be the only applicant, but by suckering you in and getting you to jump through hoops you are less likely to walk away when you finally find out they want God on a stick for two pennies for 70 hours a week.
Look up psychological loss aversion and sunk cost. Humans hate to write things off and walk away.
If I'm hiring a developer, I can look at their GitHub profile and see what style they have and how they work. Then I really just need a chat with them to see if their personality fits the existing team's (ideally you interview them with the team; the team needs to work out whether the new prospect fits their group personality, e.g. whether they think chickens with lips are funny). You pick one based on that process and set them on a side project to see how they perform in practice. If they don't perform during probation, you let them go and try again.
That is a far less costly process to go through than rigging up seven round interviews with everybody.
In some respect if you do set up a seven round interview process and invite applications you actually want to talk to those who declined the process. Particularly if they told the selector that it makes no business sense to have such an expensive and time consuming process and therefore there must be another reason for it.
Because those are then the ones who can look at the system from the perspective of the other guy, work out what is actually happening rather than what people think is happening and then determine it isn't efficient and can be improved. Which is usually what you are after.
> Generation Y interviewers (between 20 and 30): Bring along visual samples of your work and highlight your ability to multitask.
> Generation X interviewers (between 30 and 50): Emphasize your creativity and mention how work/life balance contributes to your success.
> Baby Boomer interviewers (between 50 and 70): Show that you work hard and demonstrate respect for what they've achieved.
This strikes me as an incredibly ageist thing to do. Even looking at the "suggestion" for my supposed group — what? No; answer the question as correctly as you can; if you can't, try to reason though it as much as you can, and try to convince me that you know something.
Some of this is good, but that part in particular … just no.
It tells you the person understands data structures and some aspects of programming, but translating that to productivity/success is questionable.
In addition to competitive programming you've got "collaborative programming". Programming for maintainability, growth, construction for verification, good practices, etc.
A good programmer has a balanced combination of both.
I usually make it a policy not to bad-mouth my fellow coders, but I once worked with a BRILLIANT developer, knew all the trivia, rocked his interview, impressed the hell out of all of us.
He then spent two solid weeks building an email address validation micro-service. It was perhaps the most exquisite and technically correct email address validation micro-service ever built, but was that really what the business needed?
Some people have better intuition than others about when it's time to capture those requirements. Maybe it's better to spend a few more minutes on requirements analysis to make those expectations clear.
Now, sometimes people tend to underestimate some things. URLs for instance, similar to e-mail addresses, can seem trivial to deal with at first but they're actually more complex than people intuitively think. Incorrect handling of URLs is a significant attack vector.
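To illustrate the kind of trap URLs set (the domain names here are made up for the example): everything before an `@` in the authority part is userinfo, not the host, which defeats naive "does it mention our domain?" checks.

```python
from urllib.parse import urlparse

# A URL crafted to fool substring-based domain checks.
url = "https://trusted.example@evil.example/login"

# A naive check happily accepts it...
assert "trusted.example" in url

# ...but the part before the '@' is userinfo; the request actually
# goes to the host after it.
print(urlparse(url).hostname)  # evil.example
```

This is exactly the sort of "looks trivial, isn't" detail that makes incorrect URL handling an attack vector.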
"Coding Interview University" https://github.com/jwasham/coding-interview-university
Coding interviews test for skills that aren't really the same skills you need to do well as a software engineer. I'm surprised the industry hasn't found any better way to evaluate job candidates yet, given how much is at stake.
I also recently put this list of resources together about Algorithms practice which I think can help out anyone who's preparing for an interview soon:
It hinges on the fact that you can get a much better sense of a developer after you've worked with her for a few weeks.
So, here's the startup idea:
Hire a group of developers to work with you on a small project for a few weeks. They come in as a (virtual or real-world) team and build something you've been meaning to do, but that's low on your priority stack.
All of the developers in that group are actually looking for a full time job. As you work with them during those 2 weeks you can see which dev you gel with best and offer them a job.
In the meantime, all the devs get paid for doing useful work for you for a short period of time, even if they don't get hired.
So... the startup is a central agency that engages developers looking for a full time job and then hires them out as a group to help clients do a small project, with a view to picking one or more of the developers for full time work.
This is a win for the agency because they have to do less work to convince a client about a candidate. It's a win for developers because they get paid to be on job interviews. And it's a win for the client because they get to try before they buy and as a result will make much higher quality hires.
Possible brand name: TeamHire
Edit: If you like this please feel free to take it and run with it :)
The problem is that the contracting firms still want to stuff your office full of their workers, so we still need to run diligent interviews. (This means making sure that they don't coach the candidates through the interview, because once the candidates are coached, we can't really evaluate them.)
The good news, though, is that we get much better results working with a firm like this. Most of the people we interview we hire, unlike in the past, where most of the people we interviewed were straight up morons.
Especially since the current job market gives me plenty of much lower time investment options?
Well, for one thing it would probably be a fun hackathon type team experience.
It would also be good for growing your professional network and increasing your luck surface area.
Seriously, though, that's not a bad idea. I've seen similar startups that pre-interview a bunch of candidates (instead of farming them out for small jobs), so this is a different take to solve the same issue. Both sides get a chance to win, either way.
Only gripe: the online IDE you use sucks. Half the time I'm debugging whether I used tabs or spaces and dealing with errors. No matter how careful I tried to be, it seemed to default to whatever I wasn't using and threw errors. Please get a new IDE and add some test cases that tell me how I'm doing.
Otherwise seriously awesome site.
After all, the interview is supposed to be a two-way process. But they usually give the candidate a lot less time to ask questions.
They want to ask me all the difficult questions to see if I'm good enough for them. And I likewise want to ask questions (fairly basic ones) to see if they're good enough for me. Few things suck more than the company demanding a lot from their candidates, and then sticking you in a team with people who don't seem to know the basics.
But I've been doing a bit of the leetcode/hackerrank challenges lately, and believe it or not, I actually think it has made me a bit better, by refreshing bits of knowledge that were getting stale, and really forcing me to think about performance (many test cases in the challenges will fail without the fastest implementations).
And they are kind of fun.