Programmers are confessing their sins to protest a broken job interview process (theoutline.com)
303 points by CoolGuySteve on Feb 28, 2017 | 226 comments

I am a real Software Engineer. I don't remember all the details that are seconds away via Google or other reference documentation. I focus my brain power on thinking about the problem, trying to see it in the wider context, and understanding the needs of the users in my problem domain. I weigh the different solutions, choose a reasonable one, and implement a prototype that I can refactor into shape, step by step. And I have been doing this since 1981, before people talked about Agile, Patterns, and so on.

Best practices are best for a reason, and the reason is not the language used to describe the practice, not the experts who wrote books and give seminars about it, and not some computer science professor's choice of course material.

What is wrong with thinking about problems before rushing into a solution? Since when does a certified computer science solution solve real world problems for real users? (answer: occasionally). The world of software development is driven as much by mindless fads as fashion, entertainment and politics.

But it should be more about hard-nosed engineering, face to face with harsh realities.

This reminds me of my mathematics university professor friend who thought he was a year younger because he couldn't correctly subtract his birth year from the current year.

Why would he? He's a mathematician, not an accountant.

Are you interviewing successfully? I keep getting tripped up by concrete cognitive type in interviews.

I concur.

Whiteboards are useful for conversations about software architecture, but that seems to be the less common application in interviews.

Writing pseudocode you've memorized from a book is fundamentally meaningless. You have to realize it isn't really a programming test: it's a filter for whether you're willing to do what's required to fit in.

The principle is the same as in classic British private schools where boys were required to memorize long passages of ancient Greek and Latin texts. In fact, tech companies would probably get the same hiring results by replacing Cracking the Coding Interview with Ovid's Metamorphoses, and asking about Phaethon's fall instead of binary trees -- the substance doesn't really matter. (The non-whiteboard parts of the process would do the same job as they currently do of ensuring some degree of practical software competence.)

I don't remember the name of the company nor the question, but in the dot-com era there was a startup in the Central Square neighborhood of Cambridge which was notorious for asking candidates one of those gotcha "really hard to answer if you've never seen it before" questions. The catch? Apparently that question was something every MIT CS grad had seen before but which wasn't frequently used elsewhere. Thus they were effectively filtering for MIT grads without being obvious about it.

What an apt comparison!

It appears that you have completely failed to realize that your example from British private schools regarding memorization was used to weed out the "riff raff", as it were. Don't want to hire "brown people", women, or anyone else who isn't a white, privately-educated individual? Institute a British-style rote memorization test and claim it is fair because "anyone can learn the requisite knowledge."

Whiteboard interviews are actually quite similar in both their implementation and impact!

I assure you, I realize the intent of mock-meritocratic memorization tests that happen to ensure a "cultural fit".

Your use of that phrase is chilling. I will never be able to view it the same way again.

It sounds so innocuous... "Cultural fit." But wow does its meaning become sinister when used in the context of (illegal) discrimination.

Actually, I think the Greek and Latin text memorization WAS there to weed out the riff raff. If you are going to hire someone young and energetic to do a job that is not exactly taught in school, would you rather have someone who can work hard, even struggle to learn, but ultimately succeed? Or someone whose character is unknown, even if they can say all the right things in an interview?

And since this is about historical reality, we know that the British public school system of rote memorization of Greek and Latin texts, along with translation into and out of Greek, did produce capable people who built up an empire and maintained it for many generations.

You're making a pretty silly claim. We could just as easily claim that the British empire grew to what it was because British schools had uniforms or it happened because the British eat more biscuits or Marmite.

Rote memorization of Greek and Latin texts probably had very little to do with the expansion and maintenance of the British empire.

TL;DR: Seriously?

> Rote memorization of Greek and Latin texts probably had very little to do with the expansion and maintenance of the British empire.

No, but the sorts of schools that required rote memorization of Greek and Latin texts tended to be 'good' schools that also taught other things considered useful (including things like how to behave in a 'proper' British manner). Testing for Latin knowledge was a proxy for testing for a good general education from a good school.

hahaha this is hilarious. BTW I think you are right.

> The process “freezes out many of the people who are underrepresented in the software development field,” Larson wrote. “If you’re busy working and raising kids, you want to spend as much of your scarce time as possible learning to code — not performing rote memorization that won’t matter once you start your job.”

I'm not sure I buy this argument. If you have one candidate who has seen all of this stuff before and can use it directly from memory, and another candidate who has been busy raising kids and has never seen this stuff before, who do you want to hire, all else being equal?

I could make an analogy to seeing a doctor. Say your kid is sick with a high fever. Do you want to see someone who did an EMT course at a local community college and hit the basic highlights of how to treat different symptoms, or do you want to see a doctor who had to take everything from organic chemistry to biology to anatomy to pharmacology courses? Sure the chance of any given piece of that training being relevant is small, but you want someone whose decisions are informed by a deep knowledge of the field.

> If you have one candidate who has seen all of this stuff before and can use it directly from memory, and another candidate who has been busy raising kids and has never seen this stuff before, who do you want to hire, all else being equal?

The one who's raised kids† obviously.

As that's approximately 3-4 orders of magnitude more difficult -- and indicative not only of general character (by itself hugely important), but of the more generalized cognitive skills necessary to do high-quality software development -- than the ability to memorize a bunch of scarcely-used algorithms. Because, you know, all those websites and Big 4 recruiters told you to.

† And, to bend your requirements (which is necessary because you're presenting us with a false dichotomy), has darn well "seen this stuff" not only in college (unless you think they're lying about that degree), but in the 0.5% or so of real-life development work which actually requires you to conceive, analyze, and flawlessly implement some non-trivial (if not outright esoteric) algorithm right there, on the spot.

That is, I don't dispute that these skills have some importance -- just that the current default interview process assigns a ridiculous degree of importance to these skills, way out of proportion to their actual importance in the field. And to the exclusion of other, more generalized skills which are actually way more important, but alas, much harder to measure -- with the current crop of faddish (turn-the-crank) interview techniques, anyway.

I would say further that if you are in a position to be implementing some non-trivial algorithm that's not currently in the toolbox, then your best approach is probably to go read a few papers first, and make sure that the algorithm you're looking at has not been surpassed. (Or, perhaps, you just need to dig something up from the '70s. A lot has been forgotten.)

Then again, if you can implement Paxos on the first try...

The best personal analogy I can think of: six or seven years ago I was updating some legacy code and came across an unsalted MD5 hash being used. "Huh, I thought MD5 was becoming obsolete." I did my research to confirm that, and also to determine the best hashing algorithm to move forward with.

Of course that is not exactly the same as some other problems presented by white board coding.
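That kind of migration is mostly about knowing what to look up, not memorization. A minimal sketch in Python, assuming the legacy scheme was plain unsalted MD5; the scrypt cost parameters below are illustrative, not a tuning recommendation:

```python
import hashlib
import hmac
import os

# Legacy scheme found in the codebase: unsalted MD5. Fast and saltless,
# so it's vulnerable to rainbow tables and cheap brute force.
def legacy_hash(password):
    return hashlib.md5(password.encode()).hexdigest()

# One reasonable replacement: salted scrypt from the standard library
# (hashlib.scrypt, Python 3.6+). Tune n/r/p for your own hardware.
def hash_password(password, salt=None):
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify(password, salt, digest):
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(candidate, digest)
```

Since you usually can't recover the plaintexts, the common trick on migration is to either wrap the stored MD5 digests in the new hash, or rehash each password the next time the user logs in.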

I just googled Paxos and I still can't implement it. =(

Try Raft. It's the easy version that works most of the time.

Meh... I think Paxos is easier than Raft. A lot of Raft's complexity is in leader election, getting rid of old leaders, etc. Sad that I have failed Google's whiteboard interview soooo many times that I don't bother returning messages from their recruiters.

Raising kids is a metric of nothing other than having had sex.

Every parent I know is a "great mom or dad" simply because you're not allowed to say people aren't great parents unless they are truly bad.

I was referring to the practice of raising kids. Not to participation in one side or the other of the insemination process.

Whiteboard programming isn't very relevant to day-to-day programming.

Raising kids is doubtlessly even less relevant.

> Raising kids is doubtlessly even less relevant.

Right. But if done successfully (or even just satisfactorily -- given how difficult a job it truly is), it at least has some correlation with character and general cognitive fortitude.

And hence -- unlike the ability to withstand repeating whiteboarding sessions -- has at least some correlation with actual fitness vis-a-vis the role being applied for.

"So you seem to have the experience we're looking for and our engineers seem impressed with your interview. Just one small final thing, a formality really, could you bring your daughter in tomorrow so we can talk to her and see if she's been brought up right?"

If whiteboard testing is frowned upon, I imagine testing a candidate's kids would be doubly frowned upon.

No worries; I was responding rhetorically to a rhetorical question. I wasn't suggesting that it actually be used as a hiring filter.

> Raising kids is doubtlessly even less relevant.

That's a bit dismissive. From personal experience, if one isn't a complete failure, raising kids to a satisfactory degree will exercise one's managerial and time-management muscles, not to mention the ability to work under pressure and sleep deprivation. Parents of infants will know that last one all too well.

I reckon that'll change significantly as AI grows increasingly advanced and complex.

A better analogy would be if hospitals had doctors do o-chem problems during interviews. Who would do better? The student who just took the class, or the doctor who's been practicing for 10 years?

Also, doctors look stuff up all the time.

Yes, that's why the technologies underlying Watson are seen as threatening their jobs.

> If you have one candidate who has seen all of this stuff before and can use it directly from memory...

Then this candidate has likely done many tech interviews before, what does that tell you?

> who do you want to hire, all else being equal?

That's the point. It's freezing out people who are equally competent, but can't perform as well at this arbitrary step.

They're equally competent except that they can't design an algorithm on demand based on remembered strategies for solving problems, or talk intelligently about the trivia of the field that can come up in a technical interview, or tell you the asymptotic running time of a given algorithm, or comfortably rattle off a bunch of code from experience without needing to look things up online, etc.

That's not the same as being equally competent.

Rushing into coding something using abstract mathematical computer science techniques is a poor way to solve real world business problems. The person who has spent time in the real world raising kids realizes that you do not build business value by being a smartass know-it-all and that when time and money are of the essence, something quick and dirty, maybe even brute force, will save the business, please the customer, and provide you the money to fight another day.

Do not hire the smartass. Take the person who lives in the real world.

> Rushing into coding something using abstract mathematical computer science techniques is a poor way to solve real world business problems.

(If you want a disclaimer, I'm not really sure which side I come down on with this, but:)

Knowing how to model something abstractly, and what known good algorithms might be applicable, is exactly not 'rushing into coding'.

It is when it's done within an artificial fifty minute-long timeframe.

This is a strawman argument. Nobody is saying that algorithms and data structures knowledge is preferred over pragmatism, commercial awareness, and real-world problem solving.

What brianpgordo is saying is that knowledge of algorithms and data structures is an important quality in addition to those.

That presumes a specific type of engineering role.

Here's another: the founders already coded all the brute-force versions of everything. Now they want their software to scale. You will use your knowledge of algorithms and data structures (and distributed systems) to turn their one node that handles 10 events per second into a cluster that handles 10 million per second with low latency across several geographic regions.

Yeah but at this point how are you going to be able to know what to ask? If you have someone who's a better algo ninja such that they can do a whiteboard interview on the things that matter, then have them write your solution.

If you don't, either look for someone who's done something like your problem before, and ask about resume/experience instead of whiteboard gauntlets. Or acknowledge that it's a research&implement position, and... again, skip the gauntlet.

Those algorithms were discovered by PhD students who spent years coming up with them. I doubt someone will invent them in a 20-minute window without any reference material.

The question is really asking whether you've seen it before, period.

I neither agree nor disagree with you, but what concerns me is that the same logic can be applied to the "top tier university" farce:

"They're equally competent except that they didn't go to a school like MIT or Carnegie Mellon known for the highest quality CS departments. They obviously would've been accepted if they were top-tier grad material, so they are qualitatively not the same as being equally competent to someone who graduated from one of those programs".

I do agree with you inasmuch as they need to be able to speak intelligently about CS. Whether or not they can recite some textbook taxonomy of algorithms, they need to know how to describe and critique an algorithm or describe how a program works.

Equally competent, but lacking in basic knowledge? It's a miracle!

> If you have one candidate who has seen all of this stuff before and can use it directly from memory

Except that the continued success and popularity of books like Cracking the Coding Interview suggest that this really isn't the case.

The candidate who has spent the last month cramming on the best way to reverse a string in place, or the difference between the plethora of types of trees isn't guaranteed to be the best candidate for your position. All they're proving is that they can pass your screening process.
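(For what it's worth, the in-place reversal everyone crams is only a few lines -- a minimal two-pointer sketch in Python, operating on a list of characters since Python strings are immutable:)

```python
# The classic drill: reverse a sequence in place with two pointers.
# O(n) time, O(1) extra space.
def reverse_in_place(chars):
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]
        i += 1
        j -= 1
    return chars
```

Knowing this cold says very little beyond having practiced it recently.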

Edit: and to answer your question, I'd rather have the person with 20 years of experience in the field who can't remember how to draw the Lewis structures for arbitrary molecules off the top of their head than the doctor fresh out of school who doesn't have the slightest clue what this blotch on my abdomen might be.

Worth underscoring that point - these processes lead you to hire worse candidates.

One problem with that argument is that the person who's been doing it for years and has seen it all will not be able to do it directly from memory, because those problems almost never come up in practice.

It's like interviewing a doctor by giving them the symptoms of some rare third-world disease that they maybe studied back in school, and judging their ability based on how well they diagnose it.

I did my master's degree while working full time. At 5pm I started studying and doing homework, and continued (except for a dinner break) till 12 or 1am. On weekends, the entire time went to studying and homework.

Then came interviews, where I was pitted against students who took 2 classes a semester and spent many hours a day practicing with things like LeetCode and Cracking the Coding Interview.

I have never had to use any code I have seen in that book or the website's competitions. Asking people to memorize it for a whiteboard interview seems crazy.

One time, someone asked me: then what? I don't know what is best, but seeing who finds the best answer on Stack Overflow would be more realistic.

You can probably read and meaningfully cover CTCI in about 8-12 hours total. Double or treble that if you read slowly and do every single question.

I'm not defending whiteboard interviews, but I don't think it's a secret that very, very few candidates answer the questions from real experience. Rather, it's obvious that the main approach to dealing with whiteboard questions is to read up and memorize the stuff before an interview.

To what extent are companies using whiteboards to test who is serious about getting the job? And who is willing to put in extra time and occasionally do unpleasant extra work?

Again, I'm not defending whiteboards, and I can think of ways to improve them, especially for more experienced engineers. But all else being equal in terms of two candidates' experience, skills, and fit, the candidate who was able to memorize things just for the sake of the job, who did research, and who came prepared might well have the upper hand when the hiring decision is made.

Right or wrong, companies do want to test for things besides current programming skills, like being a team player and doing things that aren't easy or enjoyable. The 80/20 rule maybe applies. How else to test this stuff if companies decide it's important?

Except that my experience is interviewers (in general) prefer to discuss abstract math puzzles they clearly looked up in a book themselves to exploring a relevant question I propose from their actual business domain.

I don't mind "how would you implement this basic server idea" or even "talk me through how you'd explore this algorithm puzzle", but I've definitely been docked for not having $OBSCURE_INSIGHT into $RANDOM_PUZZLE in 15 minutes, never mind that I can usually get them sometime that evening when not standing at a whiteboard, being interrupted whenever I pause narrating for 15 seconds to think -- which is an appropriate timeline for any actual algorithms work. (I mean, I can count how many times in my career solving a math puzzle couldn't wait a single day: 0.)

Of course, I suspect some of those interviews are designed to fail you as part of H1B scams.

> To what extent are companies using whiteboards to test who is serious about getting the job? And who is willing to put in extra time and occasionally do unpleasant extra work?

But it doesn't say anything about who's serious about getting that specific job, if every tech company does the same kind of whiteboard interviews.

> Do you want to see someone who did an EMT course at a local community college and hit the basic highlights of how to treat different symptoms

I'd ask the 2 doctors to perform the exact treatment on their own kids, look at the outcome, and then decide.

Basically, "build me X" instead of "show me your algorithm skills on whiteboard" is a far better proxy.

>build me X

Pay me

Nobody has asked me that, because this normally happens within the typical 3-hour interview scope, but honestly that's fine too.

I'd personally take the EMT. The doctor in this scenario sounds like someone who lacks field experience. Deep knowledge of anatomy and pharmacology doesn't do a whole lot of good if you pass out upon seeing blood :)

Same for the prospective programmer who has raised their own kids. That translates to a real-world understanding of problems, rather than being too deep into the theoretical side.

What do you think the doctor is doing when he walks out of the room after seeing you? That's right: they are looking things up in a book or on their computer.

Your argument is invalid.

I don't think the technical test is a problem per se; it's the way we implement it in our field.

I think it's perfectly reasonable that actuaries take an exam that shows knowledge of vector calculus. I don't think it would be reasonable to ask a senior actuary to do vector decomposition in an interview, because he or she has already demonstrated this ability on a proper and widely vetted exam.

Similar things go for medicine. It's totally OK to have boards and exit exams; it's crazy to be asking these questions of a radiologist with 20 years' experience during an interview. Same for lawyers. They must take the bar, but they don't have to retake it every time they change jobs.

In programming, we have no widely recognized, properly vetted, industry credential. So we leave it to private companies, who subject applicants to some variant of an industry exit exam over, and over, and over. And what's worse, they do it in secrecy with minimal feedback to avoid lawsuits.

It's horrible, and if the industry is experiencing a "shortage" of candidates, that's a good thing. It's a sign that things need to change. The government should not rescue this industry from the consequences of their actions by creating special programs that allow corporations to force would-be immigrants to subject themselves to these interviews and work in this field as a condition of living and working in the US.

Let people be free, and they'll make their choices. If tech is too abusive, then they'll either have to go without workers, or change. But seriously, take away this crutch of employer controlled visas. I'm not saying don't let talented people into the US, I'm saying let talented people decide how and where they'll work, don't let corporations control that and dictate their choices.

It's counterintuitive, but whiteboarding is actually more inclusive, IMO. Anyone who has 35 bucks can lock themselves in a room for 3 months with a copy of CTCI and a pad of paper and come out with an $80k+ salary. You can't do that in any other industry.

If we judge candidates purely on work experience, that just favors the incumbents who have already broken into the industry and are presumably already quite comfortable in life.

That assumes the interview accurately measures something useful. The more I read about this area the more I think a 1-2h pre-interview practical coding assignment, then a 1 on 1 discussion during the interview is better. You then get to understand how they think, how they design solutions, and some sense of how they interact. In other words, how they'd do the job.

Good design instincts take lots of hard-earned real-world experience to develop. I wouldn't expect the average 21-year-old programmer to really know how to design stuff; maybe they can give me crammed textbook answers, but that is as "irrelevant" as whiteboarding.

I think a good approach is to not emphasize the whiteboard as much for senior hires, so that people with good work experience can mostly get in on their resume and design questions, and people with less experience still have the opportunity to break into these top companies by grinding algo questions.

I once hired a senior programmer because of his perfect resume, and it was one of the worst hiring choices I ever made. He was probably a net loss for the company. What difference can there be between the CVs of two people who both have 15 years of experience in the industry?

Many interviews are not pure whiteboard algos. And many interview processes are different, each company is unique.

When I interviewed a couple of years ago, I saw various kinds of interviews at companies:

* 1-2hr on site coding exercise, with the internet, alone.

* 'Here is a 100-line piece of code; fix all the problems with it.'

* Chat with product / manager / team lunch. ('Are you an asshole' test / 'culture' interview / scenario testing)

* Design / outline an app that does X, with boxes and lines on a whiteboard.

* Solve this algorithmic problem on a white board / on your computer.

The reason everyone freaks out about the algo interview is that it's the only interview type most engineers have to prepare for. For the other four, you usually don't have to do much prep work at all.

> a 1-2h pre-interview practical coding assignment

Then you'll only select for people who have nothing better to do with their time than to donate it to your fun programming escapade. Experienced devs will say it's not worth their time and won't apply.

> then you'll only select for people who have nothing better to do with their time

1-2 hours on a practical assignment vs 3 months cramming suggested by the parent comment. Which of those has nothing better to do with their time?

> Experienced devs will say it's not worth their time and wont apply.

Will they? I'm an experienced dev and think this approach sounds great.

The problem is that in reality, what can you code in 1-2 hours? Of the coding assignments I've seen, many have started to bleed into requiring 5 hours or more.

No, it's just moving those 1-2 hours to home instead of at the interview. If you can take time off for an hours-long interview, you have the time to complete that kind of task.

It's asymmetrical though. A reviewer can yea-or-nay my 1-hour assignment in a minute or two. If I'm desperate for work, sure I'll do it. But my position is in high enough demand that I can choose another position that doesn't require me to do homework. If they made some attempt to get to know me, talk to me, and then give me homework, I might bite. But I'm not going to do hours worth of work just for the chance to get a job.

We give these assignments but if we give you one you're already coming in for an interview. What I dislike are the companies who use it as a screen like you describe.

Why would you apply for a job that you don't want? And if you want it, why wouldn't you put at least some effort into getting it?

Effort does not only include writing a cover letter that will be appealing for the company you apply to instead of just forwarding HR your resume. It also includes small activities like these.

You sound more like a spoiled brat, pardon my French, than an artist looking for new opportunities to make a difference.

No he sounds like a skilled artisan who is bringing his unique skill set to the table in an equal trade. He doesn't need to doff his hat and thank the kind Master for this opportunity.

You sound like someone who doesn't realize this isn't 2006 anymore and people stopped pretending to change the world with every new job.

You don't need to change the world, you need to make a difference. Without making a difference, you're no better than the next person of equal skill who wants a bigger TV and a fancy car.

To make some tech company rich? Is that what "making a difference" means to you? To help fund the CEO's golden parachute?

> an artist looking for new opportunities to make a difference.

The only "difference" I want to make is in my checking account. I want the deposits from my employer to "differ" by greater margins as time advances.

I'm good at my job. I solve problems that provide business value. It has nothing to do with art. If you ask me to work for 2 hours, I'm sending you an invoice.

Do you not put honor in that piece of work? If no, carry on. If yes, you're an artist who cares.

> Anyone who has 35 bucks can lock themselves in a room for 3 months with a copy of CTCI and a pad of paper and come out with an 80+k salary.

If anything, this just tells us what a low bar to entry the CTCI drill is.

Say you had a pool of candidates of roughly comparable intelligence, educational background, and work experience. And then you gave them 3 months of all living expenses paid† to go and "improve themselves as software development professionals."

One of them comes back and says "I cranked out an MVP (relying in part on a language I only barely knew to that point) that's actually being used by people, and getting good reviews. Here, check out the repo."

Another says, "I thought it was about time I got a handle on this machine learning stuff. Here's a project I did, based on a generalization of some ideas in such-and-such paper. Wish I had more time to spend on it, but looks like my error rate's not too bad."

And another says "It took literally 6-8 hours of devoted practice, every day. But I finally got some serious traction on Japanese. I honestly thought I'd never be able to do that."

And then there's the guy who says "Oh, I locked myself in my apartment and did every exercise in CTCI. Because, you know, I heard that's what you're supposed to do. Damn, my back is sore. I think I need a better chair or something."

Who would you rather talk to first?

† Which costs a lot more than 35 bucks, BTW.

I get your point, but I'll also point out that it costs a lot more than $35 to lock yourself in a room for 3 months. Many people don't have guaranteed room and board for that long.

>Many people don't have guaranteed room and board for that long.

If a person has been in the industry for more than three years and doesn't have a 3-month cash reserve saved up[1], they are severely mismanaging their finances. _Particularly_ since we're seeing increasing warnings from the financial sector that we're in a bubble already.

[1] barring some unexpected catastrophic expense

Whether or not they have it saved, they still have to spend it.

Having the ability to lock yourself in a room and study for three months is an enormous privilege, particularly if the thing you're studying is only marginally useful.

This is the amazing thing about IT. Also, once you are in, the computer doesn't care what your background is, what your colour is, etc. It will refuse to compile the bad code of an Ivy League grad.

I'm not totally down on whiteboard interviewing, but I think it is difficult to do well. By "do well", I mean get a candidate into a state where they aren't feeling like a "deer in the headlights". For some candidates, That Is Never Going To Happen. As a result, the "It's just an interview. I wonder if they'll buy me lunch?" candidates like myself are going to "score" better than some really great candidates.

I've been doing this coding thing longer than many (most?) of you have been alive. I know enough to choose a suitable algorithm for the task at hand. I ship product. I copy paste out the wazoo.

That said, the people interviewing you aren't professional interviewers; they are developers who are winging it. They may never have had a "good" interview themselves, so they repeat how they were interviewed, i.e. a bad interview. Like the candidate at the whiteboard, the interviewer is out of their element as well.

I actively interviewed job candidates while my wife went through her licensing process to be a doctor.

A doctor's job interview spends no time making sure that a doctor is competent. Why? Because the licensing process does that.

The most frustrating part of interviewing a software engineer job candidate is that a degree doesn't show that the candidate is competent. Thus, we need to waste our time on imperfect ways to verify competence when instead we should be selling the organization and judging team fit.

IMO, we need some kind of licensing process where, as an industry, we agree on a way to judge that someone is a competent software engineer.

I'm skeptical of licensing as a means of determining ability.

No other engineering discipline uses licensing (with the exception of PE roles, for obvious reasons) to determine ability. Think about how mechanical engineers get hired, for instance. "What were some technical challenges you had in your last project? How did you solve them?" would be a couple of leading questions into technical discussions with a candidate.

Note that those questions are not specific to mechanical engineering: they are applicable to every engineering and software role.

In the interviews I have conducted, asking exactly these types of questions has generally given me a pretty good sense of the technical ability of every candidate, with no whiteboard session or coding homework required. It only takes 30-45 minutes, and I have also found that this amount of time is enough to get a sense of a candidate's culture fit as well.

Whiteboard interviews only test how much of a candidate's memorized corpus overlaps with the question presented; as a result, I find that process is only 5-10% as effective as a technical discussion. I want to know if and how (and how well) a given candidate thinks, not if they know the same things I do.

Code homework is only marginally better. Most (better) companies have a 90-day probation period anyway. After hire, if a developer is not performing up to expectations for a given role, they can be let go. This is exactly how it happens in every other engineering profession.

California is an at-will state. Silicon Valley is in California. Tech employers should just embrace the fluidity of the market and mandate probationary periods instead of long, arduous interview processes. Candidates are already mobile enough anyway; if they're let go, let them work to improve themselves until the next gig.

I feel ambivalence about formal licensing, but I do agree this would be a positive element.

I would be, even 15 years after the fact, perfectly happy to study intensely for an exam on basic math, data structures, and algorithms, similar to the initial mathematics exams used for the actuarial field. Such an exam would be a lasting and widely respected credential, consistently administered and graded. I wouldn't want to put this kind of effort into a single corporation's exam, because those are graded in secrecy, I have no idea what the credentials are of the people who evaluate it, and I don't get much (any) feedback other than "we decided not to continue with your application at this time...".

My worry comes from regulatory capture. I'd hate to see something like a bar association that forces people to go through a very expensive three-year postgraduate degree, putting them deeply in debt, in order to meet licensing requirements. Would the governing body be allowed to deny math or physics or non-CS engineering majors the right to take the exam?

Also, imagine if people who wrote the EJB spec or the ones who decided all client side code should be written in tag libraries gained control over the licensing procedure? Would the test-driven advocates be able to revoke the license from anyone who publicly questions the value of test-driven methodologies?

This could be great - something along the lines of the actuarial exams is appealing to me, as they are rigorous but allow different educational paths to prepare. But this can also go very wrong; it is a real risk.

Great description of the problem. Licensing is going to be difficult in quickly changing fields.

I can imagine licensing being a terrible PIA for programmers. If you work only in c# at work and learn some new language during your free-time, getting a license for the new language could be so time-consuming and not worth it (studying the obscurities of the language and actually taking a test). That is if there is a new license for every language and job type.

Just create a license for algorithms and data structures, since that's the bulk of whiteboarding anyway, since they "don't ever change."

I've been paid to write code for 20 years or so and can't remember any of those friggin algorithms.

Being able to understand a problem enough to successfully google an answer is the most valuable skill for a programmer.

How do you know the answer you find is a good one?

Say you think you need a binary tree, so you can get O(log n) lookup; you look it up, implement it, it works, great. Not bad.

Then a lady comes along who actually understands the problem domain, and knows that cache locality will be a big win in this case, and she uses a sorted array with a binary search. Latency is cut in half and the company makes a brazillon dollars. Better.

And then along comes a 10x programmer who realizes that you can avoid the lookup entirely if you partition your data along this other axis. Latency goes negative and your company makes a few more brazillions off time travel. But the 10x programmer is a misogynistic asshole who can't balance a binary tree on a whiteboard. Now I have no idea which completely fictional character to hire.
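(As an aside, the sorted-array trick the fictional lady uses is just textbook binary search over contiguous, cache-friendly storage. A quick sketch, purely illustrative:)

```python
def binary_search(arr, target):
    """Binary search over a sorted list: O(log n) lookups like a balanced
    binary tree, but backed by one contiguous array (better cache locality)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1  # not found

print(binary_search([2, 3, 5, 7, 11, 13], 11))  # -> 4
```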

Your argument is flawed and represents what's wrong with our industry's interviewing process. Respectfully, of course.

For most developers we don't need to even consider these trade-offs. It really depends what stack you're working with. Are you a front-end engineer? Then this doesn't even make sense, because you wouldn't be handling enough data for the O() to matter anyway.

I'm an iOS developer and quite frankly you shouldn't be doing that much business logic on the phone. We implemented an algorithm that had a O(n^4) runtime without optimizing it because the dataset was so small it wasn't worth the time to over optimize.
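To put a number on that: with a small fixed dataset, even a quadruple nested loop is effectively instant. (An illustrative toy with a made-up n, not the actual app code:)

```python
import time

n = 30  # a small, fixed dataset size
start = time.perf_counter()
count = 0
# Quadruple nested loop: O(n^4), the shape of algorithm being described.
for a in range(n):
    for b in range(n):
        for c in range(n):
            for d in range(n):
                count += 1  # stand-in for the real per-tuple work
elapsed = time.perf_counter() - start
print(f"{count} iterations in {elapsed:.3f}s")  # n^4 = 810,000 iterations
```

At n = 30 that's under a million iterations, which is why optimizing it wasn't worth the time.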

You're looking for an extremely rare case and discrediting 95-99% of the time we're programming. Sure we can do speed-ups, but do you really have to filter every single developer for those criteria? You really don't. Guess what? Unicorns are built off the backs of mediocre developers. The codebases, once they start expanding, are highly unoptimized. Slack was built by contractors. Twitch apparently is still having growing pains on their iOS app because the original developer wasn't very skilled. I was turned down by Thumbtack even though I knew more about iOS than the iOS team lead that interviewed me, because they valued CTCI more. The iOS team lead was taking advice from me about system design. What would you rather have? Someone that knows system design, or someone that memorized how to serialize an n-ary tree? What is actually going to make you the bazillion dollars? A functioning app that can scale, or someone maybe helping you serialize a data structure (with a very well-known solution) on an iOS app, which is completely useless?

It's simply an illusion that you need to have developers who have memorized useless problems in the case that they may one day possibly might see an issue they could optimize 1% better by introducing more complicated code.

For 99% of situations in normal software dev, just being able to find X in collection is enough.

I interact with slow and/or battery draining software every day that annoys me.

1% might be about the share of developers who work on battery drivers, video codecs, or similar stuff where performance or battery drain is actually relevant.

Nobody ever needs to code binary trees, outside of interviews and a few library developers.

As a single point of data, this is not the case at my job.

Yeah, I kinda knew I was being overly categorical...

Binary trees have been around for decades, and I assume there are solid libraries for all useful variants of them, and that everyone can use them.

I'm probably wrong, but in what way?

Care to share with us the nature of the work or maybe some examples?

I'm kind of sick of doing CRUD apps. Maybe hearing about what you do would inspire me to explore something new.

I've come to the conclusion that the three-algorithms-on-a-whiteboard interview is generally worthless for determining an engineer's on-the-job performance.

Instead, "Build X in 1-2 hours using your computer and resources available" was always a better proxy for productive engineers and largely removing both false positives and false negatives.

Anytime there's a blog post or a thread complaining about the 'broken coding interview process' it ends up being upvoted by everyone and their brother on HN but very rarely have I seen people offer a reasonable alternative instead.

I work for a big tech company and for our org, typically we have a couple of Algorithms/Coding rounds, a system design round, a technical communication round and a loosely structured interview/chat with a hiring manager. I think this works because people, at least on my teams, have to work on a wide array of features that could involve writing backend queues, or Hadoop jobs or some business logic on the API layer that would need to leverage caches/DBs/other appropriate key-value stores. It definitely does help to have people who have broader knowledge of Computer Science to be working on those features.

I feel our interviews have low recall and precision but a good accuracy (we end up hiring smart people we can count on to pick new things up). And FWIW I don't think we ask unreasonable questions during the coding rounds. I also feel it's a reasonably scalable way to organize a generic 'objective' and 'fair to all' interviewing process at a big company that can hold a reasonable hiring bar.

If I was running a smaller company then I probably would have kept in place a process that was closer to the bare practical requirements of the job.

Also, if the existing interview process was really that bad in practice with an abysmal correlation with successful hiring, wouldn't the companies have dropped it already?

I know this kind of sucks for applicants who think they have all the skills that are needed to practically do the things they'd have to do at the job they're interviewing for (and in some cases rightly so), but arguments bashing the current interview process would seem more valuable with a proposal for a better alternative that comes off as reasonable after the same amount of critique and scrutiny that the current process gets.

Also, I often get confused - do people not agree that there's any correlation with hiring good engineers and the current process, or do they just think companies should have a more developer-friendly interview process? If it's the latter, then do the companies have any incentive to do so? It can't be that 'their potential hiring pool becomes wider' because if that really was such a big problem they would have changed already.

(All opinions mentioned here are mine and none are my employer's, obviously)

(Edited comment twice to attempt to make thoughts more coherent)

I think there is nothing wrong with asking a candidate to do live coding/algorithms given that they could be solved in reasonable amounts of time and the interviewer has practice in helping a candidate feel comfortable doing so.

If I'm in an interview and I feel judged/nervous enough, I'm going to struggle to spell my own name, much less write a merge sort. If I feel like I'm having fun and the interviewer is just sort of pair programming with me to make sure I'm not BSing that I can code, then sure, I can probably do most medium-level algorithmic problems after some stumbling around and false starts.

I'm with you on that and I do try to be as friendly as possible whenever I'm interviewing people for that reason. If you think there's anything else people can do to help in that regard then I'm all ears.

Edit: I guess companies should at least have a screening and training process for the people they allow to interview but, generically, I think this often gets overlooked in a lot of big companies.

Here's a heuristic:

Instead of trying to find the weaknesses of a candidate, try to find their strengths. You will genuinely be surprised at what kind of amazing programmers are out there.

I was recently rejected in a programming test because I designed my own implementation of a graph which was unlike the classical Graph class (containing addEdges(), findPath() etc.).

If the interview heuristic was "trying to find his strengths" instead of "oh..there he goes. Doesn't know textbook implementation", I would actually be picked for being a smart programmer.
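(For context, the "classical Graph class" usually means an adjacency-list shape roughly like this; the method names below are the conventional textbook ones, not taken from that actual test:)

```python
from collections import deque

class Graph:
    """Textbook adjacency-list graph with the classic interview methods."""

    def __init__(self):
        self.adj = {}  # node -> list of neighbors

    def add_edge(self, u, v):
        """Add an undirected edge between u and v."""
        self.adj.setdefault(u, []).append(v)
        self.adj.setdefault(v, []).append(u)

    def find_path(self, start, goal):
        """BFS for a shortest path; returns None if goal is unreachable."""
        queue = deque([[start]])
        seen = {start}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == goal:
                return path
            for nxt in self.adj.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

g = Graph()
g.add_edge("a", "b")
g.add_edge("b", "c")
print(g.find_path("a", "c"))  # -> ['a', 'b', 'c']
```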

This is a tough subject. I don't believe deep understanding of algorithms and data structures is necessary for the vast majority of engineering tasks. Most (almost all) engineers can get by understanding complexity notation and a matrix comparing time/space complexity for various algorithms and data structures, no memorization required. And even that is overkill, as most of the time the "natural" one suggested by a problem will suffice. For example, "I need to map keys to values, so I'll use the Python dict class" will get the job done with acceptable performance almost every time.

Whiteboard interviewing based on these things isn't useful unless the candidate is being considered for a position in which knowing these things is fundamentally necessary (e.g. they'll be responsible for developing new algorithms or squeezing out the last 10% of optimization available).

On the other hand, substituting requirements for having public code profiles, and the assortment of take-home projects, in-office projects, pair programming to solve a problem the company is facing (especially if unpaid) is just as bad, for different reasons.

And asking no technical questions at all is certainly problematic.

I just don't know what the solution is. It's one of those things where I know a bad interview when I see it.

Agreed. The Google style questions are silly. But an engineer really should be able to at least answer an EASY question. Stick with a question where the answer is "put it in a hash table". No trick questions. Just a baseline, does this person have a pulse, so nobody's time gets wasted.
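(A hypothetical example of that kind of easy baseline, where "put it in a hash table" is the whole answer:)

```python
def first_duplicate(items):
    """Return the first repeated item, or None. A hash set gives O(1)
    membership checks, so the whole scan is O(n)."""
    seen = set()
    for x in items:
        if x in seen:
            return x
        seen.add(x)
    return None

print(first_duplicate([3, 1, 4, 1, 5]))  # -> 1
```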

I throw real life problems I've run into (and fixed!) at people and ask them to give me things they would do to troubleshoot, and compare it to what I did. Always nice when somebody suggests something new.

I also ask 10,000 foot questions. "Say you're wanting to build a new app that behaves like Instagram. What's your hosting platform, your technology stack, and explain why?".

> I throw real life problems I've run into (and fixed!) at people and ask them to give me things they would do to troubleshoot

Yes. How does the person think and work? I've interviewed ops folks who are technical gurus who've walked through a problem with me and pulled up with "well, in that case the server is fine and it's not my problem". They can work somewhere else; I don't care how good they are at "making the server work".

I've also interviewed ops people who are all "well, if performance in production has degraded I want to sit down with the source control tools for your config management system and apps and work through the changesets with the devs and work out where we should start looking." I don't give a fuck if that gal needs to google some of the answers to the problems she finds, she's got what I want.

>I also ask 10,000 foot questions. "Say you're wanting to build a new app that behaves like Instagram. What's your hosting platform, your technology stack, and explain why?".

Please forgive the uncomfortable question but isn't that a tacit admission that many developers' role consists of simply gluing together libraries and frameworks built by better engineers who actually do know how fundamental algorithms, etc. work?

"simply" indicates you've never tried to stand up a fairly recent javascript stack :D

that being said, you're not wrong. Maybe 5% (maybe less) of the guys out there are smart enough to build libraries and frameworks. the next 45% of the guys out there are smart enough to see all the pieces and figure out how to fit them together.

the other 50% understand syntax and can write code, but aren't capable of any sort of bigger picture stuff.

and if my recent interviews are any indication, the split outside of silicon valley is more like 2/18/80.

Shoot, I'm not in that "let's build brilliant frameworks" crew. I don't think I am at least. Haven't really tried :D

Had a 2 hour technical interview last month where there was a lot of white boarding, but the panel of interviewers understood when I said "Well I don't know the specifics of X but between Google and stack overflow I'll manage." - got the job by the way...

Hello, my name is Tim. I'm a lead at Google with over 30 years coding experience and I need to look up how to get length of a python string.

<sarcasm> The problem with Python is the arcane, arbitrary and nonsensical syntax. It sure as hell isn't anything as obvious as length(str) or string.length like any reasonable language.

And god help you if you want to do something complicated like define a function. </sarcasm>

But seriously: I have no problems with the rest of the article, but that one just seems silly. How many times do you have to look something like that up before it sticks? For those of you not versed in the more esoteric secrets of Python, you would use the built-in function len, as in len(str).

Of course there are alternatives: http://stackoverflow.com/questions/3992192/string-length-wit...

Edit: Added <sarcasm> tags for clarity.

As someone who has worked with a lot of languages over the years I can never remember which one is `length(x)`, `len(x)`, `x.length`, `x.len`, and so on.

For me the answer is "never", apparently. Then again I tend to struggle with basic syntax no matter what language I'm using or how frequently I use it.

You forgot sizeOf()!

If it makes you feel any better, I am in the same boat.

I used Python for years, although I'm a little rusty now. I ran into that kind of problem all the time. Your first two examples are what came to mind first.

There are just as many things I hate about Python as there are things I love about it.

Each time I get to the part where I need to modify the path which the import mechanism uses to look for modules, clean up the loaded modules to make sure I'm not reimporting an already loaded module by the same name, in order to import a module which is in the SAME DIRECTORY as - get this - __file__, I need another coffee.
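The dance being described looks roughly like this (a sketch; "sibling" is a hypothetical module name, and the exact incantations vary):

```python
import os
import sys

# 1. Work out this file's own directory (fall back to cwd outside a script).
here = (os.path.dirname(os.path.abspath(__file__))
        if "__file__" in globals() else os.getcwd())

# 2. Modify the path the import mechanism uses to look for modules.
if here not in sys.path:
    sys.path.insert(0, here)

# 3. Clean up loaded modules so a stale module by the same name isn't reused.
sys.modules.pop("sibling", None)

# 4. Only now: `import sibling` -- a module in the SAME DIRECTORY as __file__.
```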

Also "len(...)" is not obvious.

Seems overly harsh. Why is the actual code, "len(s)" so much worse than your suggestion of "length(s)"?

And I can't count the times I tried to do s.len() or s.length().

Of course, s.__len__() actually works.
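(Right: len() is just sugar over the __len__ protocol, which is why the dunder spelling works. The Bag class below is a made-up illustration:)

```python
s = "hello"
print(len(s))        # 5: the idiomatic spelling
print(s.__len__())   # 5: the special method that len() delegates to

class Bag:
    """Any class defining __len__ works with the built-in len()."""
    def __init__(self, items):
        self.items = list(items)
    def __len__(self):
        return len(self.items)

print(len(Bag([1, 2, 3])))  # -> 3
```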

The particular thing I like to test in an interview is "How are you at communicating something that (should be) in your wheelhouse -- Asking questions, explaining your thought processes, discussing tradeoffs, etc". For a recent graduate of a CS program, that's probably "Solve a data-structure/algorithm based problem". For anyone who isn't a recent CS graduate, there's probably a better question to ask.

Mostly what I'm going for is:

    (1) How does the candidate communicate something they know that others might not.

    (2) How good is the candidate at what they claim to be good at, for the purposes of calibrating their resume.

People have cargo-culted an interview process from a time when developers were largely CS majors hired somewhere out of undergrad, and then stayed at one large company for a few decades, often because it is what they went through. But this process doesn't translate to mid-career or non-CS-major hiring.

Asking developers to program in front of strangers is primarily testing for ability to perform in front of strangers and developer skills secondarily (if at all.)

I never ask people to code in front of me. I ask them questions to get a feel for where their areas of expertise are, and to get to know them.

I wrote a post about my feelings on it last year: https://medium.com/@ebbv/my-case-against-coding-tests-6b15c1...

A good interview should be conducted more like pair-programming than performance art. The interviewer should be explaining things about the problem space and asking questions about the interviewee's code (especially when they're going astray).

There is experimental evidence suggesting that the ad hoc approach is the worst approach.

Why not have them write code, but not in front of people?

Give them a laptop and some time alone?

My approach is not ad hoc. I have refined my questions and technique over a decade of conducting interviews and much more being on the other end of them.

Let me be clear; it's what works best for me. It may not work best for everyone. But I feel our industry doesn't even really put much thought into interviews right now. They just find some coding puzzles online and say "Here do these." That sucks. There's always something better you could be doing IMHO.

The job I actually landed had the technical part of the job interview just like that - a laptop, a set of requirements (fixing an existing application), internet access, and some time.

I did a lot better than I do on whiteboard style interviews, so I'm definitely biased to preferring this :) .

A few points regarding CS 101/201 data structures and algorithm coding interviews.

The coding interview is a new phenomenon. In the 1990’s and early 00’s, these type of interviews were rare. Typical interviews were more well rounded with questions about interests (what do you want to do), past experience (what have you done), personality and culture (how do you like to work), as well as some technical and coding questions, often relevant to the actual job to be performed. Also some brain teasers like “why are manhole covers round” which Google has since condemned.

As someone who works largely on complex algorithms and mathematically oriented software such as video compression, speech recognition, and gesture recognition, I can say from experience that these tests frequently have nothing to do with what algorithm developers actually do. These are solved problems in CS such as how to sort data, how to implement a hash table, etc. In practice, developers rarely reinvent the wheel but rather use existing implementations in libraries or source code. The coding interview questions tend therefore to discriminate against older, more experienced developers who are many years past CS 101 or never took these classes.

About half of practicing developers are “self-taught” which typically means they come from some other STEM (Science, Technology, Engineering, or Mathematics) field where they learned to develop software by doing, not by taking CS classes. The tests tend to discriminate against them as well.

As for women and underrepresented minorities, which typically seems to mean people with visible African ancestry and Hispanics with visible American Indian ancestry, this is difficult for me to judge. But in general the CS101/201 data structures and algorithm coding interviews are not a test of the skills used in most (not all) professional software development.

What we can say is that these tests are also not blind tests of technical skill. They are analogous to the bad old days when orchestras auditioned prospective musicians by viewing the musician playing, and few women ever made it through. The introduction of blind auditions, where the candidates performed behind a screen and with other measures to hide their gender, race, and other presumably unimportant factors — only the actual music was evaluated — seems to have resulted in an increase in women passing the auditions.

> Hello, my name is David. I would fail to write bubble sort on a whiteboard. I look code up on the internet all the time. I don't do riddles.

Does the interviewer explain what bubble sort is, or do they expect the candidate to know it? The first is a reasonable interview, the second one isn't.

Bubble Sort is a common sorting algorithm. I would expect anyone w/ a CS degree to know sort of what it is.

But, for the most part; I can't imagine anyone ever having to write it on the job unless they are doing super low level stuff. All the 'current' languages provide sorting algorithms built in.

I feel like it probably functions more as a proxy of general algorithms knowledge. If you've forgotten even the most basic algorithms like bubble/insertion/selection/merge/quick sort then that indicates that you've been neglecting your algorithms knowledge since college.

It also kind of depends on the person. If you're interviewing an intern who's still in college, absolutely they should still remember bubble sort. If they don't know it off the top of their heads, they weren't paying attention in class. If you're interviewing someone who went to a boot camp then it would be expected that they wouldn't know a lot of the trivia like bubble sort that nobody really uses.

I guess I would expect just about anyone to know what a bubble sort is, but I wonder if that's bias from the classes I took. Every class on algorithms I ever took would start off with a bubble sort as an example of a sorting algorithm, then use it as exhibit A for figuring out the big O performance of an algorithm (the outcome of which was always, "...and that's why you should never use a bubble sort").
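For anyone whose exhibit A has faded, a textbook bubble sort looks like this: O(n^2) comparisons in the worst case, which is exactly why the lecture always ended with "never use it":

```python
def bubble_sort(xs):
    """Repeatedly swap adjacent out-of-order pairs. O(n^2) worst case;
    the early exit makes an already-sorted input O(n)."""
    xs = list(xs)  # don't mutate the caller's list
    for i in range(len(xs) - 1, 0, -1):
        swapped = False
        for j in range(i):
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
                swapped = True
        if not swapped:  # no swaps on a full pass: already sorted
            break
    return xs

print(bubble_sort([5, 1, 4, 2, 8]))  # -> [1, 2, 4, 5, 8]
```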

I assume that most people were taught most of the same material as I was, but in reality that may not be the case.

It's very hard to pin down what sorts of things "everyone" should be expected to know. I agree with the consensus that whiteboard coding and CS trivia is not a good way to run interviews, but I bet many/most of the people who complain the loudest about interviews have their own misconceptions about what constitutes common knowledge for programmers.

Any hiring process based on asking several current employees to take turns talking to someone for 30-60 minutes and make a hiring decision based on that is going to be deeply flawed. However, the only alternatives I've heard are so much more costly, lengthy, and elaborate that I just can't imagine many companies moving to a different process anytime soon.

This debate is also complicated because different people want very different things out of the process. Job seekers want processes that minimize false negatives (i.e., they don't want to fail the interview even though they'd make good employees), but employers want processes that minimize false positives (they don't want people who get through the hiring process that end up being bad employees that the company is now stuck with).

The current process probably doesn't do either of these things. But every more elaborate process I've heard people say they want seems to go to one extreme or the other- it either greatly favors the job seeker who hates false negatives, or greatly favors the employer who hates false positives. It's a hard problem with no good answers yet unfortunately. :(

> I guess I would expect just about anyone to know what a bubble sort is, but I wonder if that's bias from the classes I took.

It was covered in my first year CS courses... something like 9 years ago. I've not once needed to be able to re-implement it (or any sorting algorithm) in the intervening time finishing my degree and five years doing software dev as my day-job, much less re-implement it from memory. I would need to look it up to refresh myself on its implementation and characteristics.

Or tell them you want them to write an implementation of bubble sort on the whiteboard and then give them a 1 minute allowance to use the Internet. Then you can test their Google skills too!

quicksort would be fine too :-)

I had a phone interview where I was asked how I would write quicksort. I responded: I wouldn't. Interviewer chuckled, said good answer and moved on. I got the job.

Wasn't this in reaction to the story that Homeland Security had asked the question to a programmer entering the US? I don't know how this was extrapolated to a broken job interview process.

I believe it's in reaction to the malpractice of hiring programmers according to their results on a purely theoretical exam at the whiteboard, sometimes while completely disregarding their previous work, published projects, etc.

I'm sure there are other examples predating it, but the following I believe is the 1st account of a Google interview gone bad for someone who very likely didn't deserve to get kicked in the butt (Max Howell, author of Homebrew for MacOS). https://twitter.com/mxcl/status/608682016205344768

I thought DHH posted this before the guy who had border problems.

But well, I only had one strange interview where I had to program Tetris, and failed. In most interviews I just got asked if I could do the job...

Do you have a reference? This would be interesting to read about.

can you share a link?


Others have come to light since, but this was two weeks ago.

what others have come to light? I apologize if these stories have been confirmed, but at the moment I am not believing it.

that was a joke

it wasn't

where was this verified to be true?

Bubble sort was the first sort I learned, 17 years ago, in the 3rd week of my first year of CS at college. In the 4th week we learned quicksort and a bunch of other, faster sorts, aka "forget everything you just learned about it."

Foolproof way to get a job offer... Create a digital pen that captures a question you write, uses ML to find the best answer on the web, and display the result on the whiteboard with a laser.

I have found three different categories for tech-related interviews:

1. Algorithm and data structures trivia: which includes the classic CtCI [1],

2. In-house problems: where you have to analyze and propose multiple solutions to a problem that has been faced by the current development team in the company that is executing the interview,

3. Homework-like projects: which usually take 3-7 days to finish.

After an exhausting 2016 full of interviews I do not know which one is worse.

On one hand, having a common set of problems like the ones proposed in CtCI is good because you just prepare once (kind of) and can apply to multiple companies at the same time. On the other hand, in-house problems are good if you really want to work for a specific company and want to know what problems your future co-workers are facing; the catch is that you have to be well versed in a wide range of concepts to give at least one answer that pleases the lead developer. And finally, the take-home projects: I think they are the worst of the three. You are expected to work for free for at least one day on a problem that is either very basic or too complicated, but they always expect you to have a perfect solution just by yourself. How many times have you done this in your own job? Do you start working on a solution to a problem without asking questions? Without a discussion with your colleagues? Without full specifications? Without multiple attempts? Take-home projects are really not suitable for an interview.

I understand the point of these tweets but I doubt they will change anything. Much of the craziness of today's interviews comes from the recruiters, who have no idea about these problems: they don't know algorithms and/or data structures, they don't know what problems the company is facing, they only care about the money that the company will give them if the candidate is hired. Now you know why websites like HackerRank [2] are so popular; recruiters love to have a standard set of questions, and if your answer is not exactly the one on their sheet, you lose.

From now on I will just resort to colleague's referral. With this I know that at least one member of the team will vouch for me to join the company and make the interview process as painless as possible because they already have an idea of how good of an engineer I am.

[1] https://www.amazon.com/dp/0984782850

[2] https://www.hackerrank.com/

> And finally, take-home projects: I think they are the worst of the lot. You are expected to work for free for at least a day on a problem that is either very basic or too complicated

One of the solutions my team has come across for this (also mentioned in previous HN threads) is to pay the interviewee for a day's work, or give them an Amazon gift card for an equivalent amount. And we aren't looking for a perfectly crafted solution. We're just trying to get a handle on their general coding style: maintainability, coherence of variable naming and/or comments, DRY, whether they bother with edge cases, robustness in anticipating other types of errors or bugs that may pop up, etc.

So this makes me think: Has anyone here, when presented with a whiteboard problem, tried to google that problem (or algorithm) during the interview?

I'm generally not a fan of whiteboard problems because: 1) they don't test what I most care about (architectural things), and 2) why not just pull up a laptop and go at it?

but one thing I'd say in defense of them: Thinking through a problem on a whiteboard is a valuable skill I have frequently used.

So it seems the problem is less about "problems on whiteboards" and more about a) the kinds of questions asked and b) the "unrealistic" "working environment".

I actually really enjoy whiteboarding interviews if they're closer to what you'd use a whiteboard on the job for: discussing some problem, pros and cons, designing a solution, even details of an algorithm (as opposed to implementation).

I still remember a Google screening call I had a long time ago. "You have 1 petabyte of data you need to transfer halfway around the world; what kind of network protocol would you implement for it?" I, of course, answered "a truck/ship/container with tapes?" "No, it needs to go through <I think a satellite link>." "Well, let me see: larger frames, window sizes, the capacity of the link becomes the leading issue, but seriously, I'd Google for some papers and take it from there." I found it an extremely strange interview, as it went on like that.
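For scale, a rough back-of-envelope calculation shows why "a truck with tapes" is a serious answer. The 100 Mbps link speed below is purely an assumption for illustration, not anything from the actual interview:

```python
# Rough transfer-time estimate for 1 PB over a (hypothetical) 100 Mbps link.
petabyte_bits = 10**15 * 8          # 1 PB in bits (decimal definition)
link_bps = 100 * 10**6              # assumed link: 100 megabits per second
seconds = petabyte_bits / link_bps
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:.1f} years")         # roughly 2.5 years of continuous transfer
```

At that rate the shipping container wins by a couple of years.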

Never change, Hacker News.

Just to play devil's advocate, just imagine you are in the seat of a software engineering manager, probably an engineer who got his MBA. Doesn't code professionally, and hasn't for a while.

He's got to hire someone that, due to the nature of the work, he cannot effectively micromanage. Even a moderately complicated piece of software, with best practices, a robust code review process, and experienced hires, can go off the rails, and very quickly. Due to no fault of the developer, mind you; sometimes, PMs and BAs suck. We've also all seen what happens when you have to maintain the mess a rookie software dev has made.

So, you need to hire someone experienced and competent. You are looking for hard-to-fake indicators of both. Ideally, since new grads are cheaper from both a salary and a health-insurance perspective, you'd like to get a good new grad and ride him for a while until he gets paid market rates for experienced devs.

So what do you do with new grads? You ask them computer science brain teasers, to test whether they were paying attention in class. It's hard to fake, it's a quick conversation in terms of time, and it's biased against older hires. Win-win-win for the pointy-headed guys.

My 2 cents. Not saying it's right, but if you want a better system, invent something that can accurately rate software engineering experience. It's a gold mine.

I've been coding for almost 40 years. Without an Internet connection, my productivity drops by at least 60%.

"Cut and paste fragments from {SO, GitHub, Gist, etc.} and edit to taste"

Admittedly, I was without internet working from home for a few days, and I found it a very nice experience. Not being able to go to Hacker News while compiling kept me focused and in the flow. When I had to look something up I found myself digging in /usr/share/doc for the reference manuals. In a way it took longer than looking up the answer on Google, but it forced me to better understand the system I was using and build my solution from that understanding, sometimes considering third options I would have missed when Stack Overflow already contains a bunch of snippets to copy-paste.

Of course, I am idealizing the experience (eventually I really needed internet and ended up going to public libraries to use it)... but like everything, maybe we are getting too used to the instantaneous and forgetting that the alternatives are actually not so bad and sometimes empowering.

Very interesting topic. IMHO, whiteboarding would be a useful process if it were used to probe the interviewee's thought process, independent of the answer being right or wrong. I think the most important thing is to hire people with a good personality, then motivation, then the ability to reason and learn. The technical specifics would not be, and should not be, a major factor if one is hiring for the long term.

Who is hiring for the long-term? Most devs are gone in 2-3 years.

I'm conflicted about this. On one hand I agree with the idea that the best way to test whether someone can do something is to look at evidence that they can do that actual thing (i.e. write real code using best-of-breed tooling).

On the other, a whiteboard provides a very real constraint and tests something a little less tangible than whether someone can copy/paste. A whiteboard test is not just a test of "can you solve this riddle"; it's more about how you think and communicate your thoughts about software development. It's this point I'll defend below.

The example DHH gives seems like one I'd like to specifically call out. If you can't write a bubble sort there are some real problems. You're unable to take a few hints about an implementation that are obvious from the name: it's a sorting algorithm, and the name suggests the way it works has something to do with bubbling, otherwise why would it be called that? Now I can't exactly recall the algorithm, but that's enough to make some guesses about how it works. Start at the first element of the list and bubble up the highest number we see until we get to the last position. Do it again for the second-to-last position, etc. Which leads to the following:

    # Runnable version (Python): bubble the largest remaining value rightward.
    for i in range(1, len(lst)):
        for j in range(len(lst) - i):
            if lst[j] > lst[j + 1]:
                lst[j], lst[j + 1] = lst[j + 1], lst[j]
A good interviewer would understand a situation where a candidate was unfamiliar with the problem stated and be able to provide hints that would get them to this algorithm. A good candidate would not let the lack of knowledge of the algorithm deter them from attempting an answer by asking clarifying questions, and talking over how they think something might work. Software development is just as much about communication as it is programming.

I wrote a bubble sort in Grade 12, but I can't remember the details well enough to write bug-free code for it on a whiteboard. Nowadays if I want to sort, I use quicksort(), or the accepted language way of sorting (sorted(list) in Python, for instance), or I write an ORDER BY clause at the end of my SQL statement.

And so should you. Writing sort algorithm code is a dumb thing for a software engineer to do and I don't care if you have never heard of bubble sort because it is useless historical knowledge for everyone except historians.

But you definitely should be able to make some wise suggestions about how to handle sorting of some set of data in a certain specific context.

I know there is a fad for reinventing the wheel in regard to databases these days but so few engineers will actually have to do this and so few employers will get any real value from this, that I think it is best ignored. Just consider these whiteboard interviews as a way of warning you that the potential employer probably is not such a great place to work after all.
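In that spirit, here is a quick sketch of what "use the accepted language way" looks like in Python. The sample data is made up for illustration:

```python
# Idiomatic Python: let the built-in sort (Timsort) do the work.
data = [("carol", 35), ("alice", 30), ("bob", 25)]

by_age = sorted(data, key=lambda person: person[1])  # new list, sorted by age
data.sort(key=lambda person: person[0])              # in-place, sorted by name

print(by_age[0])    # ('bob', 25)
print(data[0])      # ('alice', 30)
```

The `key` parameter covers almost every ordering you need without ever writing a comparison loop yourself.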

> I wrote a bubble sort in Grade 12 but I can't remember the details well enough

I'm not sure that I've ever written a bubble sort. Like I mentioned, this is a derivation purely based on the name of the algorithm and some vague recollections from my childhood.

>to write BUG-FREE code to do it on a whiteboard

Do you think that you'd want to work for an employer that expected bug free code to be written on a whiteboard? I wouldn't either. But I would work for one where if they noticed a bug in code that I wrote, we then talked about the bug and what the impact of that would be and fixed it. This shows communication skills, as well as the humility to be able to admit situations where you're wrong and take action to fix it.

> Nowadays if I want to sort, I use quicksort(), or the accepted language way of sorting (sorted(list) in Python, for instance), or I write an ORDER BY clause at the end of my SQL statement. And so should you.

You know, a whiteboard coding task can be a good place to talk about alternatives. It's not just a place to jump straight to code solutions. If you're treating it that way you're missing out on a big opportunity to sell yourself as a potential employee. All of the above are valid topics of conversation that show your skills and experience. They're not wrong. In a recent interview I was asked to solve something and my first response was "There are only 2,000 or so cases: write a lookup table and move on with life. But, you know, that doesn't show that I can code, so here's the solution you're looking for that would fill that lookup table."

> Writing sort algorithm code is a dumb thing for a software engineer to do and I don't care if you have never heard of bubble sort because it is useless historical knowledge for everyone except historians.

Given that you want to understand how a person communicates when talking about and writing code, what would you ask instead?

>Just consider these whiteboard interviews as a way of warning you that the potential employer probably is not such a great place to work after all.

I don't think this is the case at all. The conversation you have in front of that whiteboard is significantly more telling than the presence of an erasable writing surface.

We interview people based on their passion for the craft and never use code tests or whiteboard tests. If they have a GitHub account, great, we'll look at what they have available. It's not required.

Our team is bright, engaged, and productive. This system seems to work pretty well for us. And yes, we have a pretty diverse group of engineers.

> We interview people based on their passion for the craft

What does that mean?

> never use code tests or whiteboard tests

> If they have a GitHub account

So if they have no open source project, you hire them without seeing any code from them?

"Passion for the craft" refers to a general set of attributes, most often typified by an excitement for development, a desire to tinker, curiosity on how things work, and demonstration of sustained levels of these traits over a long period of time (e.g., years or decades). It's pretty easy to tell the difference between a candidate who genuinely loves coding and a candidate who just wants a job.

And yes, we sometimes hire without even seeing any code from a given candidate. We have yet to be disappointed.

Are you hiring and are based in SF bay area?

We were hiring up until about a week ago, but we're staffed up now. We're based in Minneapolis.

I think it was Flash Boys by Michael Lewis that mentioned a Russian programmer who was very good at programming with pen and paper. The reason was that in the old days there were not many computers around, or you had to use punch cards to program, so the only option was to be able to write the code on paper and make sure everything was well thought out before you ran it on an actual computer.

That only makes sense if the interview process followed the same pattern in those days. The difference is there was not much information to remember; most probably, knowing a few data structures and algorithms was enough.

I feel the tech interview process, like so many other things, is still stuck in the past. Seems that the monks are still tying the cat to the tree without knowing why. Everyone seems unhappy about this but nothing changes.

To the folks in SV: are managers, scrum masters, product owners, etc. also whiteboarded during interviews?

(Disclaimer: from my freelance experience in the EU, the amount of "fun & satisfaction" from the actual work is inversely proportional to the amount of drilling and grilling during the interview.)

Liz's recent tweets on the subject have struck close to home and have so much insight. https://twitter.com/feministy/status/836407798476881920

"how would you distill an unwieldy set of options into a concise API?"

I'd rather implement Bubble Sort, tbh...

I think a lot of the confusion about how hiring is done or should be done stems from not clearly identifying a goal. For a company on the scale of Google a generalized process makes sense, and they implicitly have data to back this up. But for everyone else there's just too much subjectivity to have a canonical interview process in place. Not to mention the impracticality of using, say, the secretary problem. Having said that, using CtCI or some other book to source questions from can be a proxy for problem solving if the interviewer can frame the problem as a series of steps that one needs to flesh out into a solution. In that context not having someone google for information makes complete sense, and it's really easy to use as a filter.

I think we should differentiate between the interview process of Google and such, and (for lack of a better word) average companies. Average companies almost never need sophisticated algorithms and perhaps, even if someone put such sophistication in, the return would be marginal ("sub-linear"). However, this does not seem true for Google and such - the return on investment of a sophisticated algorithm looks to be super-linear. So it makes sense for Google to filter candidates by knowledge of algorithms (among other sophistications) and for most other companies to not copy Google's interview process.

GitHub commits will hopefully change this in the future. During an interview, it's probably easier to go over algorithms and patterns in code one previously wrote and is comfortable with than some random whiteboard question. Plus it shows a little appreciation and respect for the developer. Coding is just as much an art as it is a technical craft. When an artist tries to get their work into a gallery, the manager doesn't ask them which techniques they are familiar with; she asks to see the artist's body of work and have them explain it. This is how I feel interviews should be structured.

While I don't think whiteboard interviews are the best can we at least suggest other metrics? This is a problem I see with the argument. It says something sucks, and it does, but doesn't suggest an alternative.

Experience correlates with quality but doesn't necessitate it.

GitHubs aren't always available and a person's best work might not be there.

You could give someone a coding challenge and an allotted time with computer access. (Screw those interviews that don't let you google.) This could be blind too, so you don't know the ethnicity or gender of the interviewee yet.

Any other suggestions?

Honestly I'd suggest that the company in question takes a little bit of time to construct a toy project of medium size that's relevant to their application domain and asks the candidate how they would solve it.

This gets the candidate to talk about their problem-solving process in broad strokes, like, for example, they could ask a few questions of the interviewer, say things like "I'd break X feature into these 5 pieces, each of which are discrete functions, and I'd have to look up an algorithm to do Y for piece 3, etc."

As many people here have pointed out, what you're looking for isn't knowledge of any particular algorithm, but evidence that they know how to take a problem from a business objective to an operational definition of a software product. If they can do that, the implementation part can be (and often is) literally all copy-pasting Stack Overflow posts, and that's fine for the intents and purposes of the business.

Furthermore, a process like that should, if the company already has competent engineers (And if they don't, how do they think they're going to successfully assess further additions to their team with any certainty?), allow said engineers to assess the new candidate's ability to get up to speed on their problem domain and meaningfully contribute, without relying on indirect proxies of knowledge.

A lot of these bad practices come not from a broken process but from the programmers themselves. There are many people who have an ego to protect or some concept of fairness to enforce.

Some examples:

(1) the successful startup's lead engineer, who isn't that good but is still the lead because of early faith in the company, and who is unable to put his ego aside.

(2) the Chinese/Indian H-1B who studied algorithms for hundreds of hours to get a position and believes it is fair to expect the same from others.

Easy interviews:

Main programmer left/is-leaving so a body is needed to maintain technical debt.

Startup founder needs anyone at all to make a PoC before funding runs out.

My theory is that many immigrants, mainly Chinese and Indian, study very hard to get into these companies, so they expect the same level of effort from everyone else. It's not even expectation; it may be that someone who has obviously studied a lot simply resonates better with them.

There is also a subculture of nepotism or low standards that is letting in people who don't care or can't even code at all, and who may even cheat by claiming they did some large complex project.

So this problem and solution combination creates this other problem.

“Never memorize something that you can look up.” -- Albert Einstein

I disagree with this quote. There is value in memorization. Times tables are an insanely useful thing to memorize. Memorizing historical events and literature can make you into a great lateral thinker.

I think you're taking the quote the wrong way. I believe there is an implicit "work to" that isn't being stated.

'Never (work to) memorize something you can look up.'

This has the meaning of, let your (short term memory) figure out based on how often you look something up if it should be remembered or not.

I prefer this version: "Never pass arrays by value, always by reference."

Hope you don't own a dictionary.

Calling it hazing is pretty fair.

The only time it made sense is when I was coming in to interview for a lead/manager role. Even then we were (the two interviewers and me) were just walking through architecture level stuff.

My hiring philosophy was to bring in contract to perm w/ 60 day contracts. If the programmer could not provide value we just let the contract expire. Otherwise, there was no way to screen out bullshitters.

Has anyone ever noticed that a whiteboard is the 1960s way of writing code? I started in the 80s, and we used a scrap of paper sometimes and then a video terminal to type the code into an editor. Does anybody do coding interviews using an IDE hooked up to a fully functional VirtualBox server complete with all the usual tools?

We sometimes use online code editors

Reminds me of http://rejected.us/

literally copy and paste from it

I cannot load this page on mobile. When trying to scroll it moves back up. Firefox doesn't offer me to set up the "view mode" for it and Save as PDF is also broken. It's funny how, from memory, all the pages that were broken because of some fancy effect were about programmers.

But I bet their devs could whiteboard the heck out of a Fibonacci function!

Hey franciscop, I'm a dev at the Outline. Can you tell me which browser version / device you are on? Thanks for the feedback

Sure, Firefox 49 for Android in a OnePlus One with OxygenOS (Android 5.0.2)

Most of the tweets are about implementing specific famous algorithms (bubble sort), or about language/library trivia (reading from input streams in Java), or esoteric computer science theorems (NP-completeness). All of these are the exact opposite of whiteboard coding interviews.

The whole point of whiteboard coding is that candidates are given a small problem/spec and solve it using any language/pseudocode/algorithm they want. The fact that you can ace a whiteboard coding interview without knowing anything about input streams, or bubble sort, or NP-completeness, is the entire rationale behind this interview style. The fact that whiteboard coding interviews are so precisely objective and formulaic makes them much less prone to bias, as compared to bullshit interviews where you're asked to reminisce about past projects and your greatest weaknesses. Almost every tweet shown in the article only succeeds in validating the whiteboard coding interview.

Weird that this post got 90 points in about 20 minutes but has suddenly dropped more than 10 places from the top of the front page.

I was interested in seeing the discussion here but it looks like the article is getting penalized for some reason.

figuring out why that is happening would be a good interview question!

yes the process is broken. why? because optimizing the process itself is a Hard problem

the naive solution would say, hey, this whiteboard stuff is a synthetic benchmark. just use a real one! (ex: solve a coding problem with access to the internet)

but for large companies (and this applies to school applications as well), it is not merely a process of evaluation, but of filtering.

there are more qualified candidates than there are positions. and thus whatever process is used to reduce this number will always be somewhat arbitrary

perhaps it would seem less "unfair" if it were random, but is it really?

Whiteboard coding questions are useful as a baseline test of preparedness.

You don't need to be a genius to solve them. You just need to spend some time practicing them and reading Cracking the Coding Interview (or you need to be a recent college graduate who did well in their classes).

I agree that the super difficult Google style questions are silly, but if an engineer can't solve an easy-mid level difficulty whiteboard question (For example, I usually ask something like, "traverse/print a tree"), well... my response to them is why didn't you prepare for it?

You really should have known that you were going to be asked to do something like this, and you didn't even bother to prepare for the easy stuff.
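For reference, the "easy-mid" question mentioned above ("traverse/print a tree") amounts to a few lines. This is a generic sketch with a made-up `Node` class, not any particular interviewer's expected answer:

```python
# Minimal binary tree node plus an in-order traversal.
class Node:
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def in_order(node):
    """Yield values: left subtree first, then this node, then right subtree."""
    if node is None:
        return
    yield from in_order(node.left)
    yield node.value
    yield from in_order(node.right)

tree = Node(2, Node(1), Node(3))
print(list(in_order(tree)))  # [1, 2, 3]
```

Swapping the order of the three steps in `in_order` gives you pre-order or post-order traversal instead.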

Sarcasm much?

Not sarcasm. If you don't know what fizz buzz is, it means you didn't even bother to spend 5 minutes googling "programming interview questions".

Do you really want to hire someone who didn't bother to put in even the tiniest, most basic level of effort into preparing for your interview?

It's like going to an onsite interview and asking, "So what does this company even do, anyway? I didn't check before coming here today."
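For anyone who genuinely hasn't seen it: fizz buzz is the canonical five-minute screening question, and a minimal version looks something like this (one of many equally valid phrasings):

```python
def fizzbuzz(n):
    """Return the fizz buzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15)[-1])  # FizzBuzz
```

The point of the question isn't the code; it's checking that a candidate can produce any working loop with branching at all.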

Why should I have to prepare for some arbitrary question, though? You could ask any number of basic "whiteboard" questions, and I have to prepare for them all? So you are going to deny what could be the best candidate in favor of someone who just did a bunch of studying before the interview? How does this normally turn out for you? Do they tend to last and be good employees?

But to answer your question more seriously, I would not hire someone who hasn't prepared at all.

Why? Because they don't care. Why would I want to hire someone who cares so little about the position that they aren't willing to put in the most basic, easiest level of effort?

If they don't care about preparing, then they probably don't care about other things, like the quality of their work.

Why should I have to spend 5 minutes looking up what a company does before coming into an onsite interview?

Would you deny what could be the best possible candidate because they didn't do this?

They're Google/Facebook/etc. interview rules. You can accept them or go work at a startup. That's all. I don't understand why they're posting about a broken process.

If only interviewers could ask questions that they didn't already know the answer to, they might be surprised to find out that some candidates actually know more than they do.

They could, and I often did ask questions like that, it's a good way to remove any bias.

It's a clever interview if you expect the answer to be "I would find an appropriate module on CPAN".

People shouldn't be rewriting these algorithms.

This. The last thing I want to work with is the guy who re-implements well-understood problems with well-implemented solutions.

So many security and performance problems I've run into boil down to people not bothering to use a good standard library or framework for solving a problem.

Or, at a more fundamental level, people who can't be fucked to learn SQL properly, so they haul heaps of data into the application and try to beat Oracle at implementing joins and GROUP BY [1].

[1] If you can beat Oracle at that, you are wasting your time working with me. Larry, IBM, or Microsoft will give you swimming pools of money to come make their databases better.

The correct answer would of course be that I would not use an O(n²) algorithm.

It was funny how this was happening on Twitter around the same time when Celestine Omin was asked to balance a Binary Tree at JFK airport.

Somehow I still don't buy that dhh would fail to implement Bubble Sort.

30 years in the business, and I just recently failed a coding screen call because I blanked on a find-string-pairs question. I just couldn't think for a minute. During a normal day: look it up, slap head, move on.

There are edge cases that are easy to forget about even if you have the general idea. This article is about binary search, not bubble sort, but it has a lot of the same problems. https://reprog.wordpress.com/2010/04/21/binary-search-redux-...
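The linked post's point can be sketched quickly: even in "trivial" binary search, the loop bounds and the midpoint calculation are exactly where bugs hide. In fixed-width languages like C or Java, `(lo + hi) / 2` can overflow; `lo + (hi - lo) // 2` avoids it (Python integers don't overflow, but the habit carries over):

```python
def binary_search(items, target):
    """Return an index of target in the sorted list items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:                     # inclusive bounds: <= not <
        mid = lo + (hi - lo) // 2       # overflow-safe midpoint form
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1                # +1/-1: forgetting these loops forever
        else:
            hi = mid - 1
    return -1

print(binary_search([1, 3, 5, 7], 7))   # 3
print(binary_search([1, 3, 5, 7], 4))   # -1
```

Each commented line is a spot where a from-memory whiteboard version commonly goes wrong.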

This. Maybe he would need to be reminded which sort it is, but he could do it, unless the whiteboard interviewer expected perfect syntax in a particular language.

Which, I agree, is almost always silly.

If they don't do whiteboards, what does Foursquare do now?

There's a link to their blog post where they describe it in that exact sentence.

> “Whiteboard” interviews are widely hated. They also discriminate against people who are already underrepresented in the field.

I don't particularly like whiteboard interviews. But how can they be discriminatory if the expectations are laid out and are such that anyone can reasonably practice to meet them?

I understand you can create a test that is deliberately discriminatory (e.g. voting tests).

You can create a test and realize later that people from certain backgrounds do worse (IQ tests apparently?)

But with whiteboarding you are testing knowledge people should have (let's set aside the idiocy of some of these problems). It's foundational stuff for the industry. So the problem might be with the nature of the test, but anyone can go grab a set of whiteboard markers and start working problems. In fact, this is part of why I so dislike these interviews: they are set up to be easy to game. So how can that really be discriminatory?

underrepresented candidates have been acing our whiteboard interviews, and getting hired.

The whole meme is so stupid I want to cry.

"Asking field-relevant questions with objective answers is discriminatory, instead just have a chat. "

Mother of god.

Yes I understand the limitations and downsides of whiteboard questions, and in fact in future I'm thinking I won't do them on white boards anymore. But I'll still be asking people who claim to be programmers to do some basic programming before I hire them.

The alternative is not no technical interviews, but it is important to recognize that different interview styles have different biases. Like a take home problem is biased against programmers with young kids. Whiteboarding is often a skill in itself, and as a result, you end up selecting for people that have the resources to study and practice whiteboarding.

I learned how to whiteboard through peers and career programs in college. There's a lot of people that don't have that guidance, or all those people to practice with. And there's a lot of people that will believe that flailing in their first whiteboard interview means they're unqualified to be a programmer, not realizing that it's a skill in itself.

Whiteboard questions also tend to be the nonsensical algorithms that you memorize for interviews and never use in practice, which again is influenced by one's peer group - you learn, study, and practice together. This also is why race and gender matter, even when you control for income and education.

The point isn't that these are insurmountable for someone from untraditional circumstances, but it is harder. And that means you let in weaker candidates with more advantaged upbringings in favor of stronger ones from less traditional ones. This is why Google offers a seminar to all its interviewees to give them interview practice, so that people can start on a more level playing field.

Obviously, a lot of the alternatives suffer from the same issues. But like all good programmers, I believe 1. this analysis and process is more important than the specific implementations, and 2. by understanding the issues and looking at the data, we can iterate to better solutions.

I think the point is if someone is under-represented and they merely do "OK" on the whiteboard, other latent biases might downgrade the candidate in the final analysis. I do think it's relevant to ask people to code in some form, but this article makes me rethink the things I'm going to look for in those interview questions. Also, perhaps more useful might be to give them a small problem before they come in to the interview (1-2h work at most) and then go over it with them as part of the interview. That way you evaluate better how they work on a team, explain their thought process and actually do the job they'll be hired to do.

>I think the point is if someone is under-represented and they merely do "OK" on the whiteboard, other latent biases might downgrade the candidate in the final analysis

But if you don't do a whiteboard interview, everyone is by default only OK. Biases rule the day. If you do a whiteboard interview, then at least they get a chance to do great and defy your biases.

Asking them to do real programming is good. Asking them to remember things from an irrelevant layer of abstraction and write code in a way that is very unlike the actual job is the problem.

What I don't get is why interviews aren't 4x 30-minute sessions, an hour of research time, lunch, an hour with a manager, another hour of research time, and then 4x 30-minute sessions answering questions and presenting solutions.

It still fits in an 8-hour day, but it has a lot more in common with how I'd actually interact with the team and do the work than the contrived 1-hour sessions.

At least one large company I interviewed at does structure their interviews kind of that way (minus the hour of research time, but they do have some take-home stuff to compensate), and it is much more pleasant having smaller chunks of sessions.

Sour grapes.

If the engineering interview is broken as TFA implies, then surely the companies employing these practices will lose out to rival companies. They would be forced to change their hiring culture to be able to identify good programmers. The fact that they have not testifies to the strengths of the current interviewing culture: it identifies strong programmers from a very large pool of applicants using limited resources. Sure, there are a lot of false negatives, but that's a much lesser concern than false positives. The alternatives proposed (like doing a project together or looking at past work on GitHub) are prohibitively expensive at scale.

Since this issue comes up so often in HN a reader might think that there's a strong consensus in the tech community about the demerits of the current system. However, that would be a false conclusion. The losers under the current system complain a lot but people who have positive experiences don't feel the need to share their side of the story.
