
This is absolutely completely true:

"Companies do code testing because they have encountered so many candidates who look good on paper, and may even be able to talk about computers convincingly, but can’t actually write a program when asked. Any program. At all."

I interview programmers and I've encountered this a few times, and yet, I still don't do code tests. I talk to the candidate and you can easily ask the right questions to tell who can code and who can't. You can even get a sense of experience level (junior, intermediate, senior), which you won't get from a code test.

We get a lot of English-as-a-second-language and shy developers (who can code), and I can still figure it out by asking pretty general questions about development. The idea that you need a code test to figure out who can code and who can't is false. I'm not arguing against code tests; I'm simply arguing that they're not strictly necessary.



What has worked well for me (at least for non-junior positions) is to just ask about their prior work. "You were on a team that developed x - great. Assume I'm an expert in that and am happy to geek out about the tiny details. What was a part that you specifically were responsible for? What was most fun/interesting about it? What was one of the challenges and how did you overcome it? What was something new you learned? If you started over today, what was a part you'd do differently, and why and how?" Really just pick something they seem excited about and dive really deep into it. "Why didn't you use y for that? How did you test it?" The key is really simple: find something they're passionate about and get them to talk about it.

The annoying thing about refining interview techniques is that you only get a small view of their effectiveness: you can't tell how many times a decision not to hire was the right call versus a pass on someone great, so I can't judge from that perspective. That said, I haven't regretted hiring anyone who did well on this style of interviewing, without another coding interview.


> I talk to the candidate and you can easily ask the right questions to tell who can code and who can't. You can even get a sense of experience level (junior, intermediate, senior), which you won't get from a code test.

I also think I can do this, but I believe most people (myself included) introduce a lot of bias when doing this. I've spoken to developers who in 15 minutes I might have written off as not very qualified, but after hours/days have realised are great at what they do, and very experienced in types of software engineering that I have had less exposure to.

I think we have to be careful when judging candidates with what amounts to "gut feeling" like this, as it's typically biased, and not aligned within the hiring team.


The thing is: the method described in the post has the same problem. It just doesn't look like it. Most of these interview templates are designed by one or two people and then blindly applied. Sometimes there are a few modifications, but for the most part it's fixed.

Since every interviewee gets the same set of questions, people start to think this is more scientific than "gut feeling". It is not; instead of your own biases you have the biases of the template designer.

There are probably a few companies that extensively study whether their interview assessments of the people they hired (the only subset they have access to, which is already a problem) actually match those people's job performance, and then try to readjust the process, but it doesn't seem to happen very often.

This also gets us back to the old problem that we have no idea what a good programmer is or how to evaluate job performance.


I find that asking "What is the worst bug you've ever encountered" is a great sounding board.


I think that question can be great, but it also puts a lot of pressure on the candidate to execute a rather complicated query on the spot. I've definitely encountered lots of interesting bugs in my time, but my initial reaction to this question is a total mind blank. Especially since "worst" could mean a lot of things, and what I really want is "worst bug I've encountered that I can say something interesting and impressive about".

I'd say it's not a bad idea to forward questions like that to the candidate in advance of the interview. That way, you avoid false negatives from people who just couldn't think of a good answer under interview pressure.


Would “what is one of the most interesting bugs you’ve ever encountered?” allay your concerns? I think that’s the main intent of the question above.


(I wrote the article)

I never ask this sort of question because:

1. I'd struggle to answer it myself despite nearly three decades of fixing bugs. I could probably come up with some interesting ones after a few minutes of thinking, but most of the bugs that'd immediately pop to mind would just be recent bugs that were hard to explain without knowledge of the codebase.

2. There's no way to guide an interviewer on how to mark this question beyond gut feel. What exactly does a good answer to this look like? More importantly: what does a bad answer look like? Asking questions which are impossible to actually fail at, or for which the outcome is basically random and unjustifiable, is a very common failure mode for interviewers.

3. It tells you nothing about the candidate because they can easily just repeat a bug they read about on a blog last week. You can't check they actually debugged it themselves.

For instance, I just searched for "most interesting software bug" and clicked the first relevant link; the second story on this Quora post could easily be repeated verbatim to an interviewer over the phone:

https://www.quora.com/What-is-the-most-interesting-software-...

If the interviewer wasn't alert they might be fooled by it. Candidates will quietly Google for answers during talking questions whilst trying to avoid the interviewer noticing, among many other things.

That's why I emphasise that designing good interview questions is quite hard.


(I wrote the top comment)

I agree that's not a great question. I usually have a few simple questions that can tell you whether a candidate really understands the platform they say they understand -- it doesn't take much.

I usually have one or two projects set aside specifically for new hires; the type of project that is maybe a month-long or longer but that doesn't really require any domain knowledge or big integrations. It allows them to have something to do when they're learning everything else. In the interview, I can ask them how they'd proceed with it and I always offer that I'm here to help them and will answer any questions they have. This is literally no different than what I would ask them on day one of the job if they got hired.

Is that more informative than whether or not they can reverse a linked list? I think so.
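
(For readers who haven't seen it, here is a minimal sketch of the linked-list reversal exercise mentioned above, in plain Python with illustrative names; it's just the kind of canned puzzle being compared against, not anything from a real codebase.)

    # Reverse a singly linked list iteratively and return the new head.
    class Node:
        def __init__(self, value, next=None):
            self.value = value
            self.next = next

    def reverse(head):
        prev = None
        while head is not None:
            head.next, prev, head = prev, head, head.next
        return prev

    # Usage: build 1 -> 2 -> 3, reverse it, and check the order.
    rev = reverse(Node(1, Node(2, Node(3))))
    values = []
    while rev:
        values.append(rev.value)
        rev = rev.next
    assert values == [3, 2, 1]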


> Would “what is one of the most interesting bugs you’ve ever encountered?” allay your concerns? I think that’s the main intent of the question above.

I prefer the even more casual, "What's an interesting bug you've been bitten by?"


I think I like this variation the best, because it opens things up to bugs in tools that one has used, and other kinds of bugs that aren’t necessarily ones the candidate would have had to attempt to fix.


I think that's a better way of phrasing the question. However, I think my real issue is just that it's difficult to rummage through your mental database of past bugs in the space of a few seconds. (Realistically, it's awkward to pause for more than a few seconds before answering a question in a face-to-face interview.)

What would be the downside to giving the candidate a heads-up that you're going to ask this question?


Maybe, but this is something I'm very bad at. I forget about problems once they're solved.


> I interview programmers and I've encountered this a few times, and yet, I still don't do code tests. I talk to the candidate and you can easily ask the right questions to tell who can code and who can't.

How do you know? Is this repeatable -- e.g. can you write down the questions and the criteria such that someone who is not you can follow them and come to the conclusions you would?

Do you ask all candidates the same questions, or do you go with your gut and hope there's no systemic bias?


> I interview programmers and I've encountered this a few times, and yet, I still don't do code tests. I talk to the candidate and you can easily ask the right questions to tell who can code and who can't. You can even get a sense of experience level (junior, intermediate, senior), which you won't get from a code test.

How do you know? Do you check your interview assessments against actual performance among those hired (measured how)? What about the ones you turned down, what makes you so confident that they actually couldn't program as opposed to just answering differently from how you expected?


> What about the ones you turned down, what makes you so confident that they actually couldn't program as opposed to just answering differently from how you expected?

I'm pretty forgiving -- heck it usually takes at least half the interview before the candidate's nerves settle down.

Being a developer involves being able to explain yourself clearly. You can't just sit in the dark and code, talking to no one. If I came away from an interview with the impression that you can't code then I don't want you. The candidate has to do some work too -- they are there to convince me to hire them, not just pass a test and get a grade.


It's much easier to "explain yourself clearly" to someone from the same cultural/social/... background. Doesn't that approach create a huge bias towards hiring people like you?


Literally every single candidate we get is from a different cultural/social background. I'm not sure why giving them a test first and then interviewing them would make any difference.


Obviously if you ignore the test results then the test doesn't make any difference. But if you assign some weight to whether their solution to a test problem worked (which is culture-independent, or at least less culture-dependent) rather than how well they explained themselves to a person from a particular culture (which is easier for people from that same culture), then you end up with a process that is much fairer across cultures. And similarly for non-culture aspects of a person's background.


If you can't explain yourself then I don't want to hire you. Developers are not sitting in a dark corner not interacting with anyone; they are working with the team, they are gathering requirements from end users, etc.

If you can pass a coding test but you can't express yourself in a way that anyone can understand, then that's a no-hire. If that's a cultural bias, so be it. I don't see how it can be any other way.

Now it's been my personal experience that culture doesn't matter too much. For more than half the team, English is a second language. HR will reject candidates for language reasons even before I get to them.


I agree 100%, and I would argue that unless you are pairing together, just to see how you work together and to get some back-and-forth problem solving, no other code test provides a better signal than a good conversation.

Not set questions, even, just a good conversation. What issue have they run into with Framework X, how do they refactor, what's the weirdest bug they've ever had to track down, etc etc etc.

SV seems to enjoy grueling 7-10 round interviews these days. I had one a couple months ago that was over 40 hours (of my time) and over 20 hours (of their time, that I was aware of). I just can't believe that much time provides 20-40x the value of a simple relaxed conversation.


It's interesting. I've encountered a very different animal, even without having the candidate do a "homework test" or a "coding test" during the interview.

I have found that this different animal is competent and can certainly code, but is the very unproductive, slow type of developer.

And so my biggest pain point is rarely around a developer's competence during the interview process (i.e. does this person know the material, and are they able to do the job?) but rather around their overall productivity, their "capacity" to build software using their competence.

There are plenty of people in the world who know the material but are extremely "slow" in actually getting things done and problems solved in a day job. Unfortunately, the in-interview coding tests don't do a very good job of measuring the person's productive capacity to get problems solved, only whether or not they can solve some coding problem in a 30-minute timespan.


I could do that for my specific niche (C, embedded systems [for the systems I know], a bit of web stuff), but not in general for, say, a data analyst or ML role.

I imagine I would ask for knowledge that only practical experience could provide (interest in a topic is not the greatest teacher, but a decent one).

Better than fizz-buzz I think...

But you can never be sure either way. I would think that actually giving people a chance while defining quantifiable expectations would help here. Costly and time intensive perhaps, but I don't believe there to be a reliable shortcut. Especially not if the labor market is in need of more engineers.


How do you determine the false-positive rate of this: people who actually can code quite well but fail the interview?


What are the questions about development?


It’s certainly possible but it’s obviously much harder to do. Let’s say I own a professional basketball team and need to make a hiring decision on a player who walks in the door. Am I going to be able to, with just an interview, determine whether they are a currently good player, or a formerly great player who has only been coaching in recent years and has forgotten how to play? Rather than trying to determine this just by asking questions, why not take the much easier approach of putting them on the court for 2 minutes to see how they play?


This is unrealistic because coding in an interview is not similar to coding in reality, at least for me. When I develop software I get to silently think and reason about problems, try things, and read things on Google. In an interview I am with a person I don't know who is staring at my hands, and I am expected to talk and impress them, all while a timer is ticking.


Strongly agree. I know there is some literature on that, but I just wanted to add that for me the process of solving problems is very asynchronous, and the a-ha usually comes at the most unexpected times.


“Let’s say I own a professional basketball team and need to make a hiring decision on a player who walks in the door. Am I going to be able to, with just an interview, determine whether they are a currently good player, or a formerly great player who has only been coaching in recent years and has forgotten how to play?”

No, that would be phenomenally stupid. You would do what teams actually do, and look at their recent playing history.

Note that this is also entirely possible for engineers.


> Note that this is also entirely possible for engineers.

How exactly? Most people cannot take their code with them and show off. It belongs to previous employers, and please do not begin with the "show me your GitHub" - it's unrealistic to expect people to devote their life 100% to coding. Just as I wouldn't expect a carpenter to build houses for fun in his spare time, I do not expect programmers to spend their spare time in front of computers. In fact I believe that socially it's a benefit if they have other hobbies.


> Most people cannot take their code with them and show off.

And if someone turned up with code from their previous employer, that'd surely be a huge red flag anyway.


This is the problem with programmers: everyone gets obsessive about finding perfect unicorn universal solutions, and gives up on obvious stuff that works 95% of the time. Get out of your own way!

First, most programmers have at least some code they can show you. Stop fretting about the 5% who don’t. If you give up here simply because you have a theory about programmers “devoting their life to coding”, you lose.

Second, you call their references. Yes, really. I know it’s not “objective”. Do it anyway. You will learn a lot. Really.

Third, look around for other people they’ve worked with, who weren’t listed as references. Call some of them. See if you get different opinions.

If your candidate truly, honestly has no code available, no enthusiastic references, and you can’t find anyone who worked with them who will vouch for their skill, then do you really want to hire them?

Finally, yes: do a trivial fizzbuzz coding test to filter out the total fakes.
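
(For concreteness, a trivial FizzBuzz-level filter of the sort meant here might look like the following Python sketch; the exact task and language don't matter.)

    # Classic FizzBuzz: count 1..100, saying "Fizz" for multiples of 3,
    # "Buzz" for multiples of 5, and "FizzBuzz" for multiples of both.
    for i in range(1, 101):
        if i % 15 == 0:
            print("FizzBuzz")
        elif i % 3 == 0:
            print("Fizz")
        elif i % 5 == 0:
            print("Buzz")
        else:
            print(i)

The point is not the puzzle itself; it's that someone who genuinely cannot program will not produce even this.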

Listening to coders complain, you’d think that nobody else in the world hired anyone, ever. Let go of your need to make hiring a failure-proof system, and you’ll discover a bigger world.


The majority of my successful senior programmers do not have any private code to show off. Reasoning and the ability to think abstractly are important and impossible to judge from code they have written. I need programmers who can do those things in a reasonable time frame and in a group.

Most junior programmers have code to show, and most of the time I don't use it for anything. I'd rather do some FizzBuzz level whiteboard that we spend some time discussing and extend on. It gives a far better picture of their future as a developer. I know this excludes some, but that's the trade off I'm making.

I call references every time, but calling around willy nilly is not a good idea. Apart from the fact that it's not legal, most of the time I get "Yes I've worked with X as a developer for Y years" and nothing more.


”The majority of my successful senior programmers do not have any private code to show off.”

You’re exaggerating or you are an extreme exception.

Given the number of qualifiers (“successful senior programmers”, “private code”) I strongly suspect you’re just trying to find a reason to justify your opinion.

What I see in your comment is someone who has decided the answer, and refuses to consider alternatives.


More than half have no code "they own" or OSS contributions they would show off in an interview (the last part is important). That's neither an exaggeration nor an attempt to find a justification.

They are fathers, amateur musicians, or like to restore old cars. Just to mention a few things. Could they spend 5-10 hours weekly writing code and hacking things together? Yes, but that would be a significant chunk of their spare time, so understandably they do not.


> If your candidate truly, honestly has no code available, no enthusiastic references, and you can’t find anyone who worked with them who will vouch for their skill, then do you really want to hire them?

Graduates?

> no enthusiastic references, and you can’t find anyone who worked with them who will vouch for their skill

This will make your diversity even worse unless you're extremely careful with it.


My advice works better for experienced candidates, but new grads definitely have code to share.

I do think that coding tests have more of a role for new grads, though.


Usually the only people who see engineers playing are their own team members, while basketball players play in public, in front of crowds, recorded simultaneously from many angles by video cameras.


Most junior-level developers these days have open source code to look at (as schools really push that). For higher-level developers you can ask more complicated and telling questions.


That's good and valuable, and it's laudable in its own right in a way that basketball isn't (since open source is a contribution to the intellectual commons) but it's not the same thing as working with someone. A lot of people only publish the projects where they were happy with the results, and often it's hard to tell how long someone took or what kind of help they had when they were writing something.


Unlike basketball, coding is not reproducible and has no clear rules, nor even a fixed set of tools. Thus your analogy is flawed from the outset.

Additionally, a person who can adapt but has weaker test-specific skills might actually be better at everything. As in your coach example, a slightly rusty top player from years past might be better than a slightly better new guy with no track record. You're taking a gamble either way.

The lowest-rung coding test is only good for filtering out people who cannot code at all, and only if it's simple enough. They tend to be stupidly worded instead, though.

Trying for anything tougher tests for specific tricks, tooling and skills. Giving a longer task is comprehensive but rules out people who are actually busy, presumably working, and thus already tested in a way.


> why not take the much easier approach of putting them on the court for 2 minutes to see how they play?

If every basketball team did that, every player would practice for that 2 minutes of play and you'd have no idea if they would handle an entire game.

The effect is the same for developers: developers who are interviewing practice for coding tests, memorizing common interview patterns, quick solutions, and junior-level algorithm stuff from their university days.

If you were interviewing a player, you'd probably look to their past experience and teams they played on, their positions on the court, and you'd check with their teammates.



