
In Defence of the Technical Interview - telotortium
https://blog.plan99.net/in-defence-of-the-technical-interview-966f54a58927
======
on_and_off
>How could someone with 10 years experience on their CV be unable to start a
new project in their own editor

This is something I need to study each time I want to change jobs. I suspect
that I will have to create a new project somewhere in the interview process
but it is just something I very rarely do in my day to day job, if ever. I
suspect that it is the same for many engineers.

FWIW, at my current company, the interview projects come preconfigured, so you
only have to import them and start coding. We have also already imported some
popular libs.

Watching somebody understandably struggle to do that is:

- a pain for the interviewee

- awkward and uninteresting for the interviewer (I don't expect many people
to have to create new projects often, so it is totally OK to have forgotten how)

- a waste of time for both of us.

We also actually don't do algorithms.

We have a debugging interview, with a shitty codebase where you have to find
the issues and correct them. When you are done with the bugs, you can collect
your thoughts on what's wrong with that codebase so we can discuss how to do
it better.

Another interview is to have to design a product similar to ours (a simplified
version of course) and you have to come up with API and app technical designs.

~~~
PopeDotNinja
> Watching somebody rightfully struggle to do that is :

Being watched usually makes me a horrible coder. I get so distracted from
having a backseat driver that it can even be hard to type at times. The only
thing I can whip up on the spot in a readable manner is something I already
know pretty well, and if the interviewer can't keep up, it becomes a big
conversation about "what are you doing again?"

~~~
on_and_off
That's something I try to alleviate.

It can be pretty hard to make a stranger feel at ease in a stressful
situation, so I can't pretend I have a 100% success rate, especially for the
one screening interview we do, which is a one-hour pair-programming exercise.
However we run it, I am going to be there looking at what you are doing.

For the onsite though, most of the time, I explain the exercise and then I
let people work while I do something else in the same room.

I stay there so they can ask questions if they are stuck but I am not hovering
over their shoulders so they can concentrate.

~~~
PopeDotNinja
> the one screening interview we do which is a one hour pair programming
> exercise.

I'm not too excited about the pairing part. Most pair interviews I've had
feel like the interviewer is trying to shove me into a box labelled "they
arrived at the solution I like in the manner I was expecting". Most pairing
feels like me going out of my way to connect with a robot interviewer w/ no
imagination.

> For the onsite though, most of the time, I explain the exercise and then I
> leave people work while I do something else in the same room.

I love this. I'd love to see more interviewers do this.

~~~
on_and_off
Pairing is hard. It is the part of our process that I like the least.

We try to make it easier and fairer with a rule like "if the candidate
reaches this step = automatic pass" (and the step in question is not that far
into the interview).

But there are still many person-to-person differences, and I often face
dilemmas.

It is very hard to differentiate between giving a pointer and handholding the
candidate.

So sometimes I ask myself if I gave too much or too little help: there are
cases where the candidate would have gone far enough with just a little more
of a push, and others where I feel I helped them too much. On a 45-minute
exercise, saving somebody from being stuck for 10 minutes can make a
difference.

Another part that makes me uneasy is wondering whether accents or cultural
differences can make me act differently. If micro-expressions or attitudes
differ from what I would expect, can that influence me enough to go from pass
to no-pass more easily for some ethnic/cultural groups?

All in all it is a perilous exercise.

------
wvenable
This is absolutely completely true:

"Companies do code testing because they have encountered so many candidates
who look good on paper, and may even be able to talk about computers
convincingly, but can’t actually write a program when asked. Any program. At
all."

I interview programmers and I've encountered this a few times, and yet I
still don't do code tests. I talk to the candidate, and you can easily ask
the right questions to tell who can code and who can't. You can even get a
gauge of experience level (junior, intermediate, senior), which you won't get
from a code test.

We get a lot of English-as-a-second-language and shy developers (who _can_
code), and I can still figure it out by asking pretty general questions about
development. The idea that you _need_ a code test to figure out who can code
and who can't is false. I'm not arguing against it, I'm simply arguing that
it's not strictly _necessary_.

~~~
anbop
It’s certainly _possible_ but it’s obviously much harder to do. Let’s say I
own a professional basketball team and need to make a hiring decision on a
player who walks in the door. Am I going to be able to, with just an
interview, determine whether they are a currently good player, or a formerly
great player who has only been coaching in recent years and has forgotten how
to play? Rather than trying to determine this just by asking questions, why
not take the much easier approach of putting them on the court for 2 minutes
to see how they play?

~~~
irq11
_“Let’s say I own a professional basketball team and need to make a hiring
decision on a player who walks in the door. Am I going to be able to, with
just an interview, determine whether they are a currently good player, or a
formerly great player who has only been coaching in recent years and has
forgotten how to play?”_

No, that would be phenomenally stupid. You would do what teams _actually do_ ,
and look at their recent playing history.

Note that this is also entirely possible for engineers.

~~~
hvidgaard
> Note that this is also entirely possible for engineers.

How exactly? Most people cannot take their code with them to show off. It
belongs to previous employers, and please do not start with "show me your
GitHub": it's unrealistic to expect people to devote their lives 100% to
coding. Just as I wouldn't expect a carpenter to build houses for fun in his
spare time, I do not expect programmers to spend their spare time in front of
computers. In fact, I believe it's a social benefit if they have other
hobbies.

~~~
irq11
This is the problem with programmers: everyone gets obsessive about finding
perfect unicorn universal solutions, and gives up on obvious stuff that works
95% of the time. Get out of your own way!

First, _most_ programmers have at least _some_ code they can show you. Stop
fretting about the 5% who don’t. If you give up here simply because you have a
theory about programmers “devoting their life to coding”, you lose.

Second, _you call their references_. Yes, really. I know it’s not “objective”.
Do it anyway. You will learn a lot. Really.

Third, look around for other people they’ve worked with, who weren’t listed as
references. Call some of them. See if you get different opinions.

If your candidate truly, honestly has no code available, no enthusiastic
references, and you can’t find anyone who worked with them who will vouch for
their skill, then do you really want to hire them?

Finally, yes: do a trivial fizzbuzz coding test to filter out the total fakes.
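(In case "fizzbuzz" needs spelling out: for 1 through n, print the number,
substituting "Fizz" for multiples of 3, "Buzz" for multiples of 5, and
"FizzBuzz" for multiples of both. A minimal sketch in Python:)

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        # Build the word from the divisibility rules; fall back to the number.
        s = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        out.append(s or str(i))
    return out
```

Anyone who can produce something like this in a few minutes clears the bar
this filter is meant to set.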

Listening to coders complain, you’d think that nobody else in the world hired
anyone, ever. Let go of your need to make hiring a failure-proof system, and
you’ll discover a bigger world.

~~~
hvidgaard
The majority of my successful senior programmers do not have any private code
to show off. Reasoning and the ability to think abstractly are important and
impossible to judge from code they have already written. I need programmers
who can do those things in a reasonable time frame and in a group.

Most junior programmers have code to show, and most of the time I don't use
it for anything. I'd rather do a FizzBuzz-level whiteboard exercise that we
spend some time discussing and extending. It gives a far better picture of
their future as a developer. I know this excludes some, but that's the
trade-off I'm making.

I call references every time, but calling around willy-nilly is not a good
idea. Apart from the fact that it's not legal, most of the time I get "Yes,
I've worked with X as a developer for Y years" and nothing more.

~~~
irq11
_”The majority of my successful senior programmers do not have any private
code to show off.”_

You’re exaggerating or you are an extreme exception.

Given the number of qualifiers (“successful senior programmers”, “private
code”) I strongly suspect you’re just trying to find a reason to justify your
opinion.

What I see in your comment is someone who has decided the answer, and refuses
to consider alternatives.

~~~
hvidgaard
More than half have no code "they own" or OSS contributions they would show
off in an interview (the last part is important). That's neither an
exaggeration nor an attempt to find a justification.

They are fathers, amateur musicians, or like to restore old cars. Just to
mention a few things. Could they spend 5-10 hours weekly writing code and
hacking things together? Yes, but that would be a significant chunk of their
spare time, so understandably they do not.

------
pjc50
> How could the candidate have started coding in one language, then decided
> actually they don’t know that language and should start over from scratch in
> a different one? How could someone with 10 years experience on their CV be
> unable to start a new project in their own editor? How could the candidate
> spend thirty minutes trying to generate a random number and still fail? This
> is crazy!

They panicked.

I would like to see a large company trying a replication study on their
interview practice with their current employees. You'd need to make sure that
the test was significantly different from the one used to hire them, and that
they were put under sufficient economic and social psychological pressure -
perhaps tell them that if they fail they will be fired and immediately
escorted from the building (+), then make them do it with the CEO sitting
behind them. I would expect over a 10% failure rate, including some people who
are normally regarded as business-critical important team players.

You then have to hope that people learn the lesson "interviews are a poor
predictor of performance" and not "some of our people are really impostors".

(+) obviously don't actually do this, but like fire drills you need to
convince people that it might be a real fire sometimes. I am also aware that
there are ethics problems with performing stress experiments on employees
without their consent, although nobody seems to care about this for
interviewees

~~~
abtinf
I would expect a far higher failure rate, maybe closer to 80-90%, especially
at a successful company with happy employees.

The engineers will be focused on building actual systems and be totally
unprepared for technical interviewing.

Engineers great in one domain will be interviewed by those in different areas
with different views on what constitutes “basic” knowledge.

None of their work history within the company can be used as part of the
evaluation—you must simulate the problem where nearly all your work is locked
up behind an NDA.

------
momokoko
Technical interviews are disguised IQ tests. IQ tests are actually not allowed
in the US based on a Supreme Court decision[1] that said they were
discriminatory.

We know that cognitive ability has one of the strongest correlations with job
performance.

So "technical interviews" that are heavy on algorithm puzzles are just a
loophole to get around the IQ test issue. They aren't actually testing domain
knowledge.

[1]
[https://en.m.wikipedia.org/wiki/Griggs_v._Duke_Power_Co](https://en.m.wikipedia.org/wiki/Griggs_v._Duke_Power_Co).

~~~
e10a
> We know that cognitive ability has one of the strongest correlations with
> job performance

There is a good argument that the technical interview success is much more
strongly correlated to the number of hours spent practicing Leetcode-type
questions than to actual cognitive ability.

~~~
typon
I've noticed exactly that. IQ presumably is something inherent, but personal
experience shows that technical interviews have nothing to do with IQ. They
can be solved by building a dynamic-programming problem pattern-matcher in
your brain, which is what I did after doing 300 Leetcode problems. I did not
become smarter in general or at coding at all; I just got good at mapping
problems to binary trees, dynamic programming, or string search.

~~~
riffraff
Aren't IQ tests the same? You just learn how to solve certain kinds of
problems (series of images, series of numbers, and so on). This obviously
does not make you "smarter", it just makes you good at solving certain kinds
of puzzles. But "smart" is ill-defined and vague anyway.

~~~
sgift
That's the reason all real IQ tests (not the ones you find on the internet,
the real ones) are secret/closely guarded: knowing the tests beforehand skews
the results. So yes, it is the same in that way, and this has been theorized
as one of the reasons for the Flynn effect. AFAIK there's also ongoing
research into new types of IQ tests to mitigate this, but a good IQ test has
higher requirements than the typical interviewing test, so it takes longer.

------
choppaface
I’ve interviewed many developers who pass a code test, get hired, and then
can’t design a basic interface for an object. Or take forever getting through
a large feature. Or can’t communicate clearly. Or go down the rabbit hole and
implement something really complex and shiny but irrelevant. (Irrelevant
isn’t always bad, but it does often expose a weakness in effectively
interpreting user needs.)

Maybe 1 in 10 engineers who pass a code test end up being actually good
software engineers. The yield definitely varies by the role.

Wait, if code tests aren’t perfectly predictive, then how do interviewers get
feedback about failures (or even successes) of the screening process? They
don’t, because recruiters and hiring managers tend to keep that to themselves.

The story is way way more complicated than this post alludes.

~~~
ALittleLight
Are people who failed code tests better?

~~~
choppaface
I know of at least a couple instances where the person failed the code test
and then (according to LinkedIn) got a solid job somewhere else.

Triplebyte actually has some hard data on this trend. Many candidates
definitely bomb a few initial screens, do well on later screens, and then burn
out and even withdraw from later on-sites. I imagine there’s high variance.
Any experienced recruiter I’m sure has observed the same pattern... bombing
one code test is typically not definitive.

~~~
ronilan
The last line made me curious...

> _bombing one code test is typically not definitive_

How many times would you say someone has to bomb Triplebyte (and/or similar)
to make it “definitive”?

And after said number of failures what is it exactly that is “definitive”?
What is the conclusion you would make? What is the conclusion that the
candidate should make?

~~~
choppaface
When I work with students, they're able to get _somewhere_ after several
tries, even if they're tuned out for the first 5-7 times or so.

Definitive is a relative term. It's the point at which you give up. If you
give up early, that might be a good thing or a bad thing for you. It's a
choice you have to make yourself.

There's also the result that at Google they found hires who had one bad score
typically outperformed others; the hypothesis was that there was somebody at
Google who was willing to fight for them. Having a supportive manager / peer
is IMO a much greater predictor of success than passing code tests.

------
BigJono
Technical interviews have been quite useful to me in the past.

Once I applied for a job, and received an online technical test where every
single question used the wrong terminology for a bunch of stuff and half the
"answers" in the multiple choice section were wrong, while some of the "wrong
answers" happened to be right.

That technical interview served its purpose perfectly, assuming its purpose
was to save me from accidentally taking a job that would have been a fucking
train wreck.

~~~
james_s_tayler
Amazing.

------
hyperpape
> To see this more clearly, consider interviewing a pilot. After establishing
> basic bona fides, it would be reasonable to ask the candidate about what to
> do in various emergency situations. Emergency situations aren’t
> representative of the daily work of flying, but safety is important so
> nobody would accuse such an interviewer of asking irrelevant questions.

Minor nit about what is really a pretty good post. This paragraph sounds quite
reasonable to this non-pilot, but do you notice what it doesn't say? It
doesn't tell you whether pilots are _actually_ interviewed this way.

Here's an actual example I know of, though it's not making the point the
author wants to make. At an old job, every so often, a guy used to wander over
from the other side of the yard and ask us to cut a few random pieces of pipe.
We then carried them across the yard for a candidate pipe welder to weld them.
If the welds looked good, they'd probably hire the candidate. It was pretty
close to actual work you'd do, and the results seemed ok.

I'm sure there are examples that would suit his point and be real, not just
speculation.

~~~
jillesvangurp
Oral exams are part of certification for pilots. Additionally, pilots have to
keep log books, so you can simply examine their history and progress.
Repeating that during a job interview is therefore kind of redundant: either
they are certified and current, or not. For software engineers,
certifications are kind of meaningless. I actually treat them as a red flag,
because in my experience it's the not-so-great consultants who tend to
overemphasize certifications in their CVs, to cover up the fact that they
haven't done a lot of great projects. Usually, this is obvious from their
CVs.

I've been on both sides of the table for technical interviews. If we're
talking, that means CVs and linkedin & github profiles were scrutinized, etc.
and we're now moving to the phase where we are going to mutually find out
whether there's a basis for working together. Part of that is quickly
verifying those things were accurate; but most of it is simply eliminating any
red flags: any obvious reasons you should not hire someone.

I usually focus on just getting people to talk about what they've recently
been doing and getting them to talk about things that they are interested in
technically.

When I'm interviewed myself, I mentally reverse the roles. I already know I'm
good enough; the interview is about determining whether the company is good
enough for me. I've declined offers because of how the interview went or
because I realized the interviewer would not have gotten past my own red
flags: I'd never hire them.

This attitude works well on both sides of the table. Would I want to work for
me based on how I'm behaving? Is my behavior making them more or less
enthusiastic about working for me? Part of an interview is that it's a sales
job. Imagine you get an excellent candidate and they have multiple job
offers: how do you make them pick you?

------
gnusty_gnurc
> Despite all that I must finish by observing that regardless of our best
> efforts in the industry, hiring is still largely random. Well designed
> interview processes make it slightly better than random, which is why we do
> them.

This article is a defense of an asymmetric process that stresses out software
developers and forces them to jump through hoops. Slightly better than random
selection is a garbage pay-off for the nonsense I've experienced in technical
interviews.

~~~
esoterica
You’re misattributing the cause of the stress. Any competitive job opening is
intrinsically going to be stressful because there are X people interviewing
for 1 position and you have to outperform the other (X-1) candidates. The
interview format is irrelevant. The only “non-stressful” way to do an
interview is to hand out $500k/year job offers to every warm body that hands
you a resume, no questions asked. Sorry, but that’s not going to happen.

~~~
Apocryphon
I think you're getting at the core of the situation. Ultimately the interview
is going to be stressful because it can only end in two ways. But I think
there are ways to mitigate it, as suggested by some of the proposals in other
comments.

Another cause for stress is simply that the whole process is so damned
opaque. There's not just a power asymmetry between hirer and applicant,
there's also an information asymmetry. At the end of the day, you don't know
what criteria were actually used to judge you. You don't know if you hit the
mark and were just passed over in favor of a better candidate, or if you
bombed without realizing it, or if one of the interviewers simply didn't like
you. And at that point it's not simply losing out on the offer, it's losing
the time spent to get a rejection that blandly says you weren't the right
fit. And sometimes they don't even remember to reject you.

Like the intrinsic stress of being at the mercy of an employer, it's part of
the process that one just has to get used to. But one wonders if there's some
better way. Perhaps pre-interview rubrics and post-interview feedback? Make
applicants sign a waiver so they can't use it in a lawsuit. At least it'd be
more transparent than the status quo.

~~~
gnusty_gnurc
Very much agreed. Opacity is the larger issue for me, especially considering
the disproportionate investment I have to make as an interviewee. And people
readily admit the process yields marginally better-than-random results, so
it's pretty puzzling to see them defend it like there's not much to complain
about.

------
pcunite
I failed a C++ test that was a requirement to get a job (30 minutes timed,
multiple choice). I had written a C++ application that I had sold to thousands
of customers (Fortune 100s in there too). I had mistakenly thought I could be
a programmer at a real job.

Later, I sold $20,000 worth of source code that took me 60 hours to produce.
The client thought it was easy to read.

I still don't think I'm a _real_ programmer. The test sure seemed to prove
that. But, people still buy my software today (I have other apps).

From my perspective, it seems current technical interviews are looking for
something in particular that you don't actually do later in the real job.
Kinda like seeing if you can survive a skydive, when you'll actually only be
packing chutes after hire.

I wish there was a way to say, up front, that there are different types of
programmers, vs the catch all "programmer" job title. Because it sure seems
like I'm typing out code. The computer has always accepted it.

~~~
jussij
> The test sure seemed to prove that.

I would not get too hung up on those multiple choice tests.

I've been programming professionally for 25+ years and ran into my first
multiple-choice test about 4 years ago.

In a very short space of time I managed to fail three (two C# and one C++) in
quick succession.

Nowadays, if I come across a role that requires such a test, I just politely
refuse and move on.

FWIW, the reason I think these tests are a waste of time, and the reason I
did so badly, is that they don't actually test day-to-day programming skills.

The three tests I did were all very similar, with a lot of _gotcha-type_
questions.

They asked things like: spot the obscure error in a very poorly written piece
of spaghetti code, or say exactly what exception this code will throw, etc.

As a programmer you never have to deal with these types of issues directly,
because your compiler finds the errors and the documentation gives you the
exception details.

And in a way the tests knew this as well, because all three warned that you
could not switch out of the test window while doing the test.

In other words, the test _would have been a trivial exercise_ if you were
actually allowed to use your day-to-day work tools.

------
lanrh1836
The fact that interviews are so standardized in this industry is a rarity,
and the pros outweigh the cons for me as someone interviewing. It means you
basically know exactly what to expect, and you have every resource, most of
them free, to study for them. I find that a much more meritocratic process
than the way hiring is done in many other industries/professions.

The alternative is widely varying interviews from one company to the next
with no idea what to expect, which leads to a mismatch in expectations and
mostly wasted time. Take-home coding assignments are one alternative I see as
well, but as someone interviewing that is a much bigger time commitment and
something I would not want to do, especially if I'm interviewing at a few
places.

~~~
bydl0coder
The problem with take-home assignments is that while investing the time, the
candidate doesn't know how the work will be judged. Sometimes people want
just a working prototype, but sometimes they essentially want
production-quality, high-performance code.

~~~
scapegoat444
I failed a take home b/c the company didn't host the HTML project on a server,
instead they opened it via the file explorer...

~~~
eq_sd_
I failed one because I used callbacks instead of promises...

------
MarkMc
I agree that the candidate should be able to demonstrate ability to write
code, but I'm surprised that the author is unable to find interesting, short,
diagnostically useful problems that crop up in the course of his day-to-day
programming.

Often when writing code I think, "hmm, this problem would make a good
interview question" so I keep a note of it. Here's a recent example from an
accounting program where I was reporting debit and credit transactions: Given
a list of integers, remove all pairs where the magnitude is the same but
opposite sign. Eg. given [-1, 4, 6, 1, -2, -4, 4], the result should be [6,
-2, 4].
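For what it's worth, here is one way a candidate might attack that question
(a sketch in Python; not necessarily the solution the parent had in mind).
Count the occurrences of each value, work out how many copies of each value
get cancelled by an opposite-signed partner, then make a second pass that
skips exactly that many occurrences:

```python
from collections import Counter

def remove_opposite_pairs(nums):
    """Remove each matching (v, -v) pair; unpaired values survive in order."""
    counts = Counter(nums)
    # How many copies of each value are cancelled by an opposite-sign partner.
    cancelled = {}
    for v in counts:
        if v > 0 and -v in counts:
            k = min(counts[v], counts[-v])
            cancelled[v] = k
            cancelled[-v] = k
    result = []
    for v in nums:
        if cancelled.get(v, 0) > 0:
            cancelled[v] -= 1  # this occurrence is part of a pair; drop it
        else:
            result.append(v)
    return result
```

Two passes over the input plus a counting dict, so O(n) time; the sorting-free
approach is part of what makes it a reasonable interview-sized problem.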

For more in-depth assessment, the author dismisses the idea of asking the
candidate to do an all-day task or homework because it isn't respectful of the
candidate's time. So why not just pay the candidate for that time?

~~~
edanm
The example question you give is exactly the type of "algorithmic" question
people complain about, and that the article says is the only kind of question
you can realistically ask. And for the same reasons.

> For more in-depth assessment, the author dismisses the idea of asking the
> candidate to do an all-day task or homework because it isn't respectful of
> the candidate's time. So why not just pay the candidate for that time?

First of all, it costs money. Significant money even, if you tend to interview
a lot of people.

Secondly, it's often an accounting hassle to pay money - you can't just hand
someone cash, it needs to be in the form of a salary, and is not always so
easy to do (and some candidates would probably be put off by it, especially if
they're already working, since having a second income can get complicated).

------
joezydeco
_" Everyone who builds a team of developers, and I do mean everyone, rapidly
gets used to people turning up to interview who cannot actually program
computers, even under the most generous definitions of the term."_

I've started doing a small whiteboard coding exercise during interviews, and
over the past couple of iterations it's devolved into something simple...and
I mean _really_ simple. And I'm watching senior developers fail it.

For example: I'm trying to fill a bootloader developer position so I get a
couple of senior embedded developers to the table. Passed the screen, HR likes
them, yadda yadda. Some even have U-boot experience. Should be easy, I think.

So I'll ask them to perform a simple bit-flipping exercise on the whiteboard.
Nothing insanely tricky and it's usually something that any register-level
developer should be able to blurt out without thinking twice about it. And
_they are choking on the question_.
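For context, by "bit-flipping exercise" I mean mask-and-shift one-liners of
roughly this shape (sketched in Python for brevity; on the whiteboard it
would of course be C, operating on a register value):

```python
def set_bit(word, n):
    """Set bit n of word to 1."""
    return word | (1 << n)

def clear_bit(word, n):
    """Clear bit n of word to 0."""
    return word & ~(1 << n)

def toggle_bit(word, n):
    """Flip bit n of word."""
    return word ^ (1 << n)

def reverse_bits(word, width=8):
    """Reverse the order of the low `width` bits of word."""
    out = 0
    for _ in range(width):
        out = (out << 1) | (word & 1)  # shift the lowest bit into the result
        word >>= 1
    return out
```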

It's kind of freaking me out. Is the test that hard? Are the developers that
inexperienced, or that distant from this kind of work even though they're
applying for it?

What am I missing here?

~~~
Quiark
The high-stress environment of job interviews blocks thinking.

~~~
AnimalMuppet
Perhaps, some of the time. I doubt it's the whole answer, though. I suspect
that some of the answer is stress, and some is fakers, and some is people who
have been in embedded and never had to do that. (I haven't for 10 years... but
I'm sure I can still do it.)

------
neilmock
It's the same power dynamic that has always been in place: incumbents unable
to recognize talent outside their narrow perspective. The penalty is paid by
the candidate, and the company will usually do just fine, but it is a sad,
broken system for a lot of qualified people.

~~~
mlevental
> The penalty is paid by the candidate, and the company will usually go along
> just fine, but it is a sad broken system for a lot of qualified people.

i don't understand how people don't get it: if this were a malfunctioning
system, companies' margins/bottom lines would be affected and they would
correct course. the fact that this trend not only persists but grows signals
that it's actually effective.

~~~
reallydontask
This is pretty naive.

A company I used to work at made ~$10Bn in profit per year and employed 300k
people. This does not mean that all its practices were profit-making; some
were demonstrably not, as even SVPs admitted. I don't mean some investment
that would pay off in the future, I mean cost-cutting exercises that went so
far as to go past the fat and into the bone (an SVP's words when we got a new
CEO).

Another company I used to work at, this one employing 20 people, had a
shockingly bad product. It looked dated, it was slow, and it had all manner
of process issues; in a word: dysfunctional. Yet the company was pretty
profitable, as it had excellent salespeople. Working there really made me
question a lot of things. How could a company with such a shitty, outdated,
slow and unmaintainable product be so successful (for its size)?

The point is that the relationship between qualified people and financial
success is far from linear.

------
pytester
>Fix a real bug or implement a real feature. Beyond the obvious copyright
issues if you don’t get an assignment from the candidate,

Irrelevant if you've already fixed or implemented it and do not intend to use
the candidate's code.

>there’s no good way to make this process repeatable,

Select fixed bug/feature, isolate code, create question, use in interview,
iterate on task.

>no way to ensure the question always fits the available time,

Other than selecting the question such that it does and/or tweaking it such
that it takes more/less time.

>and no way to ensure you see a good range of programming skills (e.g. the
feature may simply involve copy/pasting some existing code with minor tweaks).

Other than selecting the question such that it isn't that.

>It also assumes you’re hiring Java developers for a Java codebase or Ruby
developers for a Ruby codebase

I'm certainly not going to hire ruby developers for a java code base or java
developers for a ruby code base, and I find it weird to assume that I would.

~~~
ajmurmann
> I'm certainly not going to hire ruby developers for a java code base or java
> developers for a ruby code base, and I find it weird to assume that I would.

Learning a programming language is usually not the hard part, unless it
follows a completely new paradigm that you are entirely unfamiliar with,
e.g. programming in an object-oriented language if you have only ever done
imperative programming. I've interviewed people in languages they didn't know
with great success; you just gotta ask the right questions, and sometimes
their questions (or lack thereof) are more important than their answers.

~~~
pytester
>Learning a programming language is usually not the hard part

The syntax and semantics can indeed be picked up quickly. What requires years
of experience is working within the ecosystem: circumventing all of the
non-obvious pitfalls, knowing the shortcuts (and knowing what can't be
shortcut), and knowing where all the important information is located.

I've worked with developers who have switched languages and they often port
across bad habits and quirks and try to make the new language more like their
"home" language. Sometimes years later they're still trying to do things in a
non-optimal way.

I also think that most people are more productive in their favored language
(or languages) and you're unlikely to get as good quality work if you pull
them away from it.

There's also a cultural element to programming languages that I think is
underrated. I don't expect Java programmers to have the same kind of outlook
or approach as Python programmers.

------
willtim
I disagree with the author's explanation as to why all questions are
algorithmic. I suspect the real reason is due to the US dominance in tech and
the popularity of algorithmics in US academia. Indeed the parts of computer
science to do with logic, formal methods, semantics, mathematics of program
construction etc are, I am told, often described in the US as "Eurotheory".
This is a great shame. A new graduate arrives at Google fully trained in
algorithms as mandated, but is unlikely to need to implement any of them
during their career. It is far more likely that they will need to glue
existing systems together, consume APIs, author APIs etc. "Eurotheory" would
much better prepare them for this.

As an example, consider that we could be asking questions on API design: get
candidates to work through their semantics/axioms (introductory forms to
create objects, methods to transform and eliminate them). Get them to write
out high-level testable properties such as the way objects compose, associate,
commute etc. Using types to enforce invariants etc
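
As one hedged sketch of what such a question could look like (the `Interval` type and its laws below are my own invention for illustration, not something from the comment): a candidate could be asked to state the algebraic properties of a tiny API and write them out as executable checks.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    """A closed interval [lo, hi]; the introductory form that creates objects."""
    lo: float
    hi: float

    def hull(self, other: "Interval") -> "Interval":
        """Transform: the smallest interval covering both operands."""
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def width(self) -> float:
        """Eliminate: collapse the interval to a number."""
        return self.hi - self.lo

# High-level testable properties of the kind a candidate could write out:
a, b, c = Interval(0, 1), Interval(2, 5), Interval(-3, 0)
assert a.hull(b) == b.hull(a)                    # hull commutes
assert a.hull(b).hull(c) == a.hull(b.hull(c))    # hull associates
assert a.hull(a) == a                            # hull is idempotent
```

Discussing which of these laws should hold, and how the types could be tightened to enforce invariants (e.g. rejecting `lo > hi` at construction), exercises exactly the semantics-first thinking described above.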

~~~
cjblomqvist
Yes, and the algorithmic tests are highly biased towards those who've focused
on that (either through academics or through self-training afterwards) - and a
miss on these questions still doesn't tell you which story is the correct one.
Then again, this might not matter. Firstly, everybody knows these types of
questions are to be expected, so everybody can prepare. Secondly, the problem
(from an employer's perspective) might not be the false negatives, but the
false positives.

This in turn suggests that we may be generalizing a little too much by assuming
there are universally applicable hiring methods for software developers, when
we should instead adjust the process to the software development context. If
you're developing a web service of minor scale, then algorithmic questions
might be stupid (because the knowledge needed is one of networking, async, the
web and structure, rather than solving math problems), but they might be
relevant for Google (where they may be searching for developers with that
mindset, who can use it to come up with and solve big problems in a certain
way - I don't know, because I lack insight into the developer strategy at
Google).

As engineers and developers, in my experience we tend to love to generalize,
even when it's not wise to do so.

------
telotortium
Also describes the interview process this company uses:

> Firstly, how do we hire at R3? We’re recruiting for technical roles around
> the world as part of building an open source Bitcoin-inspired decentralised
> database. The interview process for developers consists of, firstly, a piece
> of code that you’re asked to send back a quick code review of (this is meant
> to take about 5 minutes), then a 30–60 minute long video chat+screenshare
> in which you may join from home and some coding is done in an editor and
> language of your choice*, then finally an invitation on site to meet the
> team and talk to senior management. The code test sometimes also includes
> design and ‘talky’ questions, depending on the nature of the candidate and
> precise job role: it’s not just coding. We’ve found this process to be
> pretty accurate whilst still being quite lightweight (compared to some
> hiring processes at least!).

~~~
Waterluvian
All I would need added to that style of interview is something along the lines
of:

"And if you have to Google things during the video call, go right ahead. But
try to talk through your thought process."

So that I don't go into the interview feeling super nervous about the moment
where I get to say, "I'm googling the bidict library because I forget the API.
But I want to use a bidict because..."

~~~
mike_hearn
(I wrote the article)

We allow candidates to use Google if they need to during the interview.
However, the interviewer is free to draw conclusions from that. If a candidate
is routinely searching for things that someone using the language would use
every day, that's going to count against them. If they need to look up
something that is only rarely used, nobody will care.

There can be occasional disagreement about what's reasonable to look up. A
surprising number of developers don't know how to read from files, and I've
encountered a few that didn't know files are random access. Some people think
that's a problem, others think it's expected. I tend to see ability to use
files as a proxy for general experience, which for R3's projects matters a lot
(we aren't writing generic database-backed web apps, there's a significant R&D
component to it).

------
tptacek
This post describes an unusually good technical interview program, in which
most of the tech-out is apparently done at home, the interview is prefaced
with a short but useful work-sample code review test, standardized questions
are used, and candidates are allowed to use their preferred tools and
languages.

That's commendable, but I think the argument is still faulty.

As I understand it, the premise of the argument for interviews and against
work-sample testing is that candidates won't perform work-sample challenges.
I've interviewed almost as many candidates as this person has (and screened
out far more based on challenges) and that simply hasn't been my experience.
More importantly, the argument is incomplete. Stipulate for a moment that the
"best" developers won't take work-sample tests (I'll come back to that)...
and?

The point of work-sample tests is twofold: to stop being beholden to
subjective impressions of how well candidates can perform the work you need
them to do, and to provide a framework for tech qualification that's
straightforward to iterate on. If your work-sample tests are sound and you're
getting enough candidates to saturate your headcount requirements after
setting a bar you're comfortable with, why obsess over who exactly you're
hiring? The industry is full of developers with glittering resumes and lists
of past achievements who are dead weight on teams†. Why assume the developers
with the best marketing are the ones who will perform best on your team? Why
not instead just spend the time to figure out what exactly a good performer
is, and then directly test (and iterate on testing) for that?

Regardless, the belief that strong candidates won't do work-sample tests is
pervasive. And for good reason: work-sample testing at most companies seems
cargo-culted: "take-home tests" (that often _follow_ tech-out interviews) that
can only rule candidates out and never rule them in. I agree: that's bad. But
the solution is straightforward:

* (1) Allot a budget to the entire technical qualification process, including all interviews and challenges, (2) try to make as much of the process objectively scored based on a rubric your team has committed to writing, and (3) ideally move as much of the process "offline" (so the candidate can perform it at home and without you watching) as possible.

* Make the work-sample challenges as determinative as possible. Candidates who do well on our work-sample challenges are presumed technically qualified; 1:1 interviews that follow it are largely pro-forma. At Matasano / NCC Group (while I was there) the bar for a technical interviewer to overrule the output of the work sample challenges was quite high.

Regardless of whether you use work-sample challenges or scripted, structured
interviews, a bit of low-hanging fruit that nobody seems to pluck is:
relentlessly keep candidates informed about what to expect in your process.
Since about 2010 we've been buying books (expensive books!) for candidates and
providing study guides to go with them; we also provide fairly detailed
descriptions of what our challenges will be. We're even moving to a system now
where we offer candidates a practice version of some of our challenges (if I
could fake up the entire AWS API, I'd do practice challenges for _all_ our
challenges).

But most employers seem determined to make their tech-out process as much of a
gotcha game as possible, and tell candidates almost nothing about the actual
hard problems they'll be expected to deal with.

This post uses a framing that says whatever tech companies do to qualify
candidates must be rational; after all, if they're wrong, they're raising
their own costs. I respectfully disagree: engineering teams do not, as a
general rule, optimize their processes for efficiency or shareholder value.
Many teams literally use interviews as a hazing process (one they often exempt
"in-network" hires from). Still more delegate interviewing out to the entire
engineering team and exhibit little or no interest in establishing rigor or
rolling back bias. But engineers are as biased as the rest of humans, and most
of hiring today is not in fact rational.

Also: sorting algorithms make dumb interview questions.

† _This is in part due to a dynamic in our industry that makes it easy to
"gradually fail upwards": most companies can't fire fast enough after making a
bad hiring decision to prevent the bad hire from accruing another positive
line-item on their resume, so smart people can bounce from team to team for
years without ever making a serious contribution and end up ahead._

~~~
enraged_camel
>>As I understand it, the premise of the argument for interviews and against
work-sample testing is that candidates won't perform work-sample challenges.

My understanding, as I've heard from others, is that it's not that candidates
_won't_ perform work-sample challenges, but that they _can't_, due to various
non-work obligations that result in a lack of free time.

This leads to a situation whereby interview processes that feature work-sample
tests discriminate against people with families and kids/dependents.

~~~
pmiller2
Well, I “can” perform these so-called “work sample tests” that a lot of
companies want to give out (sometimes even before the candidate gets to speak
with anyone at the company). Here’s my issue with them:

* Either they’re strictly timed, introducing an artificial sort of pressure that’s rarely present on the job, or one ends up competing against people who spend 3 or 4 times the suggested amount of time. I don’t know about you, but I can do a way better job on most of these sorts of tasks if I spend a large multiple of the expected time on it, too.

* The tasks are often poorly specified, sometimes with the justification that it tests the ability to clarify requirements. I don’t think that argument holds water, because all “clarifying requirements” does when one doesn’t have ready access to folks who can answer those kinds of questions is introduce insane amounts of latency and frustration into the process.

* Even when the tasks are reasonably well specified, the grading rubric is often kept from the candidate. At work, if I don’t know if I or my team is building the right thing or doing it in the right way, I can go ask people (PM, tech lead, EM, another team member, etc.)

* Finally, most of these “take home assignments” I’ve seen that are supposed to be in the 2-4 hour range to finish don’t get me as a candidate out of 2-4 hours of giving impromptu algorithms lectures at the whiteboard. If the prize for putting in 2-6x as much time as I would to get to an onsite interview with another company is just to come onsite with your company and spend all day doing the exact same thing, then, no thank you.

I realize that few or none of these things apply to the process at Matasano,
but there is another one that does: I can already get a job that pays
reasonably well, doing things I already mostly know how to do, without
spending hours studying a field I know little about.

That is arguably a feature and not a bug, so, I don’t mention it as a
criticism, but rather as a reason why I probably wouldn’t do Matasano’s work
sample test. For the type of people they’re looking for, that’s probably fine.
It’s just that I’m not one of those people.

~~~
tptacek
This is all well said. I wonder if you have thoughts on what employers might
do about the "competition with completists taking 2 days on a 2 hour
challenge" versus "pressure from timing people". We've taken the tack of
telling people what our timing expectations are, but also not timing or
tracking time (you could ostensibly have picked up the problem on a Monday,
noodled for 15 minutes, decided you'd do a better job on Wednesday, picked it
back up for 45 minutes, had a sudden business call, and then had to wait until
Saturday to finish it; right now, we'd like that to just seamlessly work, but
it obviously creates the effect you're talking about).

~~~
pmiller2
If you’re not going to track time in any way, then there’s no way to prevent
anyone from spending as much time on the problem as they have available. My best
suggestion is to just ask people how much time they spent on it. The biggest
problem with that is that by telling people the time expectations, you’ll
probably get answers within or near that range.

I had one of these tests say they wanted the project delivered as a git repo
and would use the timestamps on the commit history to figure out how much time
was spent. I just laughed at that and figured the people who knew how to forge
the timestamps would do that to make themselves look good.

The only alternative I can think of to explicitly tracking time is to just not
give a time expectation _and_ ask how much time they spent. That way,
answers aren’t biased by you anchoring a range in their mind. This has other
obvious disadvantages, but it would take care of the “candidate spent 2 days
on a 2 hour project” problem.

------
MarkMc
Candidates don't often appreciate the unequal disincentives for the
interviewer in making the wrong call. For the company, hiring one poor
programmer is usually worse than rejecting 10 good programmers. So a coding
test which reduces the rate of false positives is useful, even if it increases
the rate of false negatives.

As an aside, this lopsided penalty reminds me of how Typhoon forecasters in
different cities systematically bias the predicted stormtrack toward their own
city:
[https://twitter.com/m_clem/status/1081953447358984192?s=19](https://twitter.com/m_clem/status/1081953447358984192?s=19)

~~~
Apocryphon
That's certainly a valid stat that I've heard before, but one wonders whether
the solution shouldn't also include improving the mechanisms for managing
existing employees. California, where much of the tech industry is located, is
an at-will state. If a bad programmer is really so bad, then shouldn't it be
easy to terminate such an individual? Because the alternative, rejecting 10
good programmers to get one good hire, seems just as much a waste of programmer
time and resources as hiring a bad one. Perhaps there should be a probationary
period with teeth: programmers could be hired as contractors for the first
month.

------
EnderMB
The one point where I agree is, in my view, the sole reason why big companies
use algorithm-style interviews.

FAANG-level companies can put out a job role and have hundreds, if not
thousands, of people apply. Out of all of these people, there might be dozens
with phenomenal CVs - relevant experience for the role, mastery of a
programming language or two, published work, and provable experience through
living projects.

You reach a point where it's not practical to compare these people through
programming ability alone, and that's where the algorithm interview comes in.
It allows companies to choose a suitable person even while alienating good
developers, because the sheer volume of interest means they can afford to do so.

I just wish that more companies were honest about why this approach is taken,
because it would stop startups asking you to find the intersection of two
linked lists when the role itself is to maintain a shitty CRUD app. These
companies cannot afford to alienate good developers, because they're not
getting the sheer amount of interest a Google or Facebook will get.

Again, just my opinion, but people need to embrace the fact that interviewing
isn't perfect. A code test should purely exist to ensure that you can weed out
people that cannot do ANY programming. Outside of that, use the interview
itself to determine their fit for the role.

------
notyourday
I think the vast majority of the issues come from the fundamental disconnect
between what people want to believe they are hiring for and what they are
actually hiring for.

I have been in this industry for over 25 years, from being a grunt all the way
to three-letter jobs. The most uncomfortable question in the decision to open a
job is "What are we trying to accomplish by opening this job?"

In my experience, if one cuts through the posturing, the answer is "We need
two developers with the following credentials because we believe that by
having those developers we can increase the output -- more lines of code that
we need to generate to ship the feature we promised on a roadmap. They need
to know Angular/Javascript/Css/Node/Go/React/Kafka/K8s/RMQ and be ninjas."
Should one drill down into what that feature is, one would find "It is a new
dialog box, and if a user clicks this button some things will happen and X
email will be sent. Our users want this. We currently use Backbone but we will
be using Angular in the future. The API is written in Node, but we have a few
projects to explore Go, and we are using RMQ for queues, though there's a team
that is tinkering with Kafka." That's what the manager is _actually hiring for_.

So the real job requirement is:

"Need to get two butts in seats to build a dialog box to trigger an API action
that would publish a message to a workflow queue. Must know a bit about
backbone/css/javascript, nodejs and rabbitmq" [0]

Recognizing that one wants to put butts in seats, rather than someone who can
spearhead the API move from Node to Go, would change the entire dynamic.
Hardcore interviews for the former make no sense -- if a person can think, you
can teach them how to write some Javascript, mangle some Backbone code, create
a new route in an existing Express application, and publish a message to a
queue.

[0] Look at the hiring ads here. It looks like everyone is hiring Chief
Architects that code.

------
boblebricoleur
> Companies do code testing because they have encountered so many candidates
> who look good on paper, and may even be able to talk about computers
> convincingly, but can’t actually write a program when asked. Any program. At
> all.

Maybe I'm reading too much into this and it's just quibbling over semantics,
but maybe talking about _computers_ is not talking about coding? I've never met
someone able to pass for a programmer when they are not. I've always felt that
just asking candidates to talk about their last or favorite project helps catch
a glimpse of how they think about software problems.

Little anecdote: my brother is the first software engineer of the family.

Father : "Son, what kind of computer should I get ?"

Son : "I don't know much about choosing computers"

Father : "Don't you write software for a living ?"

Son : "Yes, exactly"

~~~
mike_hearn
Sorry, that was just a turn of phrase. For computers you can read "programming
computers".

------
ajmurmann
To me the biggest problem with algorithmic interviews is when the candidates
are truly asked to reproduce a well-established algorithm. Frequently at this
point candidates will just have memorized the answer, but aren't actually good
at algorithmic thinking. I've interviewed many candidates who thought I wanted
them to reproduce how a certain data structure worked, although I did not. As
soon as we'd deviate from the well-trodden path they'd be completely lost. I
think puzzle-style programming interviews that require algorithmic thinking
and good problem-solving skills can be valuable. Especially talking through
possible solutions and their trade-offs can be great. But only if we aren't
testing whether the candidate is good at memorization and studied a lot.

~~~
PopeDotNinja
> I think puzzle-style programming interviews that require algorithmic
> thinking and good problem-solving skills can be valuable

Think of it this way. Do the majority of applicants you have the luxury of
interviewing think those questions are valuable? I'm inclined to think they
aren't, so giving me one makes me less interested in your company. I might
still do it, but it's a bit of a turn-off, like showing up to an interview 5
minutes late.

------
ulucs
Honestly, talk is cheap. If they hire people with programming tests and keep
performance metrics on the people they hire (as most companies do), they could
have just published how test performance predicts actual job skills. Until
then, the null hypothesis reigns supreme.

------
throwaway033057
Just another ad for a company disguised as a blog post.

~~~
dang
I'm not sure why you say that here, but it's actually not a problem on HN as
long as the article is interesting. I wrote about this recently:
[https://news.ycombinator.com/item?id=20186280](https://news.ycombinator.com/item?id=20186280).

------
croh
When I conduct an interview, I ask for very simple programs like FizzBuzz to
test logic. If the candidate succeeds, I focus on understanding whether they
are passionate about their craft: what is the hardest problem they have
encountered? What code are they proud of?
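
For reference, a conventional FizzBuzz (sketched here in Python, as one illustrative choice) is about all such a screen needs to check:

```python
def fizzbuzz(n: int) -> str:
    """Return "Fizz"/"Buzz"/"FizzBuzz" for multiples of 3/5/both, else n as a string."""
    if n % 15 == 0:  # divisible by both 3 and 5
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# The classic 1..15 run.
for i in range(1, 16):
    print(fizzbuzz(i))
```

The point of such a question is not cleverness; it simply confirms the candidate can translate a trivial spec into working code at all.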

------
ChrisMarshallNY
Well, I have tens of thousands of lines of super-high-quality code, for
multiple shipping apps and entire systems, used by thousands of people daily,
out in the open-source domain, in dozens of repos. These have over a decade of
detailed checkin commentary, with Doxygen and Jazzy docsets. I have hundreds
of pages of documentation and blog posts, often intricately explaining my
design process. I am constantly trying to learn the latest tech for my
specialty. But I spend exactly zero time on LeetCode or HackerRank, so it's
likely I'll fail Secret Santa. I doubt that I'm what you're looking for.

~~~
mike_hearn
To be clear: if you fail a coding interview question and your self-judgement
is accurate then either the interview question is a bad question, or the
interviewer is a bad interviewer, or the process has a problem.

I don't think many (any?) of the people I've hired this way have spent time
practicing coding interviews, let alone on HackerRank. The questions asked are
simply not challenging enough to require drill. If you can code competently in
your daily job you can pass them.

~~~
ChrisMarshallNY
Also, I know that no one will ever read this, but I figgered I'd put it there
for posterity.

It's amusing, when I say "I have a published open-source portfolio that
consists of XXX LoC, ten years' worth of checkin history, in XX numbers of
repos.", the response is "IF what you say is true..."

Look for yourself. I can back up every single claim I make.

I don't pretend that all my code is screamingly efficient or the best-written
code on Earth, but what I do have is tens of thousands of lines of super-
high-quality, well-documented code, graphical assets (including the original
vector masters), clever designs, design documents (including things like
OmniGraffle originals), Medium and blog postings (often going into very deep
detail about my designs, architectures, and the decisions made while developing
them), GitHub issues and responses, Apple RADAR reports, and multiple
published, high-quality, well-branded apps on the App Store.

Sheesh. Don't take my word for it. See for yourself.

Now, THAT being said, why doesn't everyone have a portfolio?

You ever see a designer or graphic artist going to an interview? They bring
along these big black cases (although nowadays, it's probably more like a
laptop case). These cases contain drawings, printouts, raw materials and
sketches, etc. They are called "portfolios." Even students, fresh out of
school, have them. It's considered a requirement.

No design firm on Earth would ever consider ignoring one of these, and
instead, pull out a matchbook with "Draw Spunky" on the cover, and insist on
that being the hiring criteria.

Which is EXACTLY what most software shops are doing, these days.

------
kstenerud
“What’s the number after F in hexadecimal?”

My first reaction to this was: "Damn... where's my ASCII table?"

Because this question can be taken two ways: "What comes after f in
hexadecimal notation?" and "What is the UTF-8 code for 'G'?"
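
For what it's worth, both readings are easy to check in code; a quick sketch in Python (chosen here just for illustration):

```python
# Reading 1: the number after F in hexadecimal notation.
after_f = 0xF + 1
print(hex(after_f))       # "10" comes after "F" in hex

# Reading 2: the character after 'F' in the ASCII/Unicode table.
next_char = chr(ord("F") + 1)
print(next_char)          # 'G', code point 0x47
```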

A very important thing to keep in mind for both sides in an interview is that
misunderstandings will happen, for the most mundane and obvious things. How
you respond to these will determine your success in interviewing. Give the
benefit of the doubt until the other party has conclusively proven that they
don't know what they're talking about.

~~~
xwolfi
It would be very weird to misunderstand this. I get that it's stressful during
an interview, but "number" and "hexadecimal", as well as "F", all point exactly
to 10, not 'G'.

It cannot be taken two ways, but of course you can slip on the meaning,
associate ideas, etc., and if you land back on your feet and joke about it
quickly, nobody will mind.

~~~
kstenerud
I've had this happen quite a bit when investigating things, where I get the
wrong idea from one step, and don't realize it until I've run with it for some
time. This is a normal occurrence and we do it all the time. In an interview
setting, with added stress of being watched and judged, the problem is
magnified.

In interviews, I'll always throw in comments to help clarify if it looks like
they're going off the rails, because letting them run on an incorrect
assumption is only going to diminish your ability to judge their abilities. We
all make mistaken assumptions all the time, about everything. The person who
made no bad assumptions in an interview just got lucky.

------
maxheadroom
I have to disagree: When you're hiring a person, you're hiring someone to join
your team. Hopefully, we can agree that the majority of people in our field
have the capacity to learn.

In fact, at many FAANGs, your first performance review year is a "freebie",
because they know that you're not going to come in already knowing the role and
the tools used internally.

So, then, what is the technical aptitude test (is it really anything other
than that?) for? Well, it could be used to measure a baseline for basic
programming skills; however, this isn't how most technical interviews are
designed.

Interviews aren't designed around the premise that knowledge exists on the
internet, or that your colleagues could often be your first line of support (or
code reviews, if the company has a good culture). Hell, there are interviews
where the candidate has to code in Notepad (or another plain text editor)
because they don't want the candidate availing of things like IntelliSense.

If we look at it from a different perspective, though, we live in a time when
we can now put our source code online for the world to see. We can publish
packages for the world to consume. We can make changes to production software
(FOSS) and documentation and that's easily traceable (if you're not
obfuscating your identity through sixty different handles).

It seems we don't consider these viable avenues for assessing a candidate's
ability before even asking them questions.

With that being said, are you really hiring the candidate because they would
be good for the team or are you just hiring the candidate who can check off
some interview-type question boxes?

I have interviewed a number of people in my day, and the candidate who most
excelled in our interviews, and whom we hired because he checked all of the
technical boxes, ended up being a raging asshole and bringing team morale down
considerably.

So, at the end of the day, you're not just hiring someone to fill a role. The
isolated single role is all but dead these days. You're hiring someone to join
a team, and the human aspect should outweigh the technical aspect.

After all, just because someone checks all of the boxes doesn't mean they
would "be a good fit".

------
_pmf_
> It’s vastly preferable to asking someone to write code with a fat marker pen
> whilst standing up in a tiny conference room.

Or, you know, just use a whiteboard in the interview for the same purpose you
use it in meetings:

\- not at all

\- for high level algorithmic descriptions, data flow or architecture
sketching

(Of course, UML being demonized is a bit of a misfortune, because now boxes
and arrows can mean anything, but hey, better than having a visual language
that was developed by a bunch of old CIS White males, according to tech
diversity Twitter.)

------
angarg12
> All day interviews. Many firms expect an interview for a developer to take
> an entire day, typically with between 5 and 8 separate interviews. This
> makes it hard for developers who already have jobs to attend.

This bothers me immensely. If you are job hunting, you want to apply to
several jobs to get a number of offers and an overall better deal. How do
people manage this when they have a full-time job? Do they literally take an
entire week of vacation to attend 5 all-day onsites?

~~~
michaelt
If you're applying for your first job, you might treat job hunting like a
fussy suitor problem and want to get five offers to get a good picture of your
market value. You get a job where you're pretty happy with the salary, the
benefits, and the work.

When you apply for subsequent jobs, you no longer need to take a large sample
to calibrate your judgement; you only need to compare the job to your current
one. And questions like salary and the nature of the work can be answered
during the phone screen.

If changing jobs takes three days of annual leave and boosts my salary by
$10,000, that seems like a pretty good trade to me.

------
harryf
Over the span of the last 10 years I've probably interviewed between 80 and
100 engineers at some stage of the process, and I even ran the whole hiring
process at my last company.

There are points in this blog post I strongly agree with, like how
disrespectful "homework" is, and the importance of letting a candidate code on
their own machine, where they have all their shortcuts etc. set up and are able
to be productive. But I've found the algorithmic coding challenge to be equal
parts harmful and useful.

The biggest problem, as I see it, is that as software engineers we are strongly
biased towards a rational understanding of the world. For example, I've seen
hiring teams argue _for_ a candidate they clearly don't like because that
person aced the coding challenge. In such situations, as hiring manager, I
then challenge the hiring team with "OK, it's Monday morning. You had a bad
weekend. And now you're in the office sitting next to the candidate. How do you
feel about that?" and usually the truth _then_ comes out and we avoid hiring
someone who would destroy the team. The point here is we have a natural bias
to see things in absolutes: zeros or ones. This _tends_ to lead to an over-
focus on the coding challenge because it gives us the illusion of a black or
white answer on whether we should hire.

When I interview people, I always ask myself "what's my gut feeling about
this person?". What I've found is that "my gut" is sometimes wrong in the
positive direction - I liked someone but they turned out to be a bad hire -
but always right in the negative direction; if I have a bad gut feeling about
someone that I can't fully explain, these days I always go with it. The
"evidence" I have is two hires I had a bad feeling about: one I later had to
fire myself, and the other stuck around for a couple of years until someone
else fired them.

Otherwise, when it comes to assessing coding ability, these days most
programmers have code on GitHub, even if it's just "hobby" code. I'd rather
review that, look at their commit messages etc., and get a feel for how they
program when relaxed and not trying to impress.

And if we must make candidates do homework, how about contributing to the
greater good and asking them to make a contribution to an open-source project,
e.g. fixing a bug in numpy?

------
musicale
Algorithm puzzles are a good test for the ability to solve algorithm puzzles
(which usually means having seen a similar puzzle before or guessing the right
trick).

Which is probably correlated with solving problems in algorithm puzzle (aka
"programming") contests.

If solving algorithm puzzles or competing in "programming" contests is your
company's core business, then it's probably a good interview test.

------
linguae
The problem I have with coding interviews isn't necessarily the algorithms-
based questions. I don't have a problem with reviewing undergraduate data
structures/algorithms as well as those related to my subfield (for example,
someone who is familiar with DBMS implementation should know the basics of a
B-tree even if knowing its exact implementation requires review).
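
For reference, the "basics" here amount to: keys kept sorted within each node,
and a lookup that either finds the key in the current node or descends into
one of n+1 children. A minimal sketch in Python (the node layout and names are
illustrative, not taken from any particular DBMS, which would store nodes as
disk pages):

```python
from bisect import bisect_left

class BTreeNode:
    """Illustrative in-memory B-tree node; a real DBMS packs this into a disk page."""
    def __init__(self, keys, children=None):
        self.keys = keys            # sorted list of keys
        self.children = children    # None for leaf nodes, else len(keys) + 1 children

def btree_search(node, key):
    """Return True if key is present in the subtree rooted at node."""
    i = bisect_left(node.keys, key)          # first position where key could sit
    if i < len(node.keys) and node.keys[i] == key:
        return True
    if node.children is None:                # leaf: nowhere left to descend
        return False
    return btree_search(node.children[i], key)

# Small two-level tree: root keys split the key space across three leaves.
root = BTreeNode([10, 20], [BTreeNode([1, 5]),
                            BTreeNode([12, 15]),
                            BTreeNode([25, 30])])
```

Being able to walk through that search, and to say why the fan-out keeps the
tree shallow, is the kind of review-level knowledge meant above.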

Here are the problems that I have with coding interviews:

1\. Often I feel that interviewers are looking for exact, optimal solutions
rather than caring about how the interviewee actually approaches problem
solving. Forget about being asked FizzBuzz-style questions or about being
asked to delete a node properly in a binary search tree; in the interviews
I've had in the past three years, I've encountered difficult Leetcode- or ACM
International Collegiate Programming Contest-style problems where it's
expected that I come up with an optimal solution within 30-45 minutes. It's
even worse with companies that give you a hard-level Leetcode-style problem
that is automatically graded instead of being examined by a human.

2\. The lack of feedback about interviews that didn't go well makes the
process difficult. At least on a standardized test you receive a score,
and at least at the end of a final exam in class, you get your final class
grade. Case in point: I've had two successful Google software engineering
internships with great reviews from my mentors, and so it's not like I'm
incapable of programming FizzBuzz or writing a for loop, but after three tries
I haven't been able to get an offer for a full-time position there, despite
making it to the on-site round each and every time. It's similar with other
companies: I'm usually able to make it past the phone screen and about 50% of
the time past the initial programming question, but I have a hard time making
it past the on-site round for software engineering positions.

3\. The sheer breadth of possible questions to be asked. Not only are
Leetcode-style puzzles "fair game," but also domain-specific questions. For
example, suppose I'm interviewing for a position where I'm working on a DBMS.
Despite having taken graduate-level courses on the implementation of databases
and distributed systems, there is still a very large number of questions that
I could be asked, including specific details of specific databases that I
might not have been exposed to.
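
For calibration, the FizzBuzz baseline mentioned in point 1 is the kind of
thing any working programmer writes in a couple of minutes; the complaint is
the gap between this and a hard Leetcode problem under a 30-45 minute clock.
A plain version in Python:

```python
def fizzbuzz(n):
    """Classic warm-up: for 1..n, multiples of 3 -> "Fizz",
    multiples of 5 -> "Buzz", multiples of both -> "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

fizzbuzz(5)  # -> ["1", "2", "Fizz", "4", "Buzz"]
```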

The frustration I have with the software engineering interview process is
enough for me to want to change fields at times. Thankfully I've found the
interview process for more research-style groups to be less about coding
acrobatics and more about conveying previous research experiences and
successes as well as proving competency and curiosity. I have a pleasant role
as a research engineer where I'm doing very interesting research work while
also maintaining my coding skills. Unfortunately such jobs are hard to find in
industry, which means that one day I'm going to have to take up the gauntlet
of software engineering coding interviews all over again.

~~~
xwolfi
But Google doesn't want you; you want Google. They want the process to be as
annoying, unfair and difficult as possible because they have entire countries'
worth of people who want to work there. Your internship was free, so, well, it
was easier to sell you to management.

Go to a small mom-and-pop startup where you do absolutely everything from
rebooting servers to convincing clients to pay you. That'll be a thousand
times more valuable than 10 years at Google sleeping between desks pretending
to change the color of a Gmail icon :)

------
sbov
The way some technical interviews test technical skills would be like testing
communication skills by asking candidates to give a public speech.

Then talk about how you "dodged a bad one" because they couldn't say a single
sentence without an "uhm".

~~~
dahfizz
I don't understand this mentality at all. If I'm hiring you to do X, I want
some sort of confirmation that you can actually perform X. What's the
alternative? Hiring the first person who applies?

~~~
inertiatic
Hiring the way hospitals hire doctors.

I don't know, do they have them dissect other species while explaining their
thought process?

~~~
dahfizz
The vital difference is that doctors have doctorates. Becoming a doctor is an
incredibly rigorous process and you can be confident that anyone who is a
doctor is competent. You don't need a demonstration of their skills.

Any shmuck can spend a week on Leetcode and call themselves a programmer.
There needs to be a way to sort out that type of applicant in a field where so
many are 'self-taught'.

------
kortilla
>How could someone with 10 years experience on their CV be unable to start a
new project in their own editor

Said nobody with 10+ years of experience. Starting a new project in an editor
is as obscure as changing the color scheme in your editor.

~~~
danjac
Generally when I start a new project I'll do it differently from last time. It
might be a different language or framework; maybe it uses Docker or not; maybe
the current best practice or boilerplate has changed in the last six months.
It might be different for people in agencies, but then they might have a code
generator or starter pack to speed things up.

------
torgian
I haven’t read all the comments, but...

My question is, do companies give interviewees the option to work on a problem
the company is currently facing?

If not.... why not?

~~~
ajmurmann
I work at Pivotal where we pair program with candidates on work that's usually
actually in the backlog. In some cases you might have to replay a story you
did already if you don't have a new one that's well suited. One of the biggest
benefits is that the candidate also sees what they are getting themselves
into. I don't think it would work without pair programming, though, and it
might make less sense in an environment where you won't pair once you've got
the job.

------
anbop
This isn’t going to change in the near term, because the companies in the
middle emulate the companies at the top, and the companies at the top are
doing just fine with their current hiring processes. Sure, they’re missing out
on some good people, but they have enough excess demand for employment that
they can keep their cubicles well-stocked with enough C++/Java/Python coders
to accomplish their business objectives and make tens of billions of dollars
per year.

Telling them they should change is like telling Tom Brady he should change the
way he holds the football — even if you’re right, he’s not going to listen to
you while he’s still winning the Super Bowl every other year.

------
a_imho
All day interviews with no compensation are unprofessional bordering on
insulting.

