
Lessons from 3,000 technical interviews - leeny
http://blog.interviewing.io/lessons-from-3000-technical-interviews/
======
forrestbrazeal
The author draws a hard distinction between Udacity/Coursera MOOCs (good) and
traditional master's degrees (bad). I'll interject that with Georgia Tech's
Online Master's in Computer Science program [0], which is delivered via
Udacity and insanely cheap [1], you can get the best of both! (Their
"Computability, Complexity and Algorithms" class is one of the top Udacity
courses cited in the article.)

Keep in mind that a traditional degree program does have a huge advantage over
a strict MOOC: accountability. It sounds good to say that anybody can go push
themselves through one of these courses. Try pushing yourself through ten, and
actually writing all the papers and implementing all the code, while working
full time and having a family. That grade looming at the end of the semester
really does wonders for your motivation. Plus you can get help from live
professors and TAs, and the Piazza forums for OMSCS are full of smart, curious
students who love talking about the subject at hand. There's a richness to the
degree experience that I don't think you get with scattered classes.

(Obvious disclaimer: I'm a current OMSCS student)

[0] http://omscs.gatech.edu

[1] https://www.omscs.gatech.edu/program-info/cost-payment-schedule

~~~
dibstern
I'm studying a 3-year coursework master's that allows people with no
background in CS to participate (they just had to do an additional semester of
courses), and it aims to give a pretty well-rounded CS education. Not a common
thing, apparently.

I completely agree: the motivation and environment it gives you, the
reinforcement that you're doing something official and serious (as opposed to
something that very few people outside the software industry take seriously),
the structure, the fellow students you meet and befriend - it's all huge.
Additionally, looking at the Udacity/Coursera material, a lot of it is shallow
and poorly taught, and it would not have taught me nearly as well as my
master's program has.

~~~
ortekk
Could you please elaborate? It looks like this program is only for people who
already have a CS-related degree.

What courses are you taking?

------
graffic
"Whether passing an algorithmic technical phone screen means you’re a great
engineer is another matter entirely and hopefully the subject of a future
post."

This sentence, plus the inverse correlation between experience and "interview
performance" shown there, smells strongly of interviews that are biased toward
the platform itself rather than representative of real technical interviews.

From the data, it looks like the questions asked on that service are the kind
you learn in university, and that knowledge fades after many years of not
using it.

This is reinforced by MOOCs being the 101 of the subject they're dealing with.
It would be interesting to see if there are trivia questions from 101 courses.

The most obvious bias is in the clickbait title. Those 3K interviews are on a
specific platform, meaning they're done in a specific way.

So after checking their results it seems that interviews done using that
service benefit people with fresh university or 101 lessons knowledge.

What worries me more is the lack of improvement and perhaps the moral
superiority of ending the article with a "these findings have done nothing to
change interviewing.io’s core mission". It feels like the entire statistics
game shown there was to feed back what they already knew.

~~~
wheaties
Yes, because we all know that your ability to implement a red-black tree from
memory has a direct correlation with your ability to implement some random
business logic or CRUD app. Oh wait, they don't.

~~~
blindhippo
I used to think along these lines. Then I started doing 10+ interviews a month
and realized a very clear reality: basic CS knowledge and problem-solving
skills are far more important to me and my team than knowing how to slap
together some semblance of a working CRUD system.

I ask "algorithmic" questions, normally expressed as a legitimate business
case (invent a real world problem, solution is implement some algorithm or use
specific data structure). My warm up question typically is a simplistic "find
the subset in a given collection that matches this specific criteria" (with a
subtle implication of "do it efficiently"). The average coder should be able
to solve this type of thing, on their own, in about 10 minutes max, 15 with
some feedback on improvements.
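To be concrete - the specifics here are invented, not my actual question - a
warm-up of that shape is roughly:

```python
# Invented instance of the warm-up (not the real question): from a
# collection of (customer, amount) orders, find the subset over a
# spending threshold. One O(n) pass, no nested loops.
def orders_over(orders, threshold):
    return [(customer, amount) for customer, amount in orders
            if amount > threshold]

orders = [("ana", 120), ("bo", 45), ("cy", 300), ("di", 99)]
print(orders_over(orders, 100))  # [('ana', 120), ('cy', 300)]
```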

Yet, 80% of my candidates take nearly 45 minutes and cannot deliver a workable
solution without massive handholding, and I don't even get to my higher order,
"real questions". The scenario of a coder who can't solve my warm-up being let
loose on code I actively maintain makes my stomach churn.

Until I see the average bar for problem solving go up, I'm going to keep
asking basic CS questions in my coding interviews. The job is to solve
complex, typically ambiguous problems. Coding is one of the tools - and I want
peers who understand the theory behind using those tools.

(I should note, I tend not to pay attention to credentials on a resume. I care
more about ability to do the job than past history - though if a candidate has
a masters in some field of CS, I might delve into it a bit out of curiosity...
they are an expert after all)

~~~
FLUX-YOU
>The average coder should be able to solve this type of thing, on their own,
in about 10 minutes max, 15 with some feedback on improvements.

>Yet, 80% of my candidates take nearly 45 minutes and cannot deliver a
workable solution without massive handholding, and I don't even get to my
higher order, "real questions".

You need to ask yourself why you believe the "average coder" should be able to
solve that because clearly your beliefs are not founded in reality.

This is what I cannot understand about interviewers who are constantly
frustrated with the population's skillset: you obviously have higher skill
standards than the average. That is fine. Just accept that you will make fewer
hires, because none of you are capable of fixing the entire population's skill
levels.

~~~
timr
_" You need to ask yourself why you believe the "average coder" should be able
to solve that because clearly your beliefs are not founded in reality."_

Oh, stop.

The problem he describes is trivial, and something that you'll encounter as an
entry-level web developer on a regular basis. If you can't solve it, you're
absolutely not up to the job. In fact, I'll go further: if you _literally_
cannot find an efficient way to filter a list of stuff based on a criterion,
you're not even a programmer yet. It doesn't matter if you've "written" a
dozen toy webapps by stringing together NPM modules -- not knowing these basic
things makes you a danger to any team that hires you.

You can't judge the quality of a test exclusively by the number of people who
fail it. If you resume screen for "has written code before" and 80% of your
applicants fail that test, is your standard set too high?

(In case you're wondering, that's not a hypothetical example.)

~~~
FLUX-YOU
Is he talking about this problem (with a filter at the end)?

http://www.geeksforgeeks.org/finding-all-subsets-of-a-given-set-in-java/

Because I have never done that. That's different than just running through a
list and picking out items that meet a criteria.

~~~
timr
Not that I can see. The OP said:

 _" find the subset in a given collection that matches this specific
criteria"_

So basically, a loop through a single table. That's as simple as it gets. You
can make the problem more complicated, of course (e.g. _" write a method to
find the minimum and maximum ages of the male users"_), but it's still pretty
simple stuff.

A slightly less trivial "algorithm" question that should be equally easy for
any decent programmer: I give you a document of english words. Write a
function that counts the words, and returns the top ten words seen, by
frequency. Now...solve the same problem when it's not a single document, but a
stream of words of unspecified length. Don't run out of memory.
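For reference, the single-document version is a few lines of Python; the point
of the streaming twist is just noticing that memory scales with vocabulary
size, not stream length:

```python
from collections import Counter

def top_k_words(words, k=10):
    # Works on a list or any one-pass iterable/stream of words.
    # Memory is O(vocabulary), not O(stream length): English vocabulary
    # is bounded, so a plain counter never blows up on real text.
    counts = Counter()
    for word in words:
        counts[word] += 1
    return counts.most_common(k)

stream = iter("the cat sat on the mat the cat".split())
print(top_k_words(stream, k=2))  # [('the', 3), ('cat', 2)]
```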

~~~
SomeStupidPoint
I would like to see a solution to your problem with bounded memory.

In particular, the case where I want the top 3 words, you don't know the
length of the stream, and you get random permutations of the same 4 words
until I stop emitting them (where I will end by emitting 3 to break the tie).

That's not to say your problem isn't interesting -- just that while
specifically constructing a problem as an example, you created one that's
unbounded in required memory (in both storage per value and potentially number
of values to store) and then demanded a solution that doesn't run out of
memory.

I think most interview questions are similar nonsense.

~~~
timr
_" I think most interview questions are similar nonsense."_

It's probably a good idea to be careful with your words when you admit that
you don't know the answer to a question.

First off: your example (random permutations of the same four words) doesn't
require much memory at all. So if you think it does, you're wrong. You might
overflow your counters, but that's a different problem.

A stream of random gibberish is certainly more challenging. But the
cardinality of the English language isn't infinite (the OED has about 230k
words, and that's with a lot of words that nobody ever uses), so even a naive
solution doesn't require "unbounded memory", as long as you take the problem
statement seriously and don't do something ridiculous. That would be good
enough to pass an interview.

But OK, let's say you _do_ have a stream of random latin-encoded gibberish.
What then? The problem statement is that you have to determine the top-10
words (or in this case "tokens") by frequency. The cardinality of the set is
infinite, but the probability of duplication per token is small, and the
output set is tiny. Do you really think you _need_ unbounded storage?

In any case, even if you think a problem is "nonsense", it's probably true
that the interviewer has thought about it more than you have. The part that
frustrates you is highly likely to be the bit worth probing. A bad candidate
will bomb out immediately; a decent candidate will provide a solid, if not
perfect solution; a _great_ candidate will solve the problem, see the broader
theoretical aspects, and investigate those as well.

~~~
dsp1234
_Do you really think you need unbounded storage?_

This is an example of a trap that interviewers run into when they try to
arbitrarily reword questions.

While I could be mistaken, an exact solution to the top-k problem requires
O(N) space where N is the number of distinct items. I can trivially think of a
stream of tokens that would defeat any reasonable computer in both available
memory and general "storage" (ex: 1 quintillion distinct words, then the next
10 words are a duplicate of an existing word, then end of stream). Since you
asked for an algorithm and not a heuristic, it's clear you are looking for an
exact answer. So the answer is "Yes, I would need unbounded storage for the
new question as asked". Since this is an interview, I'd expect that you'd want
me to give you the technically correct and accurate answer.

I've tripped up enough interviewers who have tried to slightly reword
questions, but ended up changing their meaning.
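For what it's worth, the textbook bounded-memory approach here - which an
exact-answer reading of the question rules out - is the Misra-Gries heuristic:
at most k-1 counters, guaranteed to retain any item occurring more than n/k
times, at the cost of approximate counts. A sketch:

```python
def misra_gries(stream, k):
    # At most k-1 counters. Any item occurring more than n/k times in a
    # stream of length n is guaranteed to survive as a candidate; counts
    # are undercounts (off by at most n/k), not exact frequencies.
    counters = {}
    for item in stream:
        if item in counters:
            counters[item] += 1
        elif len(counters) < k - 1:
            counters[item] = 1
        else:
            # Decrement every counter; drop the ones that hit zero.
            for key in list(counters):
                counters[key] -= 1
                if counters[key] == 0:
                    del counters[key]
    return counters

print(misra_gries(iter("a b a c a b a".split()), 3))  # {'a': 3, 'b': 1}
```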

~~~
timr
Well, "algorithm" is just a way of saying "procedure", so a heuristic applies
(IMO). I would accept a reasonable heuristic without much argument because the
problem of random text is much harder...but honestly, if you got to this part,
you've already passed the interview. A total flameout on this question is not
even realizing that the storage scales with vocabulary size, instead of
sequence length.

An _exact_ solution to an unbounded sequence of purely random text probably
does require unbounded memory (I say 'probably' only to hedge my bets here).
But depending on the definition of "random", you can put pretty tight bounds
on it and only be off by a little. I haven't done the math, but my gut says
that a bloom filter, followed by incrementing counts only for positive hits
from the filter, would scale well. There may be simpler approaches that make
use of word length and the size of the latin alphabet (e.g. "all tokens of
size N have 1/C(26,N) probability of colliding if letters are chosen from a
uniform distribution, therefore...")
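A rough sketch of that bloom-filter-gated counting (all sizes here are
arbitrary and untuned; a false positive can promote a one-off token early,
slightly inflating its count):

```python
import hashlib
from collections import Counter

class BloomFilter:
    # Minimal Bloom filter; bit-array size and hash count are illustrative.
    def __init__(self, num_bits=1 << 20, num_hashes=3):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item):
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.num_bits

    def add(self, item):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def __contains__(self, item):
        return all(self.bits[pos // 8] & (1 << (pos % 8))
                   for pos in self._positions(item))

def top_k_gated(stream, k=10):
    # Only allocate an exact counter for a token the *second* time it
    # appears, so one-off gibberish never enters the hash table.
    seen_once, counts = BloomFilter(), Counter()
    for token in stream:
        if token in counts:
            counts[token] += 1
        elif token in seen_once:
            counts[token] = 2  # first sighting + this one
        else:
            seen_once.add(token)
    return counts.most_common(k)

print(top_k_gated(iter("x a b a x c a".split()), k=2))
```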

But again, if we actually had this conversation in an interview, there'd be no
danger of not passing. Unless you were a jerk or something.

------
fecak
Thanks for writing this, Aline. As a recruiter of almost 20 years, I wish I
had access to all my data and the time to compile it; anecdotally, I'd expect
the finding about MOOCs would be similar.

The most selective of my hiring clients over the years tended to stress
intellectual curiosity as a leading criterion and factor in their hiring
decisions, as they felt that trait had led to better outcomes (good hires)
over the years. MOOCs are still a relatively recent development and new option
for the intellectually curious, but it's not much different than asking
someone about the books on their reading list.

Unfortunately, demonstrating intellectual curiosity often takes up personal
time, so someone with heavy personal time obligations and a non-challenging
day job is at a significant disadvantage. One could assume that those who have
the time to take MOOCs also have time to study the types of interview
questions likely favored by the types of companies represented in this study.

Thanks for continuing to share your data for the benefit of others.

~~~
enjo
"Intellectual curiosity" was my primary attribute for many years, particularly
when I was young and didn't really know any better. It makes sense, doesn't
it? Someone who is interested in learning for learning's sake will be a better
developer!

Except I'm convinced it's not really true. It's something that is horribly
subjective and really self-selective. It's funny: intellectually curious
people often have exactly the same interests as whoever is doing the
interview. I find that nearly everyone loves to learn; you just have to find
the thing they're interested in learning about.

The signal that I have found to be a great indicator of success on my teams
isn't about curiosity at all. It's about attention to detail. In a world of
scatter-brained developers who never seem to really follow through on
anything, those guys are the real unicorns.

Our interview process is now designed to bubble that to the top. Vague
programming problems with poorly defined requirements provide a platform by
which we can see how someone digs into problems. I'll ask them to send me a
couple of things after the interview; it's a really good signal when they pull
out their phone and add it to a to-do list.

Those guys may not always be the "smartest" or the most interesting, but man,
when you're going to spend months working through a really large project, they
get stuff done.

~~~
tomc1985
You had me until, "it's a really good signal when they pull out their phone
and add it to a to-do list."

Because it's just one of many possible courses of action for staying
organized. For all you know, the guy added it to his to-do list on his way to
the car, or maybe he employs some other mechanism. While you could argue that
an interview is a showboating environment where one is expected to signal
certain desirable traits, I personally resent that these attributes seem to
manifest in recruiters as specific actions that reinforce _their_ perceptions
of a good candidate -- if someone is unconventional, then how is looking for
specific signals going to tell you anything useful about them?

~~~
enjo
It's a signal. It's not a binary yes/no, there are many observations we're
making throughout the interview. This is just one, and the to-do list isn't
the only correct answer. Asking me to email those details to them is another
possible right answer. Writing a note in a notebook is fine.

The key thing to remember here is that we're asking this very much in the
middle of the interview, and I just want to see that they have some sort of
system in place for remembering these things.

In my experience, most of the time, simply going "sure, I'll do that" and
depending on memory to actually get it done is a negative signal. Maybe they
are particularly gifted in terms of memory, but in my experience that is
rarely the case.

In hiring we're willing to accept false-negatives, but never false-positives.
You are correct, I may let a perfectly good candidate walk, but I'm willing to
accept that as the cost of hiring a bad one is just too high.

~~~
stolsvik
I would not have written that down, but I can assure you that I would remember
to send you whatever you asked for: an interview setting is very special and
tends to produce hyper-focus. I wouldn't walk out of there wondering the
typical "man, that was intense - wonder how it went??" and then forget to send
you the things you asked for!

Unless your request was very specific and strange ("send me the completed
k3-45 and t9807 forms"), in which case I'd have to jot /that/ down.

Should you not rather evaluate whether your request was fulfilled?

~~~
enjo
So the requests are often a bit strange (though relevant to the interview).
Usually I ask for a citation for some claim made during the interview. I also
make a point of asking in the middle of the interview, so there's lots of time
to forget.

Again I do want to stress: This is a small detail as part of a larger
interview. Screwing this one thing up isn't going to sink you, but we combine
it with a bunch of other signals and observations to make decisions. I've been
doing this for several years, and I'm very happy with the results.

------
blazespin
I am perplexed why anyone would think that interview performance has any
interesting statistical relevance. Much more interesting would be how
successful the candidate was after receiving a job at the company.

~~~
joshvm
The conclusion is a bit obvious: if you ask people algorithms questions, the
people who do the best will be people who spent a lot of time rote learning
algorithms material. This means schools which place a lot of emphasis on the
theoretical core or people who've taken online algorithms courses.

I'd be much more interested to see performance on other questions e.g.
Google's typical curveball for new grads is an architecture question like "How
would you implement YouTube?".

~~~
drd93
These are not curveballs though. Nowadays it's just another category of
interview question - system design. There are tons of resources to prepare for
this type of question, just as with the algorithms & data structures problems.

------
closed
Interesting article! Some minor statistical pet peeves:

1. Setting non-significant bars to 0 seems fishy. Leaving them in and putting
confidence intervals on everything would let them speak for themselves.

2. Calling something "effect size" is ambiguous. That's like saying you
measured distance in "units" (and the wiki article on effect size linked makes
clear there are a billion measures of effect size).

I'm guessing their measure of effect size was the beta coefficients in a
multiple regression?

~~~
eanzenberg
Not minor at all. It's very easy to mislead with statistics, intentionally or
not, and that's why rigor is needed. No details are given for the analysis,
not even basic things like the definition of what "top university" or "top
company job" means.

It's just amazing to see how many positive comments a post like this gets
without even the hint of methodology.

------
geebee
Interesting bit on the MS degree. I followed the link, and I'm not quite as
surprised that the correlation is poor, or even negative, given the way the
data was collected and analyzed.

Absolutely agree that some MS degrees are pretty much less rigorous cash cows
by now, that allow students to skip the fundamentals such as data structures,
operating systems, and compilers.

However, many CS MS degrees actually do require this as a background, to the
point where some programs have emerged to prepare non-CS majors for MS
degrees, kind of like those post-bac premed programs. It's hard to believe
that those MS degrees, which require a decent GPA in those core courses, along
with high GRE scores (sorry, but we are talking about interviewing skill,
which may be more related to exam taking ability than job performance),
wouldn't result in a similar profile to people with CS degrees from top
schools.

This is fully acknowledged in the text of the article referenced in a link,
but unless people follow it, I do think the message may be a bit misleading.

That's an aside, though. The value may very well be in the prep for these
degrees (i.e., the post-bac CS coursework required for admission to a
reputable MS program). If you can get that through online courses (Udacity or
Coursera) via genuinely rigorous self-study? Yeah, that might do it, for far
less money. I've audited a few of them, and they're the real deal - that's the
real coursework there.

~~~
matt_wulfeck
> _Absolutely agree that some MS degrees are pretty much less rigorous cash
> cows by now, that allow students to skip the fundamentals such as data
> structures, operating systems, and compilers._

At what point do we not consider operating systems and compilers
"fundamental"? What percentage of CS/programming jobs require deep knowledge
in these arenas?

~~~
kowdermeister
It should be optional or advanced-level. If you are a web developer, for
example, OS and compilers are not at all necessary to get along and have a
healthy career.

~~~
guntars
I disagree, compilers are relevant to web development because of all the
compiling and transpiling that a single source file goes through in large
application development.

~~~
kowdermeister
If you are interested in _writing_ compilers then sure. If I just want to use
those compilers, then I don't really need to understand them just use them as
specified.

Now you'll say that I'll make a mess if I don't know what the compiler does.
That's what best practices are for if, say, I pick TypeScript. Also,
optimization is the compiler's responsibility.

------
k2xl
Interviewing is a funny thing.

I remember when I graduated from a "Top School" and interviewed at "hot
startups" from the valley. I aced a lot of the interviews - why? Because I had
just taken classes on LinkedLists, Binary Trees, HashMaps, etc... So when they
asked me to whiteboard a "shortest path algorithm" it was just rehashing what
I did in school.

Years later, looking back, I fail to see the relevance in most of the
technical questions. In fact, if I had to do those questions over again today
I would probably fail miserably. Yet, I have been in the industry for a while
now and have worked with countless more technologies and have accomplished far
more than my younger self.

Just because someone performs well in a technical interview doesn't mean they
will do a good job. That is the data that really matters. I've interviewed
hundreds of candidates as a hiring manager for some big startups, and from my
experience technical interviews are not a great indicator of success.

I'm saying this coming from someone who has gone to a "Top School" and done
multiple Coursera/Udacity/etc classes.

Yes, someone might be able to whiteboard a random forest or write a merge
sort, but do they know how to engineer a system? Can the candidate:

> Communicate well with others in a group?

> Solve unique technical problems?

> Research and learn new technologies effectively?

> Understand how to push back to product owners if there's scope creep?

etc...

These are all things that are not really analyzed in many technical
interviews.

As I'm reading this analysis all I can think of is that it is pretty useless -
if not dangerous for the industry.

What I've found is that it is critically important that someone knows how to
code at some basic level. But the ability to code and explain algorithms on
the fly, while probably relevant in academia/research, is such a minor part of
a programmer's day-to-day - at least in my experience.

~~~
phd514
That has been my experience, too. I remember deriving the optimal solution for
an interview question that was a variant of a radix sort and impressing the
interviewer for one of the hot startups at the time. That was partly the
result of my ability to recognize the similarities between the problems but
even more a consequence of having recently studied radix sorts. Twenty years
later, I'm a much better developer than I was then and have shipped many
projects with significant degrees of complexity, but I couldn't answer that
interview question or ones like it as well today as I did then.

------
Futurebot
The takeaway from this is that those who do best are those with:

- the wealthiest/most financially supportive parents/relatives

- upbringings that are conducive to academic success

- the most free time

as those are the ones who, by a large margin, attend top schools, work at top
companies, and have time to spend on self-learning. Another data point of
confirmation of a well-studied idea.

Assortative mating: http://www.economist.com/news/briefing/21640316-children-rich-and-powerful-are-increasingly-well-suited-earning-wealth-and-power

Few poor at rich schools even all these years later: https://www.nytimes.com/2014/08/26/education/despite-promises-little-progress-in-drawing-poor-to-elite-colleges.html?_r=0

Why people care about elite schools: https://medium.com/@spencer_th0mas/why-america-cares-about-elite-schools-411798386ba7#.yream7cae

~~~
mgraczyk
I did not see in the data set where the interviewee's family wealth or
upbringing were mentioned. Would you mind providing a link?

~~~
Futurebot
"Having attended an elite school" is a proxy for that. Info available at 2nd
link.

~~~
morgante
It's not that good of a proxy though. There are still tons of poor students at
elite universities.

~~~
jacalata
From the article:

> In 2006, at the 82 schools rated "most competitive" by Barron's Profiles of
American Colleges, 14 percent of American undergraduates came from the poorer
half of the nation

That doesn't sound like 'tons'.

~~~
morgante
14 percent is still thousands of students. Equating elite universities with
privileged upbringings is stereotyping. Like most stereotypes, it's grounded
in truth but also unfair.

I worked really hard to be able to get into an elite university and to get the
scholarships to pay for it. When people automatically assume that my education
means my family is wealthy, it discounts the enormous amount of work which I
(and other low-income students) did to get there.

~~~
stolsvik
You are confirming the point that was made!! It is harder for "poor folks" to
get in. It is easier for wealthy kids to get in.

You should add "...and I had to work my way in there - my folks aren't
wealthy!" as an aside every time you mention your education. I bet most people
would immediately understand what it implied: this is an achiever!

~~~
morgante
No I'm not.

Even if fewer poor students get in, that definitely does not mean you can make
the reverse assumption: that everyone who attended an elite university is
wealthy. For all we know, these results are dominated by poor students who
worked hard enough to get into an elite university (which, by the way,
wouldn't be a crazy assumption: my CS classes had a lot more income diversity
than, for example, political science ones).

------
Apocryphon
Not to harp on the "technical interviews are disconnected from actual work!"
angle too much, but I'm reminded of a comment from a thread about the creator
of Homebrew failing a Google interview. Someone pointed out that it goes to
show that it's possible to create widely-used software without an intimate
knowledge of CS. I wonder if that's a disconcerting fact for some employers to
grapple with.

~~~
AndyNemmity
Made a similar comment above, but that was entirely true for myself. I have
worked on amazing things as we all have in life, petabyte databases, and just
the most incredible setups.

But when it comes to an interview, I cannot get a job several levels below my
own (without referrals).

With referrals, I am offered the moon and the stars, complete pick out of a
tremendous number of amazing jobs.

But any random interview, and I totally bomb it. I wonder how many out there
are like that as well. It feels rather strange from my perspective, but maybe
it's fairly normal?

~~~
maverick_iceman
You cannot get a job at a top company without a referral or a recruiter
looking you up. Any such company is totally swamped by the number of resumes
submitted, and the chance of getting an interview is practically zero.

~~~
whenwillitstop
This is not even close to being true. People get hired into top companies all
the time.

------
ma2rten
Until recently I worked at a startup as Machine Learning Engineer/Data
Scientist. There I got some experience interviewing people and looking at
their resumes. In my experience, which is very limited compared to this post,
people who put a MOOC on their resume are usually less qualified than people
who don't.

There is nothing wrong with MOOCs, but they are almost always beginner-level.
If you put them on your resume, it kind of implies you don't have a lot of
experience beyond that. Putting the Coursera Machine Learning course on your
resume would be the equivalent of a software engineer putting Java 101 on
theirs.

I would recommend that anyone put projects on their CV instead. Even if you
don't have a lot of work experience, just put side projects and school
projects on there.

~~~
jghn
I think the point here is folks taking MOOCs for enrichment, not as a gateway
to a job. For instance, someone taking the ML course just for the knowledge,
not to try to get in the door as a data scientist.

~~~
ma2rten
Yeah, like I said, there is nothing wrong with MOOCs, but I would not put them
on my resume as a candidate nor would I value them as a positive signal from
an employer's point of view when I see them on a resume.

------
collyw
Interesting and surprising, especially the experience thing. I think I am a
significantly better engineer than earlier in my career, so I assumed
experience would count for a fair bit. Then again I have inherited projects
from experienced guys who make crap high level architecture decisions and the
code is way more difficult to work with than it ought to be.

But then this article seems to be measuring interview performance, not actual
ability on the job. So is any of it actually relevant at all?

~~~
hibikir
Most of the things that make us better during our career have nothing to do
with programming faster: If anything, they can make us program slower, and do
worse in interviews.

For instance, an interview I took last year came with a premise that I found
downright bonkers. It was based on code the company had in production, but it
stopped being a good problem to look at years before. I was having trouble
coming up with the right tradeoffs for the implementation because all my
experience was telling me that the entire approach was misguided in the first
place, so the problem should not be solved. I passed, but it was a far rougher
performance than I would have liked.

There's also how being far from college makes the least important knowledge
gained fade away, and the least important thing I studied was memorizing
algorithms. I write new algorithms at work sometimes, and I implement off the
shelf ones too, but I don't have to recall them off the top of my head. Nobody
has to implement distributed consensus algorithms under time pressure, or
write HyperLogLog from memory.

So ultimately, there's what's easy to measure, and then there's what is
valuable and important. We go with what is easy, and those are things that are taught
in college. Understanding the right level of testing or designing a system for
observability are far more valuable in the long run: It's crazy how much
downtime in well known companies comes from people not learning those things
in college. But since we are bad at measuring those things, and kids right out
of school don't know them, we don't interview for that.

And sadly this is why we all end up hiring by network so much: We can't tell
if someone is good in a day, and we can't really ask people to dedicate two
weeks to work with us in a probationary period if they have real jobs, but we
sure can recall quality former coworkers and ask them to join in.

~~~
nickbauman
Most pedagogy, over time, gravitates toward what's easy to measure, not what's
valuable to learn. So we've been very lucky in hiring by network even though
we kinda feel terrible about it. On the other hand, we're hiring more women
now too as a result, not least because women are now much more conscientious
than men about managing their networks of fellow female developers.

------
bhntr3
I wonder if "took courses" could be a stand-in for "prepared heavily". It
seems like people with all the other attributes might think they didn't need
to study, while people without them might think they did, and took courses to
"catch up". In my experience, preparation is the key driver of performance in
these types of interviews.

It seems reasonable that a person who took a MOOC might have prepared in other
ways as well while people who didn't probably didn't prepare much at all
(since watching a few Algo lectures seems the most accessible refresher.)

------
chewyshine
Top school is probably serving as a proxy for intelligence in this
analysis... a well-known predictor of both interview and actual job
performance.

~~~
nickbauman
Top school is also serving as a proxy for wealth and privilege. Good grades
are also a proxy for wealth. Kids who have rich parents get private tutoring.

~~~
FT_intern
The privilege to value education and to spend time studying, yes. Almost all
resources can be found online nowadays. If you've ever been to an SAT tutoring
class, you know that the only benefit the kids have is the benefit of being
forced to take practice tests.

A top school is a good signal for how much time someone spent studying in high
school, except for affirmative action students who get into top schools with
much worse scores and GPA.

~~~
throwaway_374
> except for affirmative action students who get into top schools with much
> worse scores and GPA.

This dismissive "quota tokens" attitude really irks me. It's one thing getting
in; it's a totally different ballgame surviving and coming out the other end
with a decent GPA. I've seen people of all socioeconomic backgrounds fail and
some from remarkably poor backgrounds do exceptionally well.

~~~
aianus
Getting in is harder than surviving or even excelling. The median grade at
Harvard is an A and the graduation rate is 97.5%.

------
grogenaut
Unless I missed it in the article, the data is all about passing the
interview, not actually seeing whether any of these things correlate with the
employees working out over 1-, 3-, or 5-year time spans.

With this data you're just biasing towards people who interview well, which I
don't think you actually care about.

Well, I mean, I guess you do if you're a recruiter (if you're a moral recruiter
you care about both), but not really if you're an employer.

------
memracom
I think you are seeing the effect of people who have decided for themselves to
pursue lifelong learning. The Udacity/Coursera thing just clusters these
people in a way that you notice them in the stats. But remember that
statistics do lie. You need to dig into the reality behind the numbers, and
question whether you are measuring all the right indicators.

My experience comes from several decades developing software and from time to
time, hiring people. The people that worked out best, either as colleagues or
hires, always seemed to be learning new things and were ahead of the curve
trying out new techniques or tools before they became popular.

If you understand how a tool/technique becomes popular as the mass of software
developers wrestle with new problems and finally find a way to master them,
then it makes sense that constant learning makes some people stand out from the
crowd. They happen to be the first ones to learn the new tool/technique, and
even if they do not introduce it to their development team, then when
management does make the decision to introduce it, the folks who know how to
drive it have a chance to excel and appear to be rocket scientists.

------
sundvor
Searched the article and the comments here for "Pluralsight", with zero hits.
So what makes Udacity/Coursera preferable? TL;DR: I'm asking this as
Pluralsight was a significant contributor to my landing my latest role after
redundancies.

The long version: I recently landed a role after some time off, having changed
from mainly back end Php/Coldfusion to C# in the last year. I was able to make
the switch in my last role. For me, moving to C# was a big transition; as well
as guidance from a (fantastic) mentor, I used Pluralsight to learn C#, asp.net
and DDD - e.g. from Jon Skeet, Scott Allen and Julie Lerman, to mention but a
few.

Being completely burnt out on the old stacks, I was set on making my next role
a C# one. I've come to love what Microsoft are doing with Core, open sourcing
etc., as well as the statically typed C# language and the ability to use
NCrunch with live unit tests. So I signed up for a year after relinquishing my
corp subscription, kept doing their courses, and found the training material
highly accessible, with great quality content. Each interview was a learning
process: when I didn't know something from a test, I'd go and study it so that
I'd be better prepared for the next role. One of these was the study of data
structures and basic computer algorithms, where I was lacking. I might not
have had years of experience, but the experience I had was mostly best
practice.

During my search, I typically got great feedback on the fact that I was doing
Pluralsight courses, and it was a significant factor in being hired for the
new role - it showed cultural fit, in addition to passing their tech tests
(which happened to involve structures). My company had interviewed a _lot_ of
candidates, struggling to find the right talent. Just possessing technical
skills is one thing, having the right attitude towards learning is another.

At any rate, I'll keep using Pluralsight to raise my proficiency in my new
stack - even as an old timer, I am having a newfound level of enthusiasm
towards my whole profession which I haven't felt since I coded in assembly on
the good old Amigas. I would be interested in knowing why Coursera / Udacity
might be better or more accepted in the marketplace though.

------
faitswulff
It's rather shocking how much effect Udacity/Coursera had on interview
performance - more than graduating from a top school or being employed at a
top company:

"...only 3 attributes emerged as statistically significant: top school, top
company, and classes on Udacity/Coursera."

~~~
Alex3917
> It's rather shocking how much effect Udacity/Coursera had on interview
> performance

Having done Udacity and also taken a couple of CS classes at Cornell, I'm not
surprised at all. People who take Udacity classes are doing so voluntarily on
their own time, so they are generally going to be smarter and of higher
socioeconomic status.

If you look at people who take CS classes over the summer at college when they
don't have to be and when there are no student loans, you're going to get a
similar population.

~~~
joshvm
I don't buy the high socioeconomic status bit. MOOCs are hugely popular in
countries like India where education is extremely competitive. I help run an
international science camp for late-teenage kids. More and more we're seeing
applicants who list the MOOCs that they've taken, and most often they're not
from 'developed' countries.

If anything the opposite is true. MOOCs have enabled vast swathes of
economically challenged people to learn from high quality video material,
whereas before they were limited to textbooks.

The key is that these kids are absolutely determined. And similarly, if you
really want that job at one of the Big Four, you might also consider soaking
up as much prep material as you can.

------
ordinaryperson
The master's in CS can be useful if:

1. You have an undergrad degree in liberal arts
2. You pay as little tuition as possible
3. You take no time off and continue to work FT

These apply to me -- my undergrad was in English, I paid 6k total (27% of the
21k total cost) and went to school at night over 4 years while my career
continued to progress.

Most of the people in my program couldn't write a FOR loop if their life
depended on it; they viewed it (incorrectly) as a jobs program, while the
school needed the $$ to keep the dept afloat, so I'm not surprised they fared
poorly in technical interviews.

But that doesn't mean the degree isn't useful. If you're already a programmer,
it helps get your foot in the door at many places. HR managers/recruiters feel
more confident forwarding on your résumé; they can't parse your GitHub repos.

The degree is icing on the cake; it's not going to magically turn you into the
Cinderella of Programming if you have no real-world experience. I got my
master's with a QA and a paralegal, and today? They're still a QA and a
paralegal.

That being said, timed technical interviews are almost universally asinine,
IMHO. When in real life do you have 10 minutes to figure out a problem? Or are
prevented from Googling the answer? The measure of successful programmers is
how efficient and professional they are in problem solving, not how much
useless information they can keep in their head.

Things I've never had to do in 'real' life:

- Never had to split a linked list given a pivot value
- Never had to reverse a string or a red/black tree
- Never written my own implementation of Breadth First Search

etc etc

Personally I'd rather see take-home assignments that roughly approximate the
type of work you'd do, which in my career has been churning out new features
or applications. Does knowing the time-complexity of radix sort vs heap sort
really have a material impact on your effectiveness as a programmer? No.

~~~
popcorncolonel
> The measure of successful programmers is how efficient and professional they
> are in problem solving

They are not asking you to split a linked list given a pivot value or reverse
a string without using a builtin function because that's what you're going to
be doing on the job. They're doing it so they can see how efficient you can be
given a new problem. That's why companies have interviewers sign NDAs - so
that their interview questions escape as little as possible and it's not just
problem memorization.

~~~
ordinaryperson
> so it's not just problem memorization.

How is it not?

Grasping the fundamentals of algorithms and data structures: important.

Memorizing the best greedy algorithm for traversing a linked list: not
important.

Any super specific answer to a problem is likely not worth committing to
memory because 1) it'll probably change as platforms evolve 2) Googling
specifics lets you save space in your brain for things that actually matter.

------
Bahamut
It should be noted that these technical interviews are biased to a particular
style, so the data only really is of relevance for these types of interviews.

------
acjohnson55
On the master's front, I went down a slightly unusual path. I enrolled in a
master's program in music technology at NYU [1]. I already had a master's in
engineering from Princeton [2], but after time away from the software world, I
wanted to retool for a return to engineering, but with a focus on applications
that actually mattered to me.

It turned out to be a very expensive, but very fulfilling decision, and it
paved a route for a very successful past four years.

Compared to my first master's, it was less theoretical and much more project-
based. In that sense, it was fantastic preparation for career work, because
every semester, I had to conceptualize and ship 4-5 different projects in all
sorts of subject areas. The value of that shouldn't be underestimated. It also
directly led me to cofounding a startup that had a brief lifetime, but
effectively converted me to a full-stack engineer.

Today, I don't use much of the subject matter I learned in my day-to-day, but
I draw on the creativity, problem-solving skills, and work patterns every day.

My Princeton program was great too, but I thought I'd share about the NYU
program, as that was the more outside-the-box choice. There's something
special to be said for a master's degree when it's interdisciplinary and
lets you focus on the intersection of engineering skills and subject matter
expertise.

[1]
[http://steinhardt.nyu.edu/music/technology](http://steinhardt.nyu.edu/music/technology)

[2] [http://ee.princeton.edu/graduate/meng-
program](http://ee.princeton.edu/graduate/meng-program)

------
sytelus
Yes, this is absolutely startling:

 _For people who attended top schools, completing Udacity or Coursera courses
didn’t appear to matter. (...) Moreover, interviewees who attended top schools
performed significantly worse than interviewees who had not attended top
schools but HAD taken a Udacity or Coursera course._

A possible explanation might be that people going through a regular degree
typically spread themselves thin over many subjects (digital electronics,
compiler design, OS theory, networking, etc.) while MOOC folks focus sharply
on exactly the things needed for interviews (i.e. popular algorithms). It's
like interval training for one specific purpose vs. a long regimen for fully
rounded health. The problem here is not the academic system but how we measure
performance in interviews. I highly doubt the results would be the same if
interviewers started asking questions from all these different subjects
instead of just cute algorithm puzzles.

------
kenoyer130
We really need a further correlation between people who pass the interviews
and job performance a year later. I do a lot of interviewing at my current job
and we have found no strong correlation at all between CS skills and actual
ability to "get things done".

We toned down the CS type questions since they tend to take too long. We still
ask a few basic tree and string manipulation questions to weed out the people
who have no idea how to program and get insight into how the person thinks.

I still feel at the end of the day we could flip a coin on accepting an
interview candidate once they have shown basic competency and have the same
results.

I have been telling candidates that a public GitHub repo with a nice commit
history carries much more weight with me than a CS degree, since we have been
burned so many times before.

------
AlexCoventry

      If you know me, or even if you’ve read some of my writing, you know that, in
      the past, I’ve been quite loudly opposed to the concept of pedigree as a
      useful hiring signal. With that in mind, I feel like I owe it to you to
      clearly acknowledge, up front, that what we found this time runs counter
      to my stance.
    

Did the interviewers have access to the applicant's resume? If so, to what
extent do these results simply reflect the interviewers' bias for top schools
and famous companies?

------
lgleason
While I do think that interviewing is broken, I would love to see the raw data
for this. For example, did Udacity courses have other related traits
associated with them, i.e. did these candidates also have a certain number of
years of experience, a degree, etc.? 3,000 is a small sample size, and I'm
wondering if there is some sampling bias here.

------
analog31
Something I wonder is how the participants in these interviews were selected
from the general population of job candidates. Painting with a broad brush,
the best workers might not even be candidates, because they've already been
hired. And the best candidates might be the least likely to seek coding
interview practice.

------
pklausler
I conduct lots of tech interviews for SWE positions, and as everybody's
boning up on algorithmic trivia, I've learned that I can get a stronger hiring
signal by asking _simpler_ questions that people with an aptitude for
programming will succeed on and people with an aptitude for memorizing the
implementations of algorithms will not.

(Simple example: given two closed intervals [a..b] and [c..d], how do you
compare the four values to determine whether or not the intervals overlap? You
may laugh, but it defeats about 50% of candidates in the first minute of an
interview because they just don't understand simple relationships and Boolean
expressions.)

------
henrik_w
I thought the most interesting finding was that completing Udacity or Coursera
courses on programming/algorithms (for non-top-school graduates) was highly
predictive of strong interviewing performance.

~~~
ubernostrum
I know some schools and bootcamps now have a separate "How to pass coding
interviews" unit in their curriculum (since this is a separate skill unrelated
to how to write good code on the job, sadly). If those courses include such a
unit, it would explain the performance on stereotypical tech interviews.

------
ggggtez
Interestingly, they suggest that if you attended a top school, the effect of
Udacity is negligible. I'd argue that Udacity is thus filling in the gaps of a
poor education.

------
chvid
I have been through an interview process many times; I have never been asked a
technical question or asked to solve problems on a blackboard or a computer.

I guess that that form of interviewing is simply not common in my neck of the
woods (Denmark).

I am curious to what sort of questions / tasks are actually given to the
interviewee?

And are they in any way biased towards more textbook/academic ones? (I.e.
"implement bubble sort" rather than "create a blue button").

~~~
ascorbic
Out of interest, what sort of questions do you get asked in interviews for
Danish tech jobs?

To answer your question, when I have interviewed people the tasks I have set
have always been job-related. I set pre-interview at-home tasks, as well as
smaller tasks during the interview. I did this for the same reason I avoided
whiteboard exercises: I wanted it to be realistic. One example I used was to
implement an app that shows sunrise and sunset times. They were allowed to do
this in any way they wanted, using open source libraries if they wanted, and
add whichever features they wanted.

~~~
chvid
I do a lot of short term consulting contracts. Typically for 6 months or so.
My role is either Java/JavaScript senior developer or architect.

In general the interviews run for 30 minutes. For the first 15 minutes, the
client presents themselves and the particular project they are working on.

Then I typically spend 5-10 minutes running through my resume, focusing on any
relevant previous projects.

If there is any technical talk it will be based on the projects in my resume
or the client's project. The most technical it will get would be pros and cons
of particular technology choices.

I have never been asked a "quiz question" - that is, a question where the
interviewer knew the correct answer in advance.

------
Mister_Y
One of the reasons I got hired by Airbnb is that I took MOOCs, but I also
believe that most of my knowledge comes from reading books, and that's a thing
I didn't put on my CV. So, even if showing an interest in learning opens up a
huge number of opportunities for you, I think you actually have to go deeper
than just enrolling in a couple of MOOCs.

------
allThumbs
I feel like things are operating according to the following pattern:

    
    
      1. Go to college:
    
         a. spend many semesters in lectures
            all of which tangentially brush 
            upon the final exam based on the 
            whims of the lecturer.
    
         b. cram for final exam last minute 
            panic to crunch memory according 
            to advice on content which was 
            brushed upon during lectures.
    
      2. Interview for job:
    
         a. cram for interview by going to 
            coursera to crunch memory according 
            to interview memes based on the 
            whims of the interviewer.
    
         b. spend the rest of term of employment
            exercising skills which tend to be
            tangentially brushed upon during
            both interview and schooling while 
            the majority of actual tasks are often 
            googled and stack-overflowed
            into place based on arbitrary design 
            decisions and politicized stack
            choices.
    
      3. Results:
    
         a. good interviewees have learned 
            appropriate memes to reassure interviewers.
    
         b. good students have learned obligatory cruft
            to reassure professors.
    
         c. actual necessities are tangential to
            many or most entry barriers.
    

How accurate is this?

~~~
dexwiz
Don't underestimate the knowledge gained from tangential brushes. Being able
to Google and Stack-Overflow a question is dependent on knowing the vocabulary
and keywords related to the problem. While tangential brushes don't give you
full mastery of the domain, they give you a bookmark on what is possible and
how to find it again in the future. If you talk to newer programmers (new
anything, really), you will find they lack the knowledge to even properly
express their problem, let alone search for answers. While more experienced
people may not know everything in detail, they have a better handle on various
knowledge domains and where to look.

~~~
blowski
I agree with you on this. It's the difference between "I get a 500 error when
I try to submit the blah form and here's the error in the log - it seems to
crash in the foo method of the bar service when it receives a null value but
it's not clear even from the debugger why it's getting null" vs "X doesn't
work".

~~~
Ocerge
This exact situation happens to me far too often, especially when working
with developers who refuse to learn the other side of the stack. I've had lots
of "this doesn't work" conversations with frontend devs who refuse to get
their arms dirty with the database, and the same from backend to frontend as
well.

------
jventura
It was a very good read, but I wonder how interviewing performance relates
to job ("real") performance?

~~~
chewyshine
If the interviews are "structured", the correlation between interview
performance and job performance is probably around .3 based on meta-analyses
of research findings. If the interviews are "unstructured", the correlation
will be within sampling error of zero.

~~~
JamesBarney
Do you have a source for this?

I totally believe you, but I've been trying to find good empirical studies and
authors to read on interviewing.

~~~
ryanworl
Not the person who originally replied to you, but the HR class I took in
college broke down the correlation of interview performance to job performance
into a few finer-grained buckets than just structured and unstructured. Sorry,
no sources here, as I've since gotten rid of that textbook.

Structured vs. unstructured are the fairly general categories, but that
correlation number is accurate. The best predictor I can recall (~0.6 IIRC?)
is a "work sample test". You can Google that and find all kinds of HR
resources about it.

That is HR jargon for, in this case, writing code to solve a real problem
under normal conditions for that company. That is, on an actual computer, not
on a whiteboard, and not being constantly scrutinized throughout the process.

~~~
JamesBarney
Yeah, from what I've read the two most effective predictors of employee
performance are work-sample and GMA tests (basically IQ).

------
sgt101
When you are interviewing for a specialist post (and most posts are specialist
to some degree) you are looking for evidence that the candidate can do that
particular job. Therefore a course that indicates that they have the
particular skills required is highly desirable!

------
shanwang
I'm not surprised that MOOCs are a big factor; people like me who left school
years ago have forgotten how to write a BFS, and we need something to brush up
that knowledge.
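For readers in the same boat, the BFS being referred to is only a few lines
once recalled. A sketch in Python (the adjacency-dict representation is my
choice; interview variants differ in what they return):

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal of a graph given as {node: [neighbors]}.

    Returns the nodes in the order they are first visited.
    """
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()          # FIFO queue gives level-by-level order
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:  # mark on enqueue to avoid duplicates
                visited.add(neighbor)
                queue.append(neighbor)
    return order
```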

If you ran statistics on the use of sites like CareerCup, you might find that
to be the top factor.

------
Eridrus
Huh, I hadn't bothered to list MOOCs on my resume since I didn't think
employers would be interested. Maybe data like this will make employers more
interested in the courses, which would probably get more people to shell out
for the certificates.

------
clark-kent
Basically, a "bad" programmer who can't write maintainable code but prepares
for technical interviews by brushing up on algorithms and whiteboard-style
questions will do better than a very good programmer with many years of
experience.

------
bootload
_" We got this data from looking at interviewees’ LinkedIn profiles."_

Verification of completion and award id? There are a lot of individuals who
will add a degree regardless of attendance, completion or award. Who validates
the assertions?

------
ditonal
Very unsurprising for me. You are measuring your ability to solve algorithm
puzzles. Most engineers don't actually do many algorithm puzzles in day-to-day
work, especially the types of algorithms that interviews tend to focus on like
sorting and dynamic programming. So "years of experience" is not measuring
experience in what you're actually being tested on. On the other hand, you do
exactly those types of things in many CS classes, and in Coursera classes,
algorithms are exactly what you practice. So it makes sense it correlates.

Top company is a predictor for the obvious reason - it's selection bias for
people who already passed those interviews at the company. You're not good at
the interview because you worked at the company, you work for the company
because you're good at the interview.

Master's degrees seem to be largely for international students needing visas,
career switchers, etc., so I'm not surprised they are not a strong predictor.
And if anything, the course material moves past the intro data structures
stuff that whiteboard interviews tend to test.

The only huge surprise for me here is that Coursera is a stronger predictor
than top company and top school. I would have predicted top company > top
school > Coursera.

The post that I would be much more interested in is correlating performance
reviews to interview performance. That gets suggested as a possible future
post.

~~~
marcosdumay
Those riddles are second-semester problems in CS, thus anybody who isn't a
student has spent at least the last 4 years not working on them. The same
probably applies to top companies: people probably interviewed there more than
2 years ago.

But Coursera stuff that gets onto a CV is almost certainly recent.

~~~
glippiglop
Totally agree with this. I'm reminded of when I interviewed with Amazon a few
years back and it was clear that the questions were best suited to filtering
down to the best graduates. In my mid-30s I was too rusty to satisfy the
interviewer, and my real-world experience counted for nothing, as the entire
process was structured like a series of exam questions.

I think there's a point - I would guess at about 8+ years of experience -
where a technical interview will stop being a reliable indicator of an
engineer's ability. It then becomes more important to evaluate what the
candidate has been doing in their career and how that experience can be taken
advantage of within the team that they'd be joining. Technical ability can
still be judged by asking to see some existing code if it's that essential.

------
boha
Sad to see so much attention paid to the data, and so little to the setup of
the experiment itself.

It shouldn't be surprising that an online technical screen favors candidates
who've participated in a MOOC, but is blind, say, to years of experience. A
screen like this is timed-performance-at-a-distance, which resembles MOOC
participation. The full spectrum of qualities that comprise a Good Hire might
incorporate the other signals from the post, but this type of interview won't
test them.

(I'll be the first to admit I'm biased against performative coding in
engineering interviews. Tech screens like this are often necessary, though, so
they have their place.)

------
pmiller2
Was there any kind of statistical correction applied when the data were
partitioned into MOOC + top school vs MOOC + no top school?

------
conqrr
Slightly off-topic, but does anyone have an invite for this platform? I have
been trying to get one for ages.

------
serge2k
That graph is about the best evidence I've seen that interviews are garbage.
Years of experience doesn't matter at all? Coursera courses are the best thing
ever?

------
maverick_iceman
This is a very poorly done analysis. At a minimum she needs to define top
school/top company. Also, I'd like to see the confidence intervals around the
effect sizes. In addition, looking up MOOC information on LinkedIn may result
in a lot of false negatives. (She doesn't mention whether MOOC courses in
non-CS subjects count.) Did all the interviewees have CS degrees? What about
the master's degrees - is she including non-CS ones? Is the sample of
interviewees representative, or is there any selection bias that we should be
aware of?

A study which doesn't answer so many basic methodological questions is
garbage.

------
lintiness
"I’m excited to see that what mattered way more than pedigree was the actions
people took to better themselves."

So a degree from a top school is not earned (nor are admissions, I guess), but
rather conferred at birth? I beg to differ.

The commentary on the "disutility" of master's degrees is even worse.

~~~
adrenalinelol
I don't think it's entirely that black and white. An undergraduate degree from
a tier-1 school costs approximately a quarter of a million dollars, and your
"peers" with rich parents will usually have attended a private school, had
tutoring, and had a decent home life; these things stack up, so you can't
claim it's a level playing field. Of course you need to be smart to get there
in the first place (in most cases). But it's a far cry from a pure
meritocracy.

~~~
nsp
For what it's worth, financial aid at top-tier schools is generally fantastic.

------
marsian
Is this guy a paid shill for academic friends trying to boost enrollments and
overcome the disillusionment of the younger people who realize too much
emphasis is placed on academics and not enough on practical application?

The world needs more vocational schools and trade schools and technical
schools than it does colleges and universities.

