The Utter Uselessness of Job Interviews (nytimes.com)
563 points by tomek_zemla on Apr 9, 2017 | 400 comments



I find it quite problematic that researchers get to talk about their own research and present it as facts without anyone taking a critical look.

Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups), predicting something as noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.

Furthermore, some claims that make it into the piece are at odds with the data:

> Strikingly, not one interviewer reported noticing that he or she was conducting a random interview. More striking still, the students who conducted random interviews rated the degree to which they “got to know” the interviewee slightly higher on average than those who conducted honest interviews.

Yet Table 3 in the paper shows that there is no statistical evidence for this claim: the effects are swamped by the variance.

My point is not that this article is wrong; verifying/debunking the claims would take much more time than my quick glance. But that ought to be the responsibility of the newspaper, and not individual readers.

Politicians don’t get to write about the successes of their own policies. While there is a difference between researchers and politicians, I think we ought to be a bit more critical.


Of course we should be more critical; the methodology here is far from perfect. But right now, people place a lot of faith in interviewing, and this research suggests that this faith may not be justified. As you mentioned, this is not the only research that reaches this conclusion. There is also the work by Daniel Kahneman [1], which I find pretty rigorous and which draws the same conclusions about interviewing.

So obviously, the title of this article should be "Maybe interviewing is not that useful" instead of "The utter uselessness of job interviews", but besides this I find your comment unjustified. In fact, it's quite the opposite: I believe this type of work has contributed to making us more critical by questioning some basic facts about interviewing that I would never have questioned just a couple of years ago.

> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate)

ok, this is interesting, where is it mentioned?

[1] Thinking, Fast and Slow. There is this short article which mentions some of the results and has been discussed on HN a couple of times already. http://www.nytimes.com/2011/10/23/magazine/dont-blink-the-ha...


It's easy to present data that shows flaws with today's interviewing.

It's a lot harder to present a better way of predicting candidate performance in the workplace, along with substantial data that indicates it's better than today's methods. Corporations would love more effective ways to determine effectiveness/performance before hiring.

Interviewing is terrible, but that doesn't mean there is a better option.


Nonsense. First step is to acknowledge that interviewing doesn't work very well. You can't keep deluding yourself because you haven't found a better way. Accept reality.

This is one of the biggest problems I see with business guys today. They want absolute certainty in a world which can't offer it. Start accepting that there is a lot of stuff we don't know and can't know at the moment and we simply have to work towards getting better.

You can't get better if you don't acknowledge that there is a problem in the first place.

These silly "must have 5+ years of experience and check all these technology boxes" requirements clearly show the industry is completely lost at the moment and isn't willing to acknowledge it.


Of course people know interviews aren't the greatest, but it's the best tool they have right now. What people are pushing back against is criticism without solutions.

There are a bunch of other testing methods that are illegal in the USA too, such as IQ tests.


When users of code I've written complain that something sucks to use, I don't demand they come up with a better solution. Probably I don't even want to know what they think the answer is—unless they design user interfaces for a living, odds are their answer will be terrible and they won't even be happy with it if I build it.

This is as it should be—I'm paid to solve problems.

Why is this different?

The more I read about interviewing, the more I realize too many people think they have this problem solved—their amateur psychology is impeccable and their technical screens test for exactly the right things, no more and no less. Did they do a bunch of controlled studies to convince themselves of this, or are they taking "it sounds good", or intuition about the statistical outcomes of different techniques, to be equivalent to truth?

Maybe the first step is to collectively realize we have close to no clue what we're doing, and are being asked to solve a hard problem: individually, to talk to someone for an hour and make a hiring recommendation. In aggregate, to make the decision based on a handful of these one-hour conversations.

Maybe the first step is to realize this is a problem worth trying to solve.

Maybe the first step is vocal non-acceptance.


> What people are pushing back against is criticism without solutions.

Which is bullshit. It's perfectly reasonable to criticise something without proposing an alternative. It's especially ridiculous to reject criticism provided without alternatives, when it's literally your job to do the work being criticised.

"Hey the way you're doing this part of your job produces results no better than random selection."

"Bring me solutions, not problems!"


Solution: toss a coin or take n first applicants for the trial period. Same effectiveness, much cheaper. No need for the interviewer.


I still don't understand why employers can't simply set up a two-week "trial contract" whereby promising candidates simply work for two weeks so everyone can actually see and judge, with real-world empirical data, how well the person does in the environment at the actual job.

Yes, yes... of course I know this could be gamed as well, but no matter... you can't really argue that this wouldn't be magnitudes better than the typical current/broken interview process.


This will get you the most desperate employees, not the ones you want. I've got a home loan to pay, I'm not switching jobs if you can only guarantee 2 weeks of employment. And if there are multiple offers around, I'll take the 3 month or full time position, even if I'm half way through the 2 week contract.

Also, not all jobs/codebases lend themselves to being productive in 2 weeks. I'd argue they should, but they aren't.


The concept of a two week trial contract is interesting, but also as flawed as anything else.

I switched roles to a new team, and the first 2 weeks were a trainwreck. They had almost no predictive value for how I would do.

Now, there are many reasons why that was the case, and perhaps those underlying issues should be addressed, but from all the role changes I've had, the first two weeks show how well the group you're going into can onboard, more than they show how productive the individual will be in the long term.


You would need a longer trial period, a few months is fine with a clause that allows earlier termination.


Yes, because it would be magnitudes worse. Who do you give the assignment to? You have several hundred applicants; do all of them get the 2-week assignment? Who watches over them and answers their questions? After spending that much money, is the answer you get any better than a set of interviews? On the other hand, assume I am in a job and want to change positions, for any number of reasons... How many two-week jobs do I have to take? Or should I quit my job first?


In the 2004-2007 timeframe, the company I worked for hired software engineers via a staffing company for three-month contracts. We interviewed the candidates with the intention of making a full-time hire. As the contract term approached, the management team did a 360 review, then decided to offer a full-time position or simply not renew the contract. This had some downsides, but overall I found it to be better than alternative approaches I've tried before or since. It stopped being viable once software engineering became a seller's market.


This seems to work OK for companies working on greenfield stuff, though you'd still probably struggle to entice people to leave their jobs for you. For other companies, though, where technical debt and poor management are everywhere you look, it doesn't. It gives employees a chance to see what they're really dealing with and to look for work two months later.


If someone is currently employed it makes that arrangement difficult.


Indeed, though any kind of arrangement is difficult. You might end up in a dead end job, not matching your skills or otherwise soul crushing regardless of the method.


I took a job half a year ago. I'm still "learning the ropes" so to speak. I remember the first two weeks. Nothing would have been gleaned from them.


In most countries that don't have at-will employment (i.e., where you can't just fire them anyway), this is a real thing that actually happens.


Yet, literally every professionally employed person is in their current gig via that process.

It works good enough.


Counterexample: I did not apply to, nor interview for, my current gig, and I'm not cheating by working for myself.

And I bet almost everyone reading this has worked with at least one person they think shouldn't have survived the interview, and that person was making a boatload because they convinced the boss they're brilliant. Meanwhile 90% of their day was spent talking about how great they are, and 10% creating new bugs, and no one dared say anything because the thought of them being more "productive" was horrifying.

'It' mostly successfully matches employees to employers, but the quality of those matches may vary wildly.

Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.


> Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.

End of the day, everything evens out. People add the structure they need when they hire people. If your engineering interview process for some detail oriented gig is buzzword trivia with the CIO, the company will probably tank anyway. Conversely, if you do some nerd-fest whiteboard interview for a CTO in a bigger organization, you're probably not getting the right outcome either.


Did you read the article? People add the structure they think they need to interviews, but are clearly able to troll themselves into worsening their judgments by requiring steps that not only don't help, but actively hurt.

And this is a near universal phenomenon. Almost everyone wants to "get to know the candidate."


That's a syllogism and I'm not sure it tells us anything. If you replaced interviewing with a footrace across hot coals the same assertion would be (just as vacuously) true.


It's deeper than that.

Everyone hates the process, and companies have invested major dollars and hours trying to improve. End of the day, little has changed since 1917. You either acqui-hire, get a strong referral, or interview a pool of unknown applicants.

The tests and quizzes are little different to how a city hired an accountant in 1917. The old boys network evolved. Then you're left with the rest.


I strongly disagree. There are all sorts of situations where having a bad option is much worse than having no option, hiring and medicine clearly among them.

> Corporations would love more effective ways to determine effectiveness/performance before hiring.

This is irrelevant to evaluating the current methods. Even if there is no replacement, if they are useless, we should know it, bar none.


That feels a bit like saying we should just stick to blood letting and leeches because we don't know any more effective way of treating disease. Just because we don't have a superior alternative doesn't actually mean the current method is effective.


A)

Leeches are often used in literal modern hospitals in the developed world as an effective treatment for certain ills.

B)

If we had no alternative, we should stick to it, but look at other things in the meantime. You seem to be advocating doing nothing at all, which is trivially easy to show works for no-one


Yes, leeches have extremely limited use. However, bloodletting and leeches killed far more people than they ever "treated."


We have had a better way for decades; it's banned. What do you think all these algorithm questions are supposed to do?


Any evidence to say that algorithm questions are a better indicator of job performance? Most development work isn't algorithm heavy at all.


I doubt there's that much, in general, since a lot of jobs just need a floor. But when it's used as a proxy for an IQ test, the better you do at algorithms the higher IQ you probably have, and that typically correlates with better job performance, or being able to transform one's job into one with higher impact. (Even more if they have high Conscientiousness too, but I don't think algorithms would correlate much with that.) It's also a weak proxy for seriousness. I hate algorithm questions (though fortunately not algorithms), and almost everyone I talk to hates them, but if you're not expecting them and don't at least know the basic ones, you're not serious about applying to a random tech job. (Which is fine, I don't mind that some people will get in a huff and walk out when asked to write a graph search algorithm or something as if it's beneath them or useless since the day-job never does such things; they just weren't serious about applying to that company.) Some tech companies have gotten rid of them, which is great, but you can't count on that yet as a candidate.


The problem with testing algorithms is that it in no way tests intelligence. I would think that 9 out of 10 programmers who know an algorithm would not be able to derive the algorithm from first principles. So you are just testing esoteric knowledge - it's qualitatively no different than asking someone questions about a specific framework / API.

You could make the argument that algorithms tend to be studied more by smarter people, but if that's what you're going for you may as well ask them about their hobbies, and hire the person that is into playing chess, or doing astronomy (or whatever intellectual pursuit you care to name).

If on the other hand you are interested in a person's ability to code, ask them to do so. The last time I had to hire someone, I wrote a small application with one module that was deliberately written in an obfuscated style. I asked candidates to bring that module under control - rewrite it in a readable code style. To do this, successful candidates needed to identify what the current code was doing by examining the public interfaces in a debugger, document what the calls seemed to do, prepare unit tests, and then rewrite the module in a readable style. It took about a day for most candidates to do.

At the end of that, you get to see a candidate's ability to read code, use a debugger, write unit tests, write documentation, and write well-structured code, which is pretty good coverage of the typical tasks in a developer's day. I feel this gives a much more realistic assessment of a candidate's capabilities than asking questions about a more or less randomly chosen algorithm.
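For illustration, a minimal hypothetical sketch of the shape of such an exercise (a toy function, not the actual application I used): the candidate characterises the obfuscated original with tests, then rewrites it readably.

    public class RefactorExercise {

        // The "obfuscated" original the candidate receives: they must work out
        // (via tests or a debugger) that it returns the sum of the decimal digits of |n|.
        static int f(int n) { int r = 0; for (n = Math.abs(n); n != 0; n /= 10) r += n % 10; return r; }

        // A candidate's behaviour-preserving, readable rewrite.
        static int digitSum(int n) {
            int remaining = Math.abs(n);
            int sum = 0;
            while (remaining != 0) {
                sum += remaining % 10; // take the lowest digit
                remaining /= 10;       // drop it
            }
            return sum;
        }

        // Characterisation tests written against the original, then run against the rewrite.
        public static void main(String[] args) {
            int[] cases = {0, 7, 42, -305, 99999};
            for (int n : cases) {
                if (f(n) != digitSum(n)) {
                    throw new AssertionError("mismatch for " + n);
                }
            }
            System.out.println("rewrite matches original on all test cases");
        }
    }

The point is the artifacts the exercise produces (documented behaviour, tests, and a readable rewrite), not algorithm recall.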


> It took about a day for most candidates to do.

This is an issue as well. If you aren't google then a day is too much investment for a single job opportunity, especially if you're already employed.


I don't think someone with an IQ of 80 could program Dijkstra's algorithm given a mathematical description of it with diagrams. I'm even skeptical of programming binary search. They might understand an intuitive explanation involving a phone book but I don't think they could program it. And even if they could, I think someone with an IQ of 120 would do it much faster, though both solutions would likely have the integer overflow bug that was even in Java's implementation for a long time. So I think algorithms do test intelligence, just not as well as an actual IQ test. It can easily be gamed by sheer memorization, whereas good IQ tests can't. I agree that other things like ability to play chess would probably test just as well as algorithms. If the industry switched to testing candidates to see if they can solve chess problems, or play a computer self-limited to some specific Elo, you can bet that everyone who was serious about getting a job in the industry would start playing a lot of chess, and those with higher IQs would on average play better chess.
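(For reference, the overflow bug alluded to is the classic midpoint calculation in binary search; a minimal sketch of the buggy and the safe version:)

    public class Midpoint {
        // Buggy: when low + high exceeds Integer.MAX_VALUE, the sum wraps negative.
        // This is the bug that sat in Java's Arrays.binarySearch for years.
        static int buggyMid(int low, int high) {
            return (low + high) / 2;
        }

        // Safe: one common fix; the JDK ultimately used (low + high) >>> 1.
        static int safeMid(int low, int high) {
            return low + (high - low) / 2;
        }

        public static void main(String[] args) {
            int low = 2, high = Integer.MAX_VALUE - 1;
            System.out.println(buggyMid(low, high)); // negative: overflow
            System.out.println(safeMid(low, high));  // 1073741824, the correct midpoint
        }
    }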

When I have to give an interview and have to include an algorithms section I make candidates type code. Whether that's on a phone screen with a shared online text editor or in person with their laptop / an interview laptop, I want them to type stuff, not just rely on whiteboard pseudo-code and diagramming. As a vim user I discount that their editing environment may not be what they're used to but even if I was forced to use Notepad I could still bang out a function to test the even/oddness of a number (my own fizzbuzz) pretty quickly. So I at least make sure to test coding, even if poorly.

I agree work-sample tests are the best, but as another commenter noted if they take a lot of time for the applicant you're going to get people who refuse that just as some refuse to play the algorithms game. Especially if people have a github repo, especially if some of the projects they've worked on have had more than themselves as commiters, especially if they're currently employed as a developer at some other company that does general software. Unless you're trying to build a top team, which most projects don't need, you're wasting a lot of time trying to rank beyond "would work out ok" and "would not work out at all". I have a section in my phone screen that tests for regex knowledge, I'm primarily just testing to see if they know the concept or if when faced with a problem that regexes can solve (which actually does happen from time to time) they reach for writing some custom parser or not. If they vaguely remember there's a way to specify a pattern and find matches, that's a Pass. If they know grep/their language of choice's regex syntax and can give a full solution, great, I'll rank them slightly higher than someone who just knows the concept, but all I really care about is the concept. If they don't know the concept, that's a strong sign (to me) they won't work out.
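(To illustrate what a "full solution" on the regex part might look like, here's a minimal hypothetical example in Java: a made-up task of pulling every integer out of a log line, where reaching for a pattern beats hand-rolling a parser.)

    import java.util.ArrayList;
    import java.util.List;
    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    public class RegexScreen {
        // Hypothetical screening task: extract every integer from a line of text.
        static List<Integer> extractInts(String line) {
            List<Integer> out = new ArrayList<>();
            Matcher m = Pattern.compile("-?\\d+").matcher(line);
            while (m.find()) {
                out.add(Integer.parseInt(m.group()));
            }
            return out;
        }

        public static void main(String[] args) {
            System.out.println(extractInts("retry 3 of 5 after -12 ms")); // [3, 5, -12]
        }
    }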

I tried to do a semi work sample test with an intern candidate a few months ago instead of a different test, based on experience with a prior intern who struggled on something I thought was basic and left me wondering why I didn't catch that in the phone screen. Basically I gave them some stripped down code from several files that looks a lot like what we have in production (JS, using Backbone) explained the overall mapping from bits of code to what could be shown on the screen, and essentially asked them to add a new component (already written) to one part of the screen by modifying/filling-in-the-functions in a few places. It required them to read and understand some alien code, see what they can ignore, understand what was asked, and then do it (initialize something, pass it around, up to I think 3 indirect function calls of nesting, call a couple things on it). The candidate got through it, I'm not sure the old intern would have...


What is the better way that is banned?


I suspect he is referring to iq tests. They have a pretty high correlation to success and are banned.


In my experience, IQ tests are a good indication of your ability to take IQ tests and very little else. Of course, you can differentiate an absolute dunce from someone who's not, but nothing more subtle than that.


http://www1.udel.edu/educ/gottfredson/reprints/1997whygmatte...

Why g Matters: The Complexity of Everyday Life

Personnel selection research provides much evidence that intelligence (g) is an important predictor of performance in training and on the job, especially in higher level work. This article provides evidence that g has pervasive utility in work settings because it is essentially the ability to deal with cognitive complexity, in particular, with complex information processing. The more complex a work task, the greater the advantages that higher g confers in performing it well. Everyday tasks, like job duties, also differ in their level of complexity. The importance of intelligence therefore differs systematically across different arenas of social life as well as economic endeavor. Data from the National Adult Literacy Survey are used to show how higher levels of cognitive ability systematically improve individuals’ odds of dealing successfully with the ordinary demands of modern life (such as banking, using maps and transportation schedules, reading and understanding forms, interpreting news articles). These and other data are summarized to illustrate how the advantages of higher g, even when they are small, cumulate to affect the overall life chances of individuals at different ranges of the IQ bell curve. The article concludes by suggesting ways to reduce the risks for low-IQ individuals of being left behind by an increasingly complex postindustrial economy.


Much like the way that hackerrank tests how good you are at hackerrank tests?


I wonder how much correlation with IQ is allowed before custom algorithms questions qualify as an illegal IQ test.


I don't think the issue is correlation, but rather disparate impact[0].

Algorithmic tests have a pretty low bar to clear in IT jobs. Using them for secretaries or lab technicians is another story.

https://en.wikipedia.org/wiki/Disparate_impact


Employees will test the limits with lawsuits. I know at least one large state lost a class action lawsuit due to racial bias on civil service exams.

The problem with comparing IQ results at hire to employment success is that employment outcome is difficult to define over time. You're also unlikely to get statistically relevant data without focusing on large organizations with standardized HR processes. Most of the research is based on supervisory evaluations, which are not the most reliable indicators of anything for a variety of reasons.

The other thing I find amusing is that business folk who talk about this miss the fact that there are large workforces in the US that have used or still use standardized testing like this to hire and promote. Those are government bureaucracies, which function relatively well, but are hardly a model that most folks advocating this would aspire towards.


Stem cells and steroids maybe?


> ok, this is interesting, where is it mentioned

https://mobile.nytimes.com/2015/08/28/science/many-social-sc... among other places


People have too much faith in the efficacy of interviewing, ergo we should give slipshod journalism and science lacking rigor a free pass?

I'm all for upending the status quo with interviews, but let's not throw out science and reporting just to get there.


My point is not that this work is flawed, or that there should not be an article reporting on this research or the topic of interviewing. Rather, I think it'd be better for a third party to write about the topic in a more objective manner than for a professor to promote his own research (and thus with skewed incentives).

In particular, I was disappointed to find a (short) paragraph in the article that I find bogus. That does not mean the article shouldn't have been posted in the first place, but just that this paragraph should have been edited or removed.

I think there is something wrong when I feel like I have to look up the actual research paper and check whether the claims made in an article are supported by data and methodology. I should not have to be a skeptic when reading New York Times articles.

To be fair, it is posted in the opinion section, but should we really just take this article as an opinion? That doesn't feel right to me either.

Then onto your last question and Daniel Kahneman, we can talk about that for a long time, but let me keep it short. The best place I know (though technical) is the blog by Andrew Gelman ([1][2] turned up in a 5 second Google, but there is way more on his blog), and Daniel Kahneman himself has "admitted" flaws in his studies [3][4].

[1] http://andrewgelman.com/2014/09/03/disagree-alan-turing-dani... [2] http://andrewgelman.com/2016/06/26/29449/ [3] http://retractionwatch.com/2017/02/20/placed-much-faith-unde... [4] https://replicationindex.wordpress.com/2017/02/02/reconstruc...


I worked in the IT arm of a household name US non-IT company (just to provide basic context). Eventually we determined that interviewing just wasn't that helpful. We started telling the recruiting company, "Send me the best you've got that's available by Tuesday." Now certainly the recruiter is going to do some selection biasing there, but we found we had just as much success as our previous interviewing process that had a phone interview, face to face interview, and a test.


The fact that psychology (and other social sciences and even medicine) have a replication crisis in spades is well known. Just Google it, if you haven't been following the general science news for the last few years.

Even the "hard" sciences have trouble, because journals prefer publishing new and positive results, rather than replications or negatives.


The claim you mention about getting to know the person better in random interviews as well as "People can’t help seeing signals, even in noise." is misleading and unsurprising.

People ask questions in interviews that they want to know the answer to and that could go either way. All questions have equivalent expected surprise: either both answers are unsurprising, or one answer is surprising but you think you already know the answer is the other one.

If interviewers were asking questions like "is 2+2=4" they would have detected random interviewers way easier, but they wouldn't be trying very hard to get to know the person.

As for getting to know the person: the more surprising someone's answers are, if you believe they are telling the truth, the more that distinguishes them from the "average person who gave the answers I expected", so you say you "got to know" them. This is unsurprising.

This isn't to defend unstructured interviews, other studies for a long time have shown them to be worse than structured interviews and test scores. If I had to guess the only reason the research in the article got published as novel was the random interview part.

Edit: Here's a table from a meta-analysis of lots of studies on correlation between different factors and job performance: http://imgur.com/a/YRFTh. Basically unstructured interviews aren't as good as structured ones, but they are better than nothing. Work samples are the best, structured interviews and IQ tests tie for second.

Note that this meta-analysis is combining a bunch of different fields to yield general observations, a specific field may have different results. But in expectation for a randomly selected field these are fairly solid results, and I don't expect fields vary from them too much.


> Politicians don’t get to write about the successes of their own policies

Politicians boast about successes constantly. I don't understand what you mean by "get to".


They also run expensive ad campaigns promoting their own policies, at least here in Australia. They don't even have to be successful policies to be able to talk about them...


Politicians _always_ promote the success of their own policies.


I think the parent means, we don't take as objective fact a politician's claims that their policies worked; we interpret it as self-aggrandizement and scrutinize the claims quite heavily. And that—given the push to publish and the fact that null-result studies aren't very publishable—we should likely do the same for research conclusions.


I understood, but I think recent evidence in the US suggests that a great many people absolutely take as objective fact what a politician claims. A similar number probably absolutely believe the opposite with little or no evidence. Net, I think citing politicians was perhaps not the best analogy.


Sorry I was not more clear. What I meant was that the NY Times employs journalists, fact checkers, and editors to validate stories (say, on politics) to make sure, to the best of their abilities, that the articles they post are correct.

Why is that not the case here, where a professor is allowed to sell his own work? It is as if Obama is the NYT reporter for Obamacare.

That people believe politicians blindly is a topic for another day :)


Maybe, but doesn't a claim like "a 30-minute conversation with someone is a strong indicator of job performance" deserve scrutiny of its own?


Unless you're talking about a very high-level position that either requires substantial leadership traits or very specialized knowledge, interviewing has never been about finding the right person for the job, it's been about finding a right person for the job. The reality is that most jobs can be done successfully by a large number of people. The accuracy of the assessment of the candidate's abilities is, to my mind, a secondary concern in the interview process and yet, just as in this study, is the only part of the interview process studied and critiqued.

But what I find far more important in the interview process is involving current team members in the process of selecting new coworkers. It's one way of getting teams to be bought into a feeling of shared purpose and is the first step in establishing a working relationship. If you don't give at least some of the current team a role in the hiring process, teams will feel imposed upon by those hiring and won't be as understanding about flaws in those added to the teams.

Focusing on selecting the "right candidate" is really myopic in a situation where there are likely many right candidates. We should, instead, be focusing on not selecting a wrong candidate and fostering the right team dynamic. We already know that strong teams significantly outperform strong individual performers who don't cooperate. Yet hiring still seems focused on optimizing for strong individual contributions. And I've yet to see a study that looks at the flaws of the interview process in building ineffective teams.


I think you may be misconstruing the argument here. I agree that any number of people will fit -- but the problem is, what if interviews aren't even selecting "a" right person for the job?


> what if interviews aren't even selecting "a" right person for the job?

We know that interviews are selecting a right person some percentage of the time. That percentage will never be 0 or 100. We will always have to accept some bad hires.

I would rather that successful hire percentage be somewhat lower and keep the team engaged in defining culture and hiring standards than to give up team involvement for a higher right-person rate.

My main point is that the people studying this issue and arguing against the interview process tend to only look at less than half of the benefit of the interview process. If we're going to ditch the interview process, whatever replaces it needs to have the same property of involving the existing team, or it is, to my mind, automatically worse than what we have now. Because while we're all aware of how flawed the interview process can be, it does work to some extent. When I was hiring, only 2 out of the 50 or so that I hired didn't work out. Would I like to have avoided hiring one or both of them? Sure. Am I willing to sacrifice the team involvement benefits to do so? Absolutely not.


But you have no control. How do you know your success can be attributed to the interview?


Of course it does. However the article and perhaps the underlying paper may be as useless as the interviews they are criticizing.


It's also worth noting that those students are probably very inexperienced interviewers.


Indeed, warrants further study.


I don't see the problem. The issue here is hardly that researchers haven't done a study that is perfect in every possible way; it is the blind trust in our ability to interview and hire people. This research paper does not come out of the blue. Many organizations and people have observed for quite some time that interviews are broken.

In fact there is an awful lot of wrong stuff society keeps doing despite evidence over many years suggesting it is stupid. The stupidity of high CEO salaries, for example, has been demonstrated quite well, yet business keeps a blind faith in them. Getting employees in complicated jobs to perform by rewarding them on extremely narrow metrics has also proven counterproductive, yet the MBA crowd refuses to believe it doesn't work.

Business practices often seem to be more like religion than founded on reality. It is a GOOD thing that some researchers are trying to do their bit to correct this flawed picture.


> I find it quite problematic that researchers get to talk about their own research and present it as facts without anyone taking a critical look.

Are you suggesting anything out of the ordinary is going on here? Doesn't one have to present their work as legitimate and wait for feedback? So long as they are open to that feedback, I don't really get this argument?

> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups), predicting something relatively noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.

So present that back to them as a refutation? Are you waiting for someone else to do it? Why are you debating the reliability here?


There are many, many studies showing the same thing. This study is just one example. See, e.g., meta-analyses of your choice:

http://psycnet.apa.org/journals/apl/79/4/599/

They're predictive, but not very.

Structured interviews are better, but not really by much.

The problem with all of these things is they work, but not very well, and people make too much of any one thing. An interview is a very small slice of behavior, even if it's structured very well.

People make too much of them.


Psychology is a science like astrology is a science. Sample sizes are the least relevant of the flaws with its studies.


You are absolutely right!


Laszlo Bock (former SVP of People at Google) did a great job summarizing decades of research around structured interviewing in his book 'Work Rules!'

For a quick reference, the two defining criteria for a structured interview are:

1.) They use one or several consistent set(s) of questions, and

2.) There are clear criteria for assessing responses

That second point is really important. Asking all candidates the same set of questions isn't enough to make a process structured: you also need to understand what a "one-star" response vs. a "five-star" response actually looks or sounds like. Training and calibrating all of the interviewers in a large company around a similar rating system is nightmarish, so most companies don't bother.
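(As a rough sketch of what "clear criteria" can look like in practice, here is a hypothetical anchored rating scale for a single behavioral question, expressed as data; the anchors are made up, not taken from the book.)

    import java.util.Map;

    public class RatingRubric {
        // Hypothetical anchors for: "Tell me about a time you disagreed with a teammate about a design."
        static final Map<Integer, String> ANCHORS = Map.of(
            1, "No concrete situation; speaks only in generalities.",
            3, "Describes a real disagreement and their own role, but no resolution or lesson learned.",
            5, "Concrete situation, own actions, a measurable outcome, and what they would do differently."
        );

        public static void main(String[] args) {
            ANCHORS.forEach((stars, anchor) -> System.out.println(stars + " stars: " + anchor));
        }
    }

The point is simply that every interviewer scores the same answer against the same anchors.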

The book also outlines that pairing a work sample with a structured interview is one of the most accurate methods of hiring.

If anyone is interested in some in-depth structured interview questions or work sample ideas, feel free to email me. I've spent the last few years working on a company in the interviewing space and would love to chat.


Don't take his word as a word of god from heaven. Google's interview process is just as shitty as anyone else's, and shittier than some I've been through. It selects for people who do well on the whiteboard under pressure, which are very often not the best workers overall. It also wastes a ton of time both on the employer and on the candidate side. Source: interviewed ~100 people in my 6+ years at Google.


Yes, I don't think their process is magic. Google gets a good workforce because they are generous and prestigious, which means a lot of good people apply there. And Google is willing to say no to a lot of people in their search for good people. They reject a lot of candidates who would probably have worked out just fine.


Or, put another way, they choose to accept a high rate of false negatives to avoid false positives.


> Or, put another way, they choose to accept a high rate of false negatives to avoid false positives.

Which is how it is typically presented because it sounds much better than "reject a lot of candidates who would probably have worked out just fine". It is useful to perceive both the potential value in an approach like this and the shortcomings. Google can absorb the massive expense in man hours, lost opportunity, etc. that comes with trying to craft genuinely predictive interview processes, but a lot of the companies trying to emulate them can't. Too often, interviewees don't realize a process of this sort is stacked against them, and interviewers don't appreciate the negatives of adopting a still-nascent approach that sounds more reliable simply because it is quantitative - and assuming since Google does it it must work.


Interviewing is hard. I wonder if a number of great candidates just refuse to interview with Google because it's too cumbersome? I know a couple of great folks who just dropped half way because they couldn't be bothered with Google's lack of organization and their lengthy process.

It's not like Google pays the best or still has the best workplace. It's a large company with large-company politics and red tape.


> I wonder if a number of great candidates just refuse to interview with Google because it's too cumbersome?

I've met a few such people in this forum. Not many.

I'm not sure how one would even begin getting a rigorous estimate of that number. What is a credible sample of "great candidates" in this industry?


I'm not sure I believe that, actually. As elaborate as the process is, there are still plenty of false positives. I'm not convinced a simpler process would have produced a materially different outcome.


That's only a good tradeoff if the false positive generator carries the weight it generates in false negatives.


It's more that they're optimizing for a very specific set of skills/experience - the ability to answer a particular type of problem (e.g. from "Cracking the Coding Interview") on a whiteboard in under 45 minutes.


The trouble with that is it probably makes false positives more likely to slip through because they have to interview more people to fill a position...


Which is unfortunately wrong (unsafe), as it assumes the noise is random.

That seems to be an unproven assumption, and quite likely a wrong one. For example, good people may turn out to be less interested in "honing their interview skills", adding parasitic noise to the signal.


bingo


> It selects for people who do well on the whiteboard under pressure, which are very often not the best workers overall.

This gets tossed around as a truism. I'm curious, does anyone have any evidence for it? Call me a skeptic, but these kinds of "everyone knows" truths are often wrong.

Google and other such companies have a vested interest in getting hiring right. They also have the wherewithal to conduct studies, collect data, and let the evidence guide their hiring practices. Google in particular has shown a willingness to completely overhaul their practices by eliminating ineffective practices (remember their reputation for "thought puzzle" type questions?).

So I'm curious if you have anything to back up the idea that they're doing it all wrong.


To quote Abraham Lincoln: "Do not trust anything you read on the internet".

I know it from my own experience and that of many others who have been through the gauntlet. Take it for what it's worth, I'm not selling you anything. I don't look impressive on the whiteboard, but I do have a rather impressive track record. Something doesn't line up. :-)

FWIW, as far as I recall there was another experiment at Google where they tried to establish correlation between interview performance and job performance, and as far as I recall, there was no meaningful correlation. This, of course, is not fully representative, because it does not include poor whiteboard performers.


Don't take this the wrong way, but the anecdotes of people who didn't make it through "the gauntlet" are quite likely to be biased. Those of people who did make it are as well.

This is not data.


Did I say it was "data"? The closest anyone has come to "data" on this (that I know of) is Google, in that experiment where they just hired people at random. But they decided to ignore the results and stick to the soul crushing 5 hour interviews anyway, so data did not change the relevant people's minds.


Looking up the actual experiment, you're completely misrepresenting the conclusions. Here: https://www.google.com/amp/business.financialpost.com/entrep...

These were their conclusions:

1. The ability to hire well is random. This is referring to individuals, not the system as a whole.

2. Forget brain-teasers. Focus on behavioral questions in interviews, rather than hypotheticals.

3. Consistency matters for leaders.

4. Grades don't predict anything about who is going to be a successful employee. (School grades, that is.)

So, stop making stuff up from behind your throwaway account.


Ouch, "making stuff up". That's harsh, my man. Thus far I've made absolutely nothing up in this thread, or indeed in any others under this account. And you're using a PR puff piece written by Google HR to discount years of personal experience that I'm sharing here. You're free to not believe me, but let's not level accusations without evidence, OK?


And yet you fail to provide a non-puff-piece link to the study you're talking about?


> Google in particular has shown a willingness to

Google is collecting and analysing data to improve its hiring process... not to improve the hiring process of the industry at large.

There is an effectively limitless supply of great engineers who will jump through hoops to work for Google.

That's just not true for the vast majority of the industry.


Is it really a truism? If anything, the general industry consensus is the opposite, that Google engineers are brilliant, the cream of the crop. Every big tech company and Google wannabe emulates their interview process. My Quora feed for whatever reason is littered with questions pertaining to how amazing working at Google is. In my experience, the people who question the effectiveness of Google style interviews seem to be in the minority.


> It selects for people who do well on the whiteboard under pressure

If this were true, Google would have crashed and burned a long time ago.

Obviously, their interview process selects for much more versatile engineers than that. Engineers who not only produce reliable and maintainable code, but who can actually come up with products that generate billions of dollars over the years.


A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates. The proof is pretty obvious: what companies pay as much as Google and are as prestigious as Google and have bad engineers?


> what companies pay as much as Google and

Actually, Google pays under the average of top companies because, as a top-tier company, they can afford to. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.

> A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates.

Their pay and prestige will attract even more bad candidates.

How do you separate good from bad candidates?

That's right: a kick-ass interview process.


>Actually, Google pays under the average of top companies, because as a top tier company, they can afford. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.

Compared to who? They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels. You might be correct that certain other companies pay more than Google (Netflix maybe?), but they're certainly above average.


> They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels.

Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.

If one's goal is to maximize compensation (particularly in the short term), a Google offer is better used to get a higher paying offer at one of their competitors.


> Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.

I don't think that's true. While google is by all accounts (including in my personal experience) unwilling to move significantly on base salary, they'll happily match pretty much any offer with stock from my experience (and the experience of others I've talked to).


What they will not do is adjust for cost of living or differences in taxation when comparing an offer in Mountain View with one in a cheaper locale such as Seattle (which is where two of the companies you listed above are headquartered).

Perhaps I dealt with a particularly nasty Google recruiter. I felt like the recruiter had misrepresented the health benefits and relocation package once I got the actual offer letter and related paperwork.


Ah, you're correct, I was specifically told "we don't take cost of living into account when deciding compensation" or similar language. (which isn't strictly true either)

That said, from everything I've seen, compensation growth at Google is faster than at the other companies, which means that for someone coming in at L>3, they will likely end up with greater compensation at Google than elsewhere.

I'm curious as to how they misrepresented things. I was actually pleasantly surprised once I got here by how extensive the benefits were, but I'm always interested in learning more, since while I actually think that 4 google interviews is a decent way to judge someone for google, I really hate their interview/negotiation process.


>If this were true, Google would have crashed and burned a long time ago.

How many of their projects succeed simply because they are google? Some major ones (like android) come to mind.


I guess the issue is that even the whiteboard style interviews can be gamed?


How does this work in practice?

If there are consistently used questions and specific criteria for assessing responses, can a candidate just learn the likely questions and what constitutes the "right" answer?


This is a good and important question. Also: while I have a lot of respect for people at Google trying to innovate on hiring, make no mistake: Google's heart is in the right place, but they aren't at the forefront of structured hiring, and their hiring processes are notoriously capricious.

The reality is that generating good questions for a structured interview is difficult. You can't just pose a programming problem. As you've noted, most programming problems have multiple good answers. Differentiating between multiple good answers from different candidates re-introduces subjectivity. Your brain would rather convince you that one valid answer is less good than another, even when it's not, than admit to you that it can't differentiate or generate a narrative for you. The part of your brain that generates narratives is incredibly powerful and does not care about how accurate your hiring process ends up being.

What we tried to do was create questions that generated lists of facts. "Spot all the errors in this code" would be an example of this approach (but none of the three we used). We went into the process wanting to embrace epistemological uncertainty, generating a historical trail of data that we could retrofit to candidate performance.
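(For flavour, a made-up spot-the-errors snippet, emphatically not one of the three questions we actually used: every planted bug is the kind of discrete, checkable fact described above.)

    import java.util.List;

    public class SpotTheErrors {
        // Prompt: "This is meant to return the average of a non-empty list of ints.
        // List every error you can find."
        static double average(List<Integer> xs) {
            int sum = 0;
            for (int i = 0; i <= xs.size(); i++) {   // bug 1: off-by-one bound (<=)
                sum += xs.get(i);
            }
            return sum / xs.size();                   // bug 2: integer division truncates
            // bug 3: an empty list divides by zero
            // bug 4: int sum can overflow for long lists of large values
        }

        public static void main(String[] args) {
            // Print the expected answer sheet rather than running the deliberately broken code.
            System.out.println("Planted bugs: off-by-one, integer division, empty list, overflow");
        }
    }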

In the end, work sample testing was so much more powerful a predictor for ourselves that we never fully got around to analyzing the data. Sometimes we'd get candidates that clearly generated inferior "lists of facts"; I think there may have been 1-2 instances where that outcome actually overruled work-sample testing delivered prior (out of a few tens of engineering hires and probably ~100 interviews).


Making sure I understand you: When you refer to work sample tests here, that refers to the crypto challenges and things like that that you published at Matasano? And you're saying that was much more predictive than the list of facts methods, right?


That's what people ordinarily assume I mean, but while our work sample challenges were similar to the cryptopals and Microcorruption stuff, they were not the same, or even derived from them. They were designed specifically to qualify candidates, and in fact predated our public challenges.


Just to further clarify, by work sample you do NOT mean an example of previously produced work? This still seems fickle: my dozen years of work output (for example, building and running a site with perfect uptime for millions of users) is not as valid an indicator of my future performance as how I happen to score on some arbitrary timed test?


Absolutely not. Samples of previous work are deceptive at the best of times.


Ahh, interesting. Would you mind sharing an example of how they were different, or how the fact that they were specifically for qualification changed the thought process? I'm working on making tech interviewing better and am fascinated by this stuff.


NDAs? Everything I've done yet is proprietary and judiciously covered by various binding legal agreements.

I wouldn't breach such a contract nor should you hire someone that would be willing to do so. The sole exception being in the cases where it would be the best for the public interest, i.e. whistle-blowing.


> but they aren't at the forefront of structured hiring,

Is anyone though?


Google tries to keep interview questions confidential - that's why candidates sign an NDA - and periodically rotates out questions that have appeared in public. Many engineers are also continually trying to think up new questions as well, usually based on their work.

For most questions, there's no "right" answer, but there are a set of points that the interviewer wants to see you touch on. For example, they might first want to see that you can code up a naive brute-force variant of the algorithm, checking whether you know the programming language claimed and can think through the problem, and ask you the algorithmic complexity. Then they'll want to see if you can get a divide-and-conquer or dynamic programming variant with lower time complexity. Then they might ask "What if it has to be an online algorithm, where new input arrives before the computation finishes?" Then they'll ask "How would you distribute this over 1000 machines, and what are the failure modes?"

At each stage, they're watching how you answer, and where you get stuck. If you ask clarifying questions or spend time to think before diving into coding, that's a plus. If you have never heard of the problem before (this is frequent - many questions are not in textbooks), they want to see how you would reason through it, and break it down into subproblems that are similar to textbook problems. If you miss language trivia, most people don't care; when I did interviews I'd usually volunteer the answer if they missed some API call, and when I interviewed my interviewers did the same. If you don't know how to solve the problem and can't make any effort to move forward through a solution, that's a big negative. Similarly if you don't know what the concept of big-O is or why it's important.
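(A hedged illustration of that "naive first, then improve" progression, using a textbook problem rather than any actual interview question: maximum subarray sum, brute force and then a linear-time pass.)

    public class MaxSubarray {
        // Naive O(n^2): try every start index and extend the sum to every end index.
        static int bruteForce(int[] a) {
            int best = a[0];
            for (int i = 0; i < a.length; i++) {
                int sum = 0;
                for (int j = i; j < a.length; j++) {
                    sum += a[j];
                    best = Math.max(best, sum);
                }
            }
            return best;
        }

        // Kadane's O(n): the "can you do better on time complexity?" follow-up.
        static int kadane(int[] a) {
            int best = a[0], endingHere = a[0];
            for (int i = 1; i < a.length; i++) {
                endingHere = Math.max(a[i], endingHere + a[i]);
                best = Math.max(best, endingHere);
            }
            return best;
        }

        public static void main(String[] args) {
            int[] a = {-2, 1, -3, 4, -1, 2, 1, -5, 4};
            System.out.println(bruteForce(a) + " " + kadane(a)); // 6 6
        }
    }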


Maybe, but that doesn't sound like my interview at Google, which was interesting but really rather random. Perhaps most notably, one interviewer became irate that I didn't know the "ps -o" flags off of the top of my head. :-/


According to `man ps` on my Mac, there are nearly seventy of them - how many did he want you to know?!


It's been a long time since I interviewed an ops person. But when I did something like this, I'd ask, "Name as many arguments to ps as you can." (Or ls, also good.) I didn't really care about which flags they knew. What I was looking for was a pattern. Anybody good knows some arguments cold. Often they know them so well as part of a phrase that they have to think about what the individual ones mean, which tells me that they've done something enough that it's become a habit. Then they'll know a few others that they use occasionally. And then they'll stretch to name a few more that are obscure for them.

But the real magic comes when they run out. What do they do then? Good people say that they don't know. Often, they will take guesses at a few more based on general principles. E.g., "I'm sure there are arguments for sorting, but I only use top for that." Or, "There must be more ways to filter, but I only do it by user or command." They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things. But what they won't do is bluff or make things up.

And honestly, that's my biggest tip for hiring: find people who like understanding things and know a good amount for their level, but are willing to say when they don't know. People who can't do that are dangerous.


I feel pretty comfortable with my Unix abilities, and have held ops roles in the past, and I would have bombed this question.


I doubt that. The only way to bomb it is to make shit up, and I don't think you're the sort to do that.

I don't score it by how many option-letters people know. I score it by how well they demonstrate familiarity with a basic ops activity, which is finding out what's running.

Even if somebody said they never used ps, or like below, only use one incantation, that isn't a problem, because my follow-up there would be to ask how you'd find out what's using all the RAM or what's using a lot of CPU.

Maybe they're just used to using different tools, in which case I can ask for details for those tools. (And then check manually later to make sure I'm not getting snowed.) But if they've done a bunch of ops stuff, they'll demonstrate the sort of know-from-doing level of familiarity with something.

In theory I could just ask the broader question to start. But some people are good talkers, and I'm looking for evidence that people are good doers. All that said, I've shifted mainly to more experiential interview processes, as actually doing is the best way to see if somebody's a doer.


I would almost assuredly stop the interview at this point and try to figure out what your malfunction is.

To me any question where the obvious answer is "I can trivially look all that up in the man pages" is a gigantic red flag that I don't want to work at that place.

If I'm feeling curious that day, I might begin a conversation about what you think you are testing for with that question, as I've spent a fair bit of my career studying hiring pipelines. But if I hadn't had my coffee yet or were irritated or something I'd probably just ask to talk to someone else.


That's fine. If you go into something assuming that anything you don't understand is a malfunction, and you excuse your rude behavior with your mood and/or caffeine level, you're probably not the kind of person I'd want to spend a lot of time working with.

That said, as I've explained elsewhere in the thread, the point isn't to see how many they know. It's to see that a) they know some portion of it that people doing the work would know, and b) have reactions to the rest that indicate useful work habits and attitudes.

A perfectly fine first answer is "I only use ps -aux. I'd just look in the man page if I needed anything else." Because then we could have a good discussion of how they use that output to figure out things, what sort of things they'd expect to see in the man page, and what other tools they use to see what's going on.


I'm pretty confident both of you are people I'd be happy to work with in the future, so I want to use this as an opportunity to point out that your reaction to how Kasey said he'd respond to your interview question practically guarantees that you would fail to acquire Kasey for your team. I know Kasey a little bit better than I know you (he's a Chicago person), so I'll just sum this up as "that's not a point in the win column for this interviewing strategy".

I know you're not asking people to recite "ps" flags from memory. The problem is, you're looking for a subjective "X-factor" in how someone answers a trivia question.

I used to really like this approach too. I'd think of things that you'd only know if you'd actually done the kind of work I'd done. I had some favorite interview questions: "what are some of the functions you carry around with you from project to project in your 'libyou.a' library", and "where would you start looking if you had to debug a program that segfaulted in malloc"? I think the logic I was using is the same as the logic you're using here.

Ultimately, I think it's a bad approach. Equivalently strong candidates look at their work through different lenses, and find different things memorable. Far more important to me, though, is that some people really suck at interviewing --- and, what motivates me more, having succumbed to this repeatedly as a hiring manager and also being myself I think an example of the phenomenon --- some people just interview way better than they should.


The issue with this question falls into 2 main categories. The first is that it's a rude question. The basic premise seems to be, "if you are an experienced ops person you have to know something about ps". Which may be true, but by asking it of an experienced operator you are hinting that you don't believe their resume. There is a much more straightforward way to see if they are lying on their resume: call their references and prior employers. For really good operations folks, the answer to this question might be "I haven't needed ps in so long I've forgotten how it works. I've automated away the process running abstraction and have a library of python/bash/salt/chef/esoteric top commands/etc that I find more useful than remembering ps flags". This question implies their experience in that regard is also suspect.

The other problem with this question is that, by its phrasing, it implies that there is a right answer. If I can name more ps flags than another person, I'm a better candidate. Which, when put that way, I think points out its faults. Maybe you don't intend for that to be true, but you'd have to fight your own cognitive biases pretty hard to not at least bias towards the person that can name 17 flags off the top of their head, or the one that teaches you a new flag combo you didn't know. Even though those things are likely not very good predictors of a good candidate.

You've said that what you really want to get at is whether they can admit when they don't know something. If that is important to your hiring process (and I'd encourage you to validate that with data), ask them a question no one could know the answer to. Ask the same question of every candidate, and grade it purely pass/fail. Did they say "I don't know" or not.


  >> admit when they don't know something... encourage you to validate that with data
Employee performance data might not be the only metric worth correlating this with, though.

My experience is that people that won't admit they don't know something are not necessarily bad employees. Often they are capable employees, although this is a trait I rarely see in the best employees. However, I find they are toxic to a good work environment, since they won't listen to people who do know something.

They do seem to get promoted to management at a much higher rate :)


> They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things.

So the process is, in fact, designed to make people feel uncomfortable, for a little bit (or at least you're definitely aware that this is a frequently occurring side effect).

Does that not suggest to you that there may be a negative tradeoff at play here? (To wit: yes, you do manage to efficiently extract a few bits of information from them... but at the cost of having them feel like they're being, well... interrogated. And like they're already "losing points" by that point in the conversation with you.)


I think part of the point may be to select for people who don't consider "being interrogated" to be a contest. There are plenty of folks who are just like "Here's what I know, here's what I don't know, you decide whether you would want me on your team. If you don't, that's fine, I'll find another team to join." Then there are other folks who, when they're asked a question and don't know the answer, feel like their self-worth is under attack. The latter group can be really hard to work with, and can do a lot of damage to a team.

I had a PM friend who would try to push the bounds of a candidate's knowledge until they gave up and said "I don't know". The actual knowledge wasn't the point of the question: it was that they could say "I don't know", because PMs who can't tend to make life miserable for the engineers & other teammates who work with them.


Yes, exactly. I think to be a good developer, you have to be willing to keep learning. And to be a good developer in a team context, you have to be willing to learn from your teammates, ask for help, and share knowledge in a way that's supportive, not prideful.

And I agree with your PM friend; I think willingness to admit ignorance is even more valuable there.


If somebody goes into an interview expecting never to be asked a question they might struggle with, I'm not sure they're somebody I want to hire.

But no, I don't think they walk away feeling interrogated. My style is pretty conversational, and when people don't have all the answers, I definitely work to make them feel comfortable with that. E.g., for this question I'd close with something like, "Of course nobody knows all the arguments to ls. I sure don't. So you did fine."


This is a stupid trivia question. I use ps -ef and leave it at that. In my entire career I've rarely needed anything else and if I did I would look it up but never enough to remember. To base an interview question off of that speaks volumes about you.


Well, yes and no. If somebody just scored it as written, giving points for each argument known, it would be a dumb trivia question. But as I said, that's not what I do.

Try it next time you're mingling with ops people. Say in a bar at a conference. Don't look at what they answer. Look at how they answer. Some people are comfortable not knowing. Some people are excited to discover the ones they forgot they knew. Some people get curious and eager to fill the hole they've just noticed in their knowledge. And some people get defensive and peevish.


Some people focus their curiosity on more important things than arguments for a command line utility.


I agree, and it's interesting that you know -e, while I've memorized -axuf; I guess it's more portable?

Again, this is one of those, "Here's what I've got cached, and here's where I'd look up what I don't know offhand" questions.


No, "-axuf" is a weird mixture of BSD and POSIX syntax.

From the ps manpage:

Note that "ps -aux" is distinct from "ps aux". The POSIX and UNIX standards require that "ps -aux" print all processes owned by a user named "x", as well as printing all processes that would be selected by the -a option. If the user named "x" does not exist, this ps may interpret the command as "ps aux" instead and print a warning. This behavior is intended to aid in transitioning old scripts and habits. It is fragile, subject to change, and thus should not be relied upon.


I'd probably be snarky and say "I don't know - but I can google it!"

/actually I'd probably hit up the man page first...


dangerous because they prefix things with sudo?


I imagine there's a correlation there. But for me it's the danger that instead of admitting when they don't know they'll just bluff, going off and building things that look good but aren't. If somebody knows their limits and can admit them, you have to supervise them much less closely than a bluffer.


He probably had at least five or ten in mind. I prefer other methods of extracting this information. Usually 'ps -efly' piped into tools like awk/etc to gather what I want, or on occasion, extracting directly from the /proc filesystem. He did not consider these to be proper alternatives.

Since the 'ps -o' flags are only useful in that one command, I just look them up in the (rare) situation where they seem the best alternative. Having been at this for decades, I try to reserve my limited memory capacity for factoids that will pay off, and being able to process columnar data generically at an instant seems more useful than knowing a special way to do it with that one command.
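For the curious, here's a rough sketch of the /proc route in Python (Linux-only; the function name and output format are mine, not the parent's actual tooling):

  import os

  def processes_by_rss():
      # Walk /proc, read each process's status file, and collect (rss_kb, pid, name).
      procs = []
      for pid in filter(str.isdigit, os.listdir("/proc")):
          try:
              with open("/proc/%s/status" % pid) as f:
                  fields = dict(line.split(":", 1) for line in f if ":" in line)
              name = fields["Name"].strip()
              # Kernel threads have no VmRSS line, so default them to 0 kB.
              rss_kb = int(fields.get("VmRSS", "0 kB").split()[0])
          except (OSError, KeyError, ValueError):
              continue  # process exited mid-read or had an unexpected format; skip it
          procs.append((rss_kb, int(pid), name))
      return sorted(procs, reverse=True)

  # Print the ten biggest resident-memory consumers, no ps flags required.
  for rss_kb, pid, name in processes_by_rss()[:10]:
      print("%10d kB  %7d  %s" % (rss_kb, pid, name))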


If the stuff online on Quora etc. is to be believed, the hiring committee will disregard such trivia interviews (and possibly send a note to the interviewer asking him/her to correct their technique).

Not sure how well this works in practice though.


Last time I was job-hunting I experimented with Google to see if their interviews were as bad as everyone said.

They were. I was asked a ton of this kind of Linux/Unix trivia in one session. As well as just random stuff. For example:

https://twitter.com/ubernostrum/status/659182356171874304

https://twitter.com/ubernostrum/status/659182564309995520

https://twitter.com/ubernostrum/status/659182708996706308


The question there was something like "roughly how big is 2^24?". I actually think that's a reasonable enough question, but anyway there's a nice story about this same question in another context. The mathematician Solomon Golomb was (as an undergraduate, I think) taking some sort of biology class, and the lecturer described some process to do with cell division and said "... so the number of ways to do that is 2^24, and we all know what that is, don't we?" (meaning, of course, "and no one knows what that is but it's OK because I'm about to tell you"). But Golomb happened to have been memorizing numbers of the form n^n, and 2^24 = 8^8, so he immediately called out "Yes, it's 16777216".

As a result of having read this story, I can now always instantly remember what 2^24 is too :-).


Another way to answer this kind of question is to know 2^24 = 2^10 * 2^10 * 2^4 and that 2^10 is 1024 which is roughly 1000.

So 1000 * 1000 * 16 = 16 million would be an easy estimate to make.

If you are designing a system and want to have an idea of how much space or memory an approach will take, being good at this kind of math can be really helpful.
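A quick sanity check in a Python shell, to see how close the shortcut lands:

  >>> 2 ** 24
  16777216
  >>> 1000 * 1000 * 16   # 2^10 is roughly 1000, so 2^24 is roughly 1000 * 1000 * 2^4
  16000000

About 5% off, which is plenty good for a capacity estimate.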


The problem is that there are some people who will need to go through that process in their head to produce an estimate of 16 million - and then there are people who have had cause at some point in their life to know of the existence of the concept of '24 bit color' and that the number of colors expressible in 24 bits (8 bits each of R, G and B) is exactly 16,777,216.


Then they should probably ask people to estimate a thing that's even remotely relevant to the job.

(I mean, I get it, this kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.)


> This kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.

To be fair to Google (in regard to this one filter question only), they actually are up front about expecting a significantly higher baseline of general mathematical awareness (even for non-mathy roles) than most other shops. At any rate, this kind of estimation comes up all the time in capacity planning (both in architecture planning and for algorithms on a single box), which after all is what Google does at nearly every level of their stack.


It "comes up all the time"... and these ballpark guesses are probably(!) completely useless for anything remotely practical. At the scale Google's operating you want a Statistician, not a Guesstimator.

So, no, still not useful questions.


What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

My opinion is that big-O often ends up being used as a premature optimization effort that hinders "just get the right answer first". Maybe that brute force method will work at a reasonable clock-time cost, despite having egregious algorithmic cost. You won't know if you're busy overengineering and optimizing before you even have a working solution.


That's a big red flag, at least in a Google setting.

The reason is because when your dataset is in the petabytes, any algorithm bigger than O(N log N) is not going to terminate. You're not going to get any answer at all; your MapReduce is going to sit there burning CPU time for a day, and then you'll kill it, and you won't have any idea what went wrong. (This is learned from experience, if you couldn't tell.)
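A back-of-envelope sketch of why (the 10^12 records and 10^9 ops/sec figures below are made-up round numbers, not measurements):

  import math

  n = 10 ** 12            # records in the dataset (assumed)
  ops_per_sec = 10 ** 9   # simple operations per second on one core (assumed)

  for label, ops in [("n", n),
                     ("n log n", n * math.log2(n)),
                     ("n^2", n ** 2)]:
      days = ops / ops_per_sec / 86400
      print("%8s: %.1e days on one core" % (label, days))

The linear and n-log-n passes finish in well under a day; the quadratic one needs on the order of ten billion days, which is the "it just never terminates" experience described above.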

In a startup context, I'd certainly agree with you - that's exactly the approach I'm taking with my startup, where I'm doing a lot of the debugging and code-wrangling on my local laptop after verifying that the data I want exists and calculating its size with a couple cloud-computing passes. It can sometimes be useful to prototype & iterate with a small subset of the data until you've got the algorithms down.

But many of the algorithms Google cares about don't give any useful results on small data sets. I remember running a few collaborative-filtering algorithms and finding that they gave zero answers because within my small sample of 0.01% of the total data, nobody liked anything that anybody else did.


In my experience as a Google engineer, this is mostly false. None of the work I or any of the people I know well at Google do is on petabyte datasets; most people just aren't on the hot path of indexing/websearch/youtube. In fact almost none of the stuff I was asked about in my interview has been relevant to my work. I have thought about complex algorithms two or three times in the past year.


Not only that, but (in my experience as a Google engineer) factors relating to the location of data (distribution, disk vs ram, etc.) often greatly outweigh algorithmic performance. Constant factors often dominate variable factors.


Exactamundo! Cache/memory/disk locality dominates almost all other factors[1] unless you're doing something very special. (Speaking as a non-Google person who also likes to think that he knows what he's talking about.)

[1] Well, it's really the amazing amount of throughput that modern architectures can achieve. Latency hasn't improved quite as much given the inescapable limitation of the speed of light.


What department? I was in Search and used that knowledge all the time, and also did rotations on GWS, Google+, and GFiber. Certainly I used less of that knowledge for GWS (where everything you need is in-process) and for GFiber (where I was working on marketing & customer signup - the job req for this was actually different, and both interviews & job duties were much more like a traditional SWE job in other companies). But even Google+ needed a lot of algorithmic complexity knowledge; G+'s scaling needs are similar to Twitter's, and we know how that went.

Perhaps it has changed as well; when I left (2014) they had just started encouraging engineers to focus on one particular task, with the architecture already defined, while when I started (2009) there were still a number of problems of the form "here's a feature we want to add; here's the data we have available; how can we build it?"


I'm on Ads and Commerce front-end, and I also know a number of people on Android. These of course are mostly constrained by users' devices, so they don't have the scale you're talking about. But I also think the trend you're talking about of smaller roles has been continuing and has a lot to do with it as well.


Everyone loves to put Big-O up on a pedestal, however many times the constant costs can outweigh the rest of the algorithm. Cache access times in particular can be brutal for non-linear access patterns.


> What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

What if you're having a conversation with someone about local politics and you're convinced the local zoning rules don't encourage growth in the way they think they do? Do you force the subject of conversation so you can make sure they know just how wrong they are? Not if you want them to walk away having a good impression of you.

Instead if the talk turns to zoning you put out feelers to see if they'll want to talk about the larger issue, and only engage in the conversation if it seems like the time for it.

The analogy isn't perfect, but if the interviewer wants to talk about the larger issue, you're probably well matched and will have a great interview. If not, and you know enough about runtime complexity to discuss the tradeoffs, then you know enough to just answer the textbook big-O question and move on.


There are situations in which you want to just get a right answer and ship, like at an early-stage startup.

Then there are situations where the code you write must play nicely within a massive, already-complex system, wherein it will work with potentially huge inputs. Like at Google.

So, saying in a Google interview that you don't think Big Oh is super important, and that you prefer shipping whatever correct solution you come up with first and then worrying about efficiency if it ends up being slow, would likely not get you very far.


> What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

Let's say you created your own company and you're interviewing engineers.

Would you be comfortable hiring someone who doesn't know the difference between O(n^2) and O(2^n)?

Or someone who doesn't even know what these concepts represent?
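For anyone rusty on what's behind that particular distinction, a tiny illustration of how differently the two grow:

  # n^2 grows politely; 2^n gets out of hand almost immediately.
  for n in (10, 20, 30, 40, 50):
      print("n=%2d   n^2=%6d   2^n=%d" % (n, n ** 2, 2 ** n))

By n=50 the quadratic count is 2,500 while the exponential one is already past 10^15.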


I've always called this kind of thing "trying to get the sizzle before the steak". You gotta have a steak first; get that, then work on cooking it - then work on cooking it to perfection if that is what is needed or wanted.

There's also the thing I picked up from playing around with graphics demo coding; get the algorithm working first, even if it takes 10 seconds per frame. Then look for the slow parts, concentrating first on the inner loops. Whatever you do, don't try to prematurely optimize the code as you write it, as tempting as it may be. Because almost certainly you'll make things worse than if you waited until after you have a working first pass.

This applies to way more than just fast graphics code, of course.
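To make the "look for the slow parts" step concrete, here's a minimal pass with Python's built-in profiler (the frame-rendering functions are stand-ins I made up, not anyone's real demo code):

  import cProfile
  import pstats

  def slow_inner_loop():
      # Stand-in for the hot inner loop you'd discover once the demo actually runs.
      return sum(i * i for i in range(200000))

  def render_frame():
      return [slow_inner_loop() for _ in range(20)]

  # Profile one frame, then show the five most expensive calls by cumulative time.
  cProfile.run("render_frame()", "profile.out")
  pstats.Stats("profile.out").sort_stats("cumulative").print_stats(5)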


The questions about Big O aren't about premature optimization or actual implementation. They're about understanding exactly what a particular program is doing and how it's doing it.

Any sort of technical solution given at a Google interview would generally have the following questions:

1) What big O time does this algorithm run in? Why? What big O space requirements does it have?

2) If space were more/less expensive, or time more/less important, how would you change the solution and why?

Understanding those tradeoffs and being able to analyze code at that level is a big part of most software engineering jobs.
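As a toy example of the tradeoff question 2 is pointing at (my illustration, not an actual Google question), "does this list contain a duplicate?" has two natural answers with different costs:

  def has_duplicate_by_sorting(xs):
      # O(n log n) time; O(1) extra space if sorting the caller's list in place is acceptable.
      xs.sort()
      return any(xs[i] == xs[i + 1] for i in range(len(xs) - 1))

  def has_duplicate_by_hashing(xs):
      # O(n) time, but O(n) extra space for the set of values seen so far.
      seen = set()
      for x in xs:
          if x in seen:
              return True
          seen.add(x)
      return False

  assert has_duplicate_by_hashing([3, 1, 4, 1, 5]) is True
  assert has_duplicate_by_sorting([2, 7, 1, 8]) is False

If memory is cheap and latency matters, hash; if memory is the scarce resource, sort. That's the kind of reasoning the follow-up is fishing for.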


Of course. For example, many of the questions for technical interviews involve knowing which data structure to use in a given setting, and applying it to find a solution.

The interviewee is more than welcome to study data structures and interview questions about data structures. A poor student will just memorize questions and answers by rote without really understanding the data structure in question. The good student will actually learn and understand data structures in the context of the solution.

One would hope that interviewers are able to ask follow-up questions about the solution which distinguish between the two.

EDIT: During the best tech interview I have done, I had no idea how to solve the question asked (that is, I had not studied its solution despite it being a relatively common question). I was able to ask articulate questions and apply my understanding of data structures as I went along. Memorizing solutions is a game of luck, and not recommended.


Bullsh*t. Seriously, these questions are all tricks. If you memorize the tricks you can win the game. Even the "really easy" questions like reverse a string involve a trick -- swapping from the ends and stopping in the middle. The interview game involves memorizing tricks and identifying which memorized trick applies to the question.
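(For anyone who hasn't seen it, the string-reversal "trick" referred to above is roughly this:)

  def reverse_in_place(chars):
      # Two indices walk inward from the ends, swapping as they go,
      # and stop when they meet in the middle.
      i, j = 0, len(chars) - 1
      while i < j:
          chars[i], chars[j] = chars[j], chars[i]
          i, j = i + 1, j - 1
      return chars

  print("".join(reverse_in_place(list("interview"))))  # weivretni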


I feel like 90% of writing software is identifying which "trick" applies to the problem you're trying to solve and applying it correctly.


Yeah, but you can look them up.


The thing about looking up tricks is you can only look them up if you remember they even exist to look up.


This assumes the trick is easily-indexed-by-name and can be looked up - so you have to know it exists. A lot of the better tricks are more like 'frameworks' whether specific ("Four Russians", "prefix sum") or general ("dynamic programming", "branch-and-bound").

There's a limit to how far you can get just Googling stuff ("how do I reverse a string") for recipe-book solutions to things. I think practically everything in Computer Science could be Googled ("how does merge sort work", "what is an inclusive vs exclusive cache") at some level but this doesn't mean one shouldn't know a great deal of it. At least, if you want one of these jobs...
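To pick the most look-up-able of those, here's "prefix sum" made concrete: precompute running totals once, and every range-sum query afterwards is O(1).

  from itertools import accumulate

  xs = [3, 1, 4, 1, 5, 9, 2, 6]
  prefix = [0] + list(accumulate(xs))   # prefix[i] == sum(xs[:i])

  def range_sum(lo, hi):
      # Sum of xs[lo:hi] in O(1), using the precomputed running totals.
      return prefix[hi] - prefix[lo]

  assert range_sum(2, 5) == 4 + 1 + 5

The point of the parent stands, though: you only reach for this if you already know it exists.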


And yet most people are memorizing solutions to hundreds of different questions and getting great jobs because of it. I know this because I have several friends who do this.


Well, at least hope that some of that information sticks.


This is a great point. There are absolutely candidates who are simply "good at interviews."

I think one of the most helpful methods of distinguishing a good interviewee from a good future employee is taking a structured, situational interview question, i.e. "How would you go about selling a new product to a customer?" and turning it into an actionable work sample that is tailored to your company. "We make this piece of software that does this. Spend the next 10 minutes writing a cold email to a prospect that outlines our offering." As the interviewer, grade the work sample on structured criteria that are important.


This is the "calibrated interviewer" system. It strikes me as funnily similar to machine learning models, in that it's a black box that somehow demonstrates statistically it's effectiveness at a business problem, and you just have to hope there isn't some glaring bias or fatal flaw in the model that only occurs sporadically.


Only if you use a small, static body of questions. But there's nothing wrong with, say, each interviewer coming up with a couple of questions and rotating them in once in a while.

You do want to try out the question and criteria before depending on it too much, but it's easy enough to try it out on a colleague.


Among the many things wrong with this approach is the fact that you aren't generating a body of questions and responses that you can evaluate against your choices and the resulting performance of candidates you hire, making it impossible to effectively iterate.


Depends on how you think of the interview process. The theory at work here is that if you get a number of domain experts to give an independent evaluation of a candidate and then compare notes, you'll get a decent outcome. Requiring them to use a structured, repeated question with a formal scoring rubric is to subtract certain sorts of bias from each individual evaluation. When hiring somebody doesn't work out, you go back and work on the interviewers and their interviewing skills, not the questions themselves.

I take it you're working from a different theory, where you try to put the expertise in a machinery of questions and answers, relying less (or not at all) on the interviewers themselves. I think that's also a valid approach, one with different limitations.

Personally, as I mentioned elsewhere, I'm not a big fan of interview questions anymore period. If I want to know if somebody can do the work, now I try to create situations where they just do the work. But if one is trapped in the dominant tech interview paradigm, I personally favor investing in interviewers more than the questions themselves.


The same way it works for teachers that give tests. Come up with a bunch of questions and only ask a random subset of those questions to each person. Occasionally evaluate the questions and add/remove some.


There are a few questions that you can learn to answer, but it is probably very hard to reach the five-star answer if you do not understand what you learned in detail. An example question would be: what happens when you type www.domain.com into your browser and hit enter? You can answer this in many different ways and at many different levels.


That question has also been asked so many times most candidates who have done a few minutes of googling can give you the "right" answer. See here: https://github.com/alex/what-happens-when/blob/master/README...

I'd probably start here to lighten up the mood due to how cliche the question has become:

https://github.com/alex/what-happens-when/issues/231


Candidates will do a few minutes of googling to find this, but will they put in the few months to get to the bottom of every aspect of it?


Probably not if they have actual work to do rather than memorize easily google-able trivia questions.


A book conveniently published in the wake of the PR disaster that was their wage fixing scandal.

Nothing about Google's hiring process can be deemed reputable when they'd go so far as to illegally conspire with other tech giants to prevent potential employees from achieving the best possible outcomes for themselves. Maybe things have changed over there, but given the slap on the wrist they got, I'm skeptical.


> Laszlo Bock (former SVP of People at Google) did a great job summarizing decades of research around structured interviewing

Most importantly, most companies interviewing don't get to pick from the pool of applicants that Google gets to. And even more importantly, most companies don't even get to pick from the pool of applicants that a typical funded (or soon to be funded) startup will get.

While I am sure there are things that can be learned from 'Work Rules' the question is how much of that applies to the vast majority of companies in America.


Google's hiring process is famously painful, and they lose a lot of great candidates that way. I wouldn't trust their insights to build a hiring pipeline.


Google is definitely not a benchmark for interviews. From what I experienced, they must have a huge rate of false negatives. You can only afford this if you're Google (or that level).


Yup, it was pretty common knowledge at Google that if you took all the engineers and ran them through the interview process again, it would have rejected most of them.


The view is: better to lose a good candidate than to hire a bad one.


Google's interview process is the single shining example of the worst interview process I've ever been through. So much so I actually took the interviewer to task for it.

It may be good at screening University graduates but it's pretty awful at anything else. It's also really obvious that the interviewers don't actually know how to interview people and are pretty much cargo-culting the same process that hired them.

Awful. It totally destroyed my perception of the company.

Now granted this was... 7 years ago now (or more), but from colleagues who have interviewed there (or been hired!) it's not changed fundamentally that much.


Getting candidates to jump through crazy hoops is a sensible strategy for a big organization. It selects for compliance, which is important when your enterprise is too complex, uncertain or fast-moving to operate by consensus.


A fair point, though not really a defense as such ;)



Easy link to the relevant table from the paper: http://imgur.com/a/YRFTh


From my experience, I'd say Google's interviews need to ask harder questions and last longer.

Today they ask fairly easy questions (given a list of integers, find the subsequence that has the following property etc. - basically leetcode medium level) that you can solve fairly easily using basic CS 101 knowledge. However, it ends up (again, IME) being a race against time where you have to whiteboard code a simple solution in 20 minutes approximately.

I'd much rather have them as questions that take real insight to solve, but have the interviews last 90 minutes or so.


I recently suggested to them not to dumb down their interviews, as doing so simply makes talking to them not worth the time. I remember they used to have much tougher interviews in the past, which were super motivating; now it's like standard hackerrank stuff, which is pretty uninteresting. OTOH, each company has its blind spots, so they won't be on top forever; maybe it's better that way, as those that were very capable but rejected could dethrone them at some later time.

To me their approach lately is like forcing a top tennis player to play with kids (well, you would be surprised how many pros have issues slowing down their game) or a double-diamond skier to ski on blue slopes only (and increasing the risk of injury, as their timing/moves are optimized for ultimate performance instead of basics). I've heard of people that solved problems in interviews that nobody had solved before them, but who were rejected because they messed up some basic stuff, and the recruiter communicated back to them that it looked bad to the hiring committee.


But if you were interviewing for a tennis instructor, wouldn't that be a valid result? You probably shouldn't hire pros who aren't good at teaching when the job involves teaching.

Similarly, as a senior engineer at a large company, you need to be adaptable enough to work with people with less/different experience than you. Adjusting your message to the audience is an important skill.


It ensures consistency, but how do you know that whatever you're measuring for is actually correlated with job performance?


They don't. In fact they even did an experiment and admitted some people at random, irrespective of how well they did in the interviews. Those people were found to perform about as well as the legitimate interview "lottery winners" hired during the same time period.

A lot of people (anecdotally, the majority) at Google have the "impostor syndrome", and the news of the experiment did nothing whatsoever to quell the symptoms. Now they don't know if they are, in fact, not impostors, but they do know that on average they perform about as well. :-)


Could the performance of the lottery winners have been "environmental"? That is, they benefited from being surrounded by competent people (which was, in turn, guaranteed by those people having gone through the interview process) and "leveled up" due to that?

In other words, maybe as long as you let in a small number (but only a small number) of non-performers, you're fine (which is bound to happen anyway - I'm sure there is some noise in the interviews).


Yes. Getting hired by Google is only part of the deal. Actually _succeeding_ when you're already there is much more difficult. It's a high pressure environment with a lot of very smart overachievers. Because of this it's sort of a self-fulfilling prophecy, and people who don't measure up also don't feel welcome, as it were. Since performance reviews are largely derived from peer feedback, hiring mistakes tend to be self-correcting. Most of the time, though, I've seen great people leave just because they didn't like the pressure. The amount of pressure depends on the team. The higher the profile -- the more pressure (but also more rewards, greater career potential, etc). But the general bar for what's considered "good work" is pretty high, and more uniform than in any other large company I have ever worked at.

Then there's the issue that by the time you even get an on-site, you're already very much not a random candidate. Recruiters actually do look at your track record, etc. You can bullshit there, but I don't recommend it, since references will be spot checked, and they better line up.

Google interviews are largely a roll of the dice above certain level of basic engineering competence. I.e. if you don't know the basics, you will almost certainly not pass them. But if you're a more senior candidate, Google doesn't really know how to interview you, and their interview process turns into a random number generator biased heavily towards "no hire".


They are no longer amongst the top choices for top people. Alphabet might be, Google isn't. That's why they have been dumbing down their interviews over the past 8 years and repelling even more top people who want to change the world and not be just another cog in the machine.


They certainly still _are_ among the top choices for top people, but they're no longer the _best_ choice for most. I can't in good conscience advise anyone to join any 70K person company. "Cog in a machine" describes it pretty well. Ignore "self driving cars" and "internet balloons" and other BS: there's near zero chance you'll get to work on any of that, particularly if you don't already have a stellar track record at some company Google/Alphabet respects (of which there are very few).


Yes, but even when this is done well, there is still the question of what's measured vs what results in hiring good candidates and not hiring the others. Generating a huge number of false negatives is also not optimal, especially in a situation where talent is scarce.


this just gives them resource liquidity for operations - the people that take things forward are mostly acquired

if you have to start using structured interview questions to expand a team - you've already lost


I have some major issues with their conclusions... and the title of the article (which is mostly nonsensical clickbait).

The real conclusion should be that "unstructured interviews provide a variable that decreases the average accuracy of predicting GPAs, when combined with (one) other predictive variable(s) (only previous GPA)."

This conclusion seems logical. When combined with an objective predictive measure of a person's ability to maintain a certain GPA (that person's historical ability to maintain a certain GPA), a subjective interview decreases predictive accuracy when predicting specifically a person's ability to maintain a certain GPA.

To then go on to conclude that interviews provide little, or even negative, value in predicting something enormously more subjective (and more complicated), like job performance, is absurd - and borderline bad science.

There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.


> There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.

Well, based on what? Is there any evidence for any of these hunches?


I didn't say it was based on hard data, that's why it's a hypothesis...

But, if you really believe that it's a huge leap to hypothesize that interviews would be a better measure of subjective social skills than metrics like GPA, I don't know what to tell you.


I mean let's go further back. I question the assumption that selecting for "culture fit" does much more than shut people out whose ethnic background or social class is too different.


I mean, that's an extremely narrow reading of "culture fit".

Believe it or not, there are non-racist and non-classist character traits that are also not represented in one's GPA.

But, I guess hiring for one's willingness to work with specific clients, or for one's happiness with the organization's structure (in, for example, a holacracy), wouldn't be valid things to hire for, in your view?


I've been a programmer for a bit over ten years. I've worked at scrappy little startups, midsized companies, now for a tech giant for a few years. The engineers I work with at the tech giant are consistently better engineers than my other coworkers have been, and I credit the very structured interview process. We're trained to ask specific questions, look for specific types of answers, and each interviewer is evaluating different criteria.

Also, it is quite often not the technical questions that end up making us decide not to hire someone. That's just one area. I know it's the part that sticks out, and candidates give a lot of weight to it in their memory of the interview, but you shouldn't just assume it was that your whiteboard code wasn't quite good enough. I actually don't think that's the most common thing we give a "no hire" for.


Which of the companies paid the best/has the best benefits?

If it's the tech giant, it's very possible they attract better candidates because they offer more.


In my experience, the big corps don't tend to offer anywhere near as much compensation, but better time off, etc.

The best software guys I've ever seen work at those big corps. The big corps are the ones that have the resources to work on the REALLY hard problems and not write the same CRUD app over and over again. Those same companies also provide tons of educational benefits so that you become an expert in your field.

Things like fighter jet software are fantastically complex. You hear about the dumb mistakes that are made (International Date Line Issues), but you never hear about the thousands of hours of testing every change to code goes through and the insane calculations that are made every second in even standard level flight.

I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want? Even if you don't USE knowledge, you should at least be able to rederive things using your base knowledge. The interviews I've seen at aero companies are damn hard. Tons of grilling on whether you actually understand mechanics and fluid dynamics.


>I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want?

That's a really good question and I had to think about why I don't like interviews.

1. It's an interrogation and the stakes are extremely high. There are so many aspects of it you don't control. Some guy had a shitty morning, or just doesn't like you, dropped your resume on the way back to his office, etc.

2. It's completely phony and it starts with the first question, "So why do you want to work at generic company X?" "I need money" is not an acceptable answer. Now I have to be phony and tell them why their company is awesome (it's not), which makes me feel like a kiss-ass. I have to pretend to be excited about working at a boring-ass company. Just shoot me.

3. We have to do it, unless we have rich parents that died young like Batman.

4. The person or people interviewing you have no notion of what you've accomplished aside from skimming your resume. All the hard work I've done to produce miracles in the last 20 or so years means, really, nothing.

5. Companies only reward tenure at that company. It signals the start of something new but that's typically a bad thing when it comes to work. Less vacation, less credibility, less influence, etc.

6. You really don't know if the person or people you are interviewing are bozos or not. It doesn't matter, they cut the checks, they have the power.

7. As the article insinuates, they're fairly pointless.


they may also be able to hire more, and let people leave on their own (or let them go later), keeping the 'better' ones on board. Smaller companies can't make the same number of hires in the first place.


The tech giant, for sure.


I say the same thing on many of these threads. Most people rejected at BigCos aren't rejected for failing to pass the technical bar(s); those people get filtered out in the phone screens.

The by far most common reasons interviewers offer for low scores are: didn't listen/didn't acknowledge when they reached the end of their knowledge/didn't test when they got off the rails/just didn't seem like a cooperative coworker.

These are all filters that may not matter at ye olde startup; but in a stable, long-lived team, these are red flags that seem correlated in my experience with lowering the morale of a dozen other people. So, even if you're a rock star, you might get rejected.


This is a great comment. Totally agree.

> didn't acknowledge when they reached the end of their knowledge

Oh man, this one. How hard is "I don't know?" You're not supposed to have the entirety of computer science and software engineering knowledge jammed into your head. Let's talk about what you know and what you don't and figure out whether you're a good fit.

We hire lots of people without direct experience in what we do because they seem great to work with and we think they'll learn quickly. Just please don't try to bullshit; it doesn't usually work.


> I actually don't think that's the most common thing we give a "no hire" for.

Can you cite any of the more common things you give no hires for?


Some common ones:

- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.

- Shit talking old coworkers, general attitude that you're great and nothing is your fault.

- Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.

- Not being able to give context about the "why" of what you worked on, what other options you considered, etc.

- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

Obviously most weaknesses can be overlooked if there are serious strengths.

I've done a lot of interviewing, happy to answer questions.


>- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.

>- Shit talking old coworkers, general attitude that you're great and nothing is your fault.

These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.

> - Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.

Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do? Especially when the interviewer cannot possibly check up on that. I'm absolutely not convinced about this one, although I do make a point to talk about my contributions when I interview (which frankly doesn't happen much at all, since I tend to stay in the same job for long periods if possible and try to make things better there).

>- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

This sounds like bullshit too, to be honest. Most people don't get to decide the general point of what they do unless they are the CTO. However I do realise that this is absolutely the kind of stuff interviewers look at and you need to prepare for it.

>- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

Fair.

>Obviously most weaknesses can be overlooked if there are serious strengths.

Which you really won't know until months later.

>I've done a lot of interviewing, happy to answer questions.

I know people who'd do badly on several of these accounts and are completely brilliant at getting the job done.

The problem is that you cannot possibly know how well the ones you rejected would have done.

If you can have a probation period of 3 months or so, that should be the main yardstick. Of course someone's attitude can become shittier over time but no interviewing process can reliably catch that.


> These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.

Nope, I said the engineers I work with now are consistently better. "My current team is very good" is MUCH different from "my last team was very bad." (And I'm not in a job interview.)

> Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do?

No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project. A lot of people get stuck on, "We made a project that sold widgets." I want something more along the lines of, "In our widget selling application, I worked on backend services for payment processing and fraud detection." Some people get really stuck on generalities and won't dive into specifics. It just makes it impossible to evaluate your contribution. Maybe you didn't do anything. I have no idea, you're just not generating useful information for me.

> Most people don't get to decide the general point of what they do unless they are the CTO.

I totally disagree. So many tiny choices that engineers make from day to day have a tangible impact on customers. Even if you're a junior engineer, you have an impact on the latency of service calls you're responsible for, as an example. Do you pick the lightweight framework that loads quickly, or do you need the features of the bigger one? It's that kind of tradeoff that people should consider, and "I saw something shiny" is not a good answer.

> The problem is that you cannot possibly know how well would the ones you rejected would have done.

Yep, for sure. I'm positive that I've rejected good candidates. That's the side you want to err on though, especially if you have lots of good candidates.


> No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project..."In our widget selling application, I worked on backend services for payment processing and fraud detection."

If that line is as detailed as you're looking for, then that's reasonable. Although I've seen people get annoyed that I don't remember the specifics of exactly what I implemented and why and a list of the various decisions made on projects that are 2+ years in the past.

At my current company I've worked on 300 tasks over the course of 2 years, across 30+ projects, ranging from simple bug fixes to implementing large swaths of new software for Fortune 100 clients. There are some similar items in there, but most of them were different.

I don't have an amazing memory, so a lot of the old stuff I worked on becomes very vague. Hell, it can take me time to remember what I need to do to work on something I haven't done in the past six months at my current company.

I know those old projects in broad strokes but if you want me to talk about a specific project in detail that happened years ago, then I will struggle to dig up those memories, even after reading a document I made that refreshes things a bit before interviews.


"- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it."

I'd venture that not being able to answer this question and others like it off the top of their head is less an indication that they are unaware of / don't learn from their mistakes than it is an indication of how well they practice for bullshit interview questions and can give you a bullshit answer to your bullshit question that gets past your personal bullshit-o-meter well enough for you not to be offended.

But since you're also probably trying to filter out cynical assholes like me, ¯\_(ツ)_/¯


> not being able to answer this question and others like it off the top of their head...

Well it certainly shows a lack of preparation and experience.

Here's how I approach those questions. I think about each project I've launched, and whether they're relevant. It's an easy thought exercise, and then you have an answer to that question forever. And seriously, a time you've made a mistake should be an easy one, and it's one everyone should have ready. Over your whole career you can't think of a time you've done something that you'd do differently if you had a chance? I bet you can, and if not, it shows a serious lack of introspection.

We want people trying to make themselves better. Thinking about your career and studying for interviews is part of that.

> filter out cynical assholes

Dude, at least 50% of the tech industry is people who would charitably be described as cynical assholes. Part of being professional is being able to turn it off. If you can't give a professional answer in an interview how can the interviewer expect you to give a professional answer in a contentious meeting?


Can confirm. My work life has been smooth and fairly boring, and I have piss-poor autobiographical memory anyway, so I pretty much have to make those up to have anything worth telling. Maybe start from some half-remembered kernel of truth, but it's all BS from there, because it's all I've got. Have to rehearse them so I'm not making shit up on the fly--too much risk of tripping up, or making it too flashy, or accidentally saying something in a way that makes it seem like I'm shifting blame, or taking credit I shouldn't, or anything like that.

Especially the "tell us about a time you had a conflict with someone" question. Ugh. I guess I just need to seek out some assholes to work with because I've got a big fat nothing to talk about on that one. I ought to start writing down the crap I make up for it so I don't have to do it again every time I interview.


They don't have to be big monumental conflicts or fuckups or whatever. And yeah, you should totally have a pre-canned answer. It can be something minor; "I once spent a lot of time implementing feature X, and then I realized I could have just done feature Y in a quarter of the time." Or, "Bob wanted to build X, and I thought Y would be better, so we disagreed about it, and ultimately Z." (And make sure the story doesn't end in, "and I was super wrong.")

I know it's phony, but job interviews are sales pitches. Preparation helps.


> - Shit talking old coworkers, general attitude that you're great and nothing is your fault.

There's a gulf between "nothing is my fault" and "some of my old coworkers were shit and they deserve the shit-talk they get from me".

I don't think it's fair to judge people universally on the basis of whether they shit-talk or not. Being 100% kind and never saying anything bad about people in your past is vastly overrated.

> - Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

You're polarizing this way too much. If I go after a "cool technology", there are several things MANY interviewers don't take into consideration:

(1) It's only their own interpretation that I chose a "cool" technology over the customer needs. Cognitive bias and all. Not to mention most people really have no qualifications to even claim this.

(2) In the senior programmer area (where I believe I belong), oftentimes you have to make calls nobody can inspect for a while; you have to trust your experience and intelligence and make a decision quickly. If Ruby on Rails consistently fails you on a single website project of yours, it's very okay to start switching out its slowest parts with Elixir's Phoenix <-- that's a recent example from my workplace. I chose both "cool tech" and "customer needs" together.

(3) Many times there's no immediate benefit to your work. It's easy for a manager to blatantly reject a hard mid-term decision implemented by a tech lead as "he's after the cool tech only because he's bored" and only I know in my head that the results from that "cool tech" will start showing in a month from now (it also doesn't help at all when I tell them that and they don't believe me).

> - Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

I admit I've been in the wrong on this one, but I want to give you another perspective. I wanted to make a certain point -- 99% of the time it's the "why did I do this" or "why did I fail doing this" or "how did I succeed by doing this" -- and sometimes I digress because there are sub-points and I overdo my attempts to be crystal clear. Granted, that's up to me to perfect as a communication skill, but I've been extremely annoyed by interviewers who can't seem to trace my initial line of thought and try to coax me back into it. Instead they give you a smug expression along the lines of "this guy talks too much" and they form a negative impression right there and then. I can see it in their eyes, and quite frankly I lose respect for them immediately as well -- they could handle such a situation much better. Interviews are a two-way process and both sides screw up in every single interview.

So overall, I believe you're over-generalizing on a few points.


> "some of my old coworkers were shit and they deserve the shit-talk they get from me

This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.

> It's only their own interpretation that I chose a "cool" technology over the customer needs.

Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool." I want to hear, "because it performed faster" or "because it had X feature I wanted" or whatever. Tell me WHY it's cool. I just want to hear how you'd explain a choice to me as a coworker, I'm not looking to actually second guess old choices. I'd even take, "Well, we had a really short deadline, and I was familiar with the technology. I decided hitting the date was more important than evaluating all the choices." I want to hear about your thought process; I don't care what choice you made. "Because I thought the industry was moving that direction and wanted to be future-proof and improve our ability to hire" is a good answer too. There are a ton of good answers, but "because it looked cool" is not one of them.

> interviewers who can't seem to trace my initial line of thought and try to coax me back in it

For sure. I try really hard not to talk when a candidate is talking because I know how hard it is to get thrown off in such a stressful situation. Sometimes the candidate has just obviously misunderstood my question though. This is much more important on technical questions -- I believe a good interviewer will treat the candidate like they are a coworker, and basically solve the whiteboard problem in a fairly collaborative fashion. Obviously the person doing the interview will take the lead, but I'm SUPER happy to answer clarifying questions, and if you start to struggle the right thing to do is to ask some small questions and see if the answers help. Just sitting there banging your head against the whiteboard isn't useful.

> So overall, I believe you're over-generalizing on a few points.

For sure, I'm just trying to give some really brief summaries of bad behaviors I've seen. I've also given a "hire" to people who did one of each of the above; like I said, none of it is disqualifying.


Now that you clarified, I actually think we're very much on the same page. ^_^

> This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.

Of course! Don't get me wrong. I am not making it a goal to shit-talk former coworkers in an interview. I try to avoid it, but what am I to do when I have to share that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job? It wasn't my fault at all; I actually sank several weekends trying to upgrade it but was drowned in mysterious and horrifying errors before giving up (an ancient state machine gem with very rigid dependencies and 80+ monkey patches practically made any upgrade impossible). Lie that I suck? No, I don't suck, they sucked, and I ain't taking the blame for them. That being said, I have better uses of my leisure time, and my work time as well -- bearing in mind I am not expected to refactor at all. So I sank 35-40 hours of leisure time into that problem and moved on.

> Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool."

You'll hear a very hearty (and non-sarcastic) "I am sorry for that awful experience, man" from me. I never do that. I always explain myself. I went out of my way to rehearse in my spare time, just asking myself the question "why did you pick tech X?" -- I am a firm believer in Einstein's saying "If you can't explain it to a 5-year-old then you don't understand it"!

> Sometimes the candidate has just obviously misunderstood my question though.

Perfectly fair. I never objected to that, and I also made sure to apologize for the misunderstanding, because in an interview every minute counts.

Thank you for being constructive.


> what am I to do when I have to share that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job?

The reason I personally care about the skill of talking about bad coworkers tactfully is that I think it correlates with being able to navigate tricky workplace fuckups without making a big political mess that I have to clean up. The way I would phrase the above situation is:

"I inherited a Rails 4.0 app, and I wanted to upgrade it to Rails 4.1 so that we could take advantage of feature X. This wasn't on our roadmap, so I spent a lot of my own time looking into the feasibility and working on it in my personal time. However, it eventually became clear that too much of the legacy code would have to be totally redone, and there wasn't enough of a business upside."

> I am a firm believer in Einstein's saying "If you can't explain it to a 5-year-old then you don't understand it"

Me too! Love this.


> The way I would phrase the above situation is:

You're a better diplomat than me. I could've phrased it almost the same, but with me the tone and the wording depend a LOT on my current mood -- I have to work on this, that's for damn sure.

It does, however, still leave the question of whose fault that legacy code is, don't you think? With my wording in the above comment I would've aimed at being crystal clear -- and thus somewhat rude in the process, admittedly.

> without making a big political mess that I have to clean up

Could you please clarify on that point? I am curious.


> It does, however, still leave the question of whose fault that legacy code is

Heh, I can do this all day. "We accumulated a lot of tech debt on a previous project because of very short deadlines." Or, "the previous owner learned the technology while building the system -- I'm sure he would have done it differently today." What I'm looking for is that you understand what leads to this sort of situation, and it isn't usually, "the last guy was an idiot." I mean sometimes, sure, but even then it's usually closer to, "the last guy was hired into a role he wasn't ready for and needed a better mentor, or better training."

> Could you please clarify on that point? I am curious.

It's really helpful to be able to send an engineer to collaborate with another team without having to worry about whether they'll end up butting heads with someone and making a mess. Tact is important. Sometimes other people are under constraints that you're not aware of, and it's useful to have empathy. Maybe they have super tight deadlines, or maybe they're having to use a technology they've never used before. It's easy to say, "this person is a moron and their code is bad," but if they get the impression that you think that, it can really harm working relationships. At that point, I end up having to step in and smooth the relationship over, and it's not a great use of time.

Talking about bad ex-coworkers with tact shows me that you 1) have empathy and 2) will understand how to navigate similar situations if hired.


> Heh, I can do this all day.

I can see that! :D Thanks, you've been very helpful. Believe it or not, I am learning from this interaction.

> Tact is important.

I don't disagree and I'm with you here. But herein lies the dilemma -- I've been tactful and diplomatic way too many times for my taste. I've had my fair share of politics. I am not horrible at it; I've simply lost all patience for it, and so my mood started leaning heavily towards being blunt and somewhat inconsiderate. I am not outright offensive, but I am no longer tactful and diplomatic (sigh).

I have started standing up and protesting heavily in the face of bad politics, however. I have less patience now because I expect everybody else to do the same -- and they don't.

That's why I get easily irritated. I absolutely agree with ALL of your points -- the people might have had horrible customers (or team leaders), the budget might have been drained close to the end of the project so they had to cut a lot of corners, they might have been gaining experience along the way... and a plethora of other possibilities. I agree.

I do have empathy and tact. But I have lost almost all patience in the last few years.

I appreciate that you might not believe such a polarized message in an interview. But I got slightly carried away sharing. :)


Happy to help!

At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)

I also totally understand getting frustrated. I've been a total dick to my coworkers in the past, for lots of different reasons. I was burnt out, I was young, etc. I've mellowed out a lot over the years. A good chunk of that is also just being on a good team; it's really hard to be the best version of yourself if you're not well supported.


> I've mellowed out a lot over the years.

And here you have me at 37 being much more impatient than I was at 27. :D Truth be told, I am also much more mellow in general, just not at work lately.

> At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)

Double thanks, this is extremely valuable advice!

I appreciate you taking the time.


Thank you for providing the helpful specifics.


Thanks...could you please cite specific examples?


Added some above.


Take a second to read about the experiments this author conducted. They included:

Dummy candidates mixed in with the interview flow that gave randomized answers to questions (interviews were structured to somewhat blind this process), and interviewers lost no confidence from those interviews.

Interviewers, when told about the ruse, were then asked to rank between no interview, randomized interview, and honest interview. They chose a ranking (1) honest, (2) randomized, (3) no interview. Think about that: they'd prefer a randomized interview to make a prediction with over no interview at all.

Of course, the correct ranking is probably (1) no interview, (2) a tie between randomized and honest. At least the randomized interview is honest about its nature.


The interview is also selecting for a single thing - GPA. You can be an utter arsehole and have a high GPA. You can have personal hygiene that stinks out any room you're in and have a high GPA. Basically, you can be completely impossible to work with and have a high GPA. The research they've done is suspect, because they weren't interviewing for an ongoing role, but for a single KPI.

Similarly, the people doing the prediction were other students rather than teachers who know better what to look for. The research would better fit HR people hiring for roles they know nothing about than experienced team members hiring for roles similar to what they do.

The research results are massively over-applied, and they are in no way whatsoever sufficient to justify the term "utter uselessness of X". Unsurprising, given it is research by a 'professor of marketing'.


Like everyone else, I agree a face to face interview is the best way to verify a candidate doesn't smell bad.

I assume that interview could be pretty short.


Well, I suppose you could fish out the most trivial part of the comment and refute that. Why not set up a battery of short unit tests for the candidates? "Follow me, Jones, and we'll see how you handle the 'coworker with chewing gum' test next..."

The study was done on students, who almost universally have zero experience selecting people to work under them in an industry setting (or at all). Drawing conclusions from this particularly inexperienced subject pool and then extrapolating out is bogus, particularly given the extremely certain language of the article. The subject pool is at an age (18-22) where people are still figuring out what to make of themselves and others; they have extremely little adult experience of the workplace and judgment of character - indeed, at this age, people are notorious for making bad decisions in their personal lives.

When you look at the actual paper they link, out the window goes the declarative language, and instead the paper is unusually full of weasel-words ('can', 'may'). There's a major difference between "utterly useless" and the actual conclusion of their paper: "interviewers probably over-value unstructured interviews".


In the experiment the authors ran, interviewers felt more confident after interviewing a "candidate" who had secretly been instructed to randomize the answers. Later, they informed the researcher that so great was the value of the information they extracted from interviews, they'd prefer a randomized interview to no interview at all. I think the authors made their point quite well.


Meeting the "candidate" lets you apply some judgement about them that you can't use from just reading a CV (or just knowing their past GPA). How they look, how they dress, their accent, their body language, etc.

People think they can get valuable information from this, so they want to meet the candidates, even if they are told the answers are random.

You need to prove (and then convince people) that all this extra information (the impression a person makes when you meet them) doesn't improve your prediction of their future GPA over a prediction based only on their past GPA.
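
If someone actually wanted to run that comparison, here's a minimal sketch of what it could look like (synthetic data and made-up effect sizes, purely illustrative -- none of this comes from the paper): predict future GPA from past GPA alone, then again with the "interview impression" added as a feature, and check whether the fit meaningfully improves.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 500
    past_gpa = rng.normal(3.0, 0.4, n)
    future_gpa = 0.7 * past_gpa + rng.normal(0.0, 0.3, n)  # future GPA mostly tracks past GPA
    impression = rng.normal(0.0, 1.0, n)                    # interview impression: pure noise in this sketch

    def r_squared(features, y):
        # R^2 of an ordinary least-squares fit of y on the features (plus intercept).
        X = np.column_stack([np.ones(len(y)), *features])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta
        return 1.0 - residuals.var() / y.var()

    print("past GPA only:        ", round(r_squared([past_gpa], future_gpa), 3))
    print("past GPA + impression:", round(r_squared([past_gpa, impression], future_gpa), 3))
    # If the impression carries no real signal, the second number barely moves.

If the second number doesn't budge, that's the evidence you'd need before you could convince people that meeting the candidate adds nothing to the prediction.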


It's the whole process that's useless.

Remember a few years ago when this forum was drowning in discussions about how to find and hire "10x" people? 98% of employees were useless in the face of the 10xer.

The reality is, most of the time screening for general aptitude, self-motivation and appropriate education is good enough.

I've probably built a dozen teams where 75% of the people were random people who were there before or freed up from some other project. They all work out. IMO, you're better off hiring for smart and gets things done and purging the people who don't work.


This, I think, is a very good point. The scariest point about Laszlo Bock's book is that he is very much against the idea that people can improve or be trained to be better. To him there is this idea of predestination: you're either good and always have been, or you're not good, won't get better, and there is nothing Google can do.

I feel like as a company you could extract a lot of value by just hiring people and training them. Training, by the way, doesn't have to mean at work. Think how much you learned in college lectures vs. reading the actual textbook. It means going home and studying, and the incentive should be that you're gaining skills that make you more employable, so this shouldn't count towards work hours (if it were, say, a very proprietary old programming language, though, I could see this argument falling apart).

I'm not saying this is the best way, but I'm shocked that everyone is "trying to find the best". What kind of world will that leave us in if all companies want to hire the same 3% of people? Businesses move slower, talent is lost, and inefficiencies accrue.


Wow, that's really terrible to hear. And of course, conveniently, it lets the employer off the hook for any notion of team or project fit with the employee. If they weren't performing well, then obviously they never will in any capacity in any context. Awesome.


A lot of the pure evil HR bullshit is the dance around discrimination suits and the effort to maintain the use of alma mater as a discriminator. It's literally in their interest to assume that you're a widget fit for a specific purpose.

If you provide meaningful training, you need to be fair in the application of said training. If you admit that you can train people with common existing skills to do most technology jobs, it's going to be hard to justify your cozy recruiting funnel with a small number of universities, picked on the basis of where a few bigshots in the company went to school.


One wonders what exactly happened to this industry, why instead of training, companies are offloading the work to universities, MOOCs, and boot camps.


It's all about cost and trying to offload it somewhere else. Sadly it's very predictable; most businesses work that way.

Then again, theoretically this gives a huge edge to the proactive learners.


The questions are calibrated to find junior people out of school. They can enter easily and get training afterwards. The training is not offloaded.

However, there is some challenge with having experienced people. They simply can't pass the screening after a while and can't get hired. And who's gonna teach the juniors if you don't have seniors???

Personally, I moved to finance. I find that experience, domain knowledge and maturity are valued more there :D


The problem is that GPA itself isn't necessarily a valid data point. It's less fallible than "gut instinct", as the author here seems eager to claim, but personality type can be more important than the ability to memorize facts.

I'd rather hire a programmer who knew less and could get along with others than a master dev who's a total a-hole.


It should probably have been titled "The Utter Uselessness of Unstructured Job Interviews", because that's the kind of interview the author criticizes.

In my personal experience, structured interviews can be very helpful in determining a candidate's abilities.


Are you sure you're using the same definitions as the author? In practice, across a pretty decent sample of large tech companies, I've never seen a truly structured interview outside of Matasano, where I was almost murdered by my employees for instituting them. I think we'd hear far more complaints about them if they were common.

It's possible that what you consider to be a structured interview is in fact what this author (and I) would call unstructured. Specifically: if the interviewer has any discretion about questions at all, the interview is probably fundamentally unstructured.

In a structured interview, the interviewer is less an interrogator than a proctor or a referee. Every candidate gets identical questions. The questions themselves are structured to generate answers that facilitate apples-to-apples comparisons between candidates: they generate lists of facts. The most common interview questions, those of the form "how would you solve this problem", are themselves not well suited to these kinds of interviews. It takes a lot of work to distill a structured interview out of those kinds of problems.
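
To make that concrete, here's a rough sketch of the mechanics (the questions and scoring anchors are invented for illustration, not taken from any real process I know of): every candidate gets the same questions, and the proctor records answers against a fixed rubric rather than forming an overall impression.

    # Same questions for every candidate, agreed on before any interviews happen.
    QUESTIONS = [
        "Describe a production incident you debugged. What was the root cause?",
        "For the last data store you chose, what trade-offs did you weigh?",
    ]

    # Fixed scoring anchors so two proctors score the same answer the same way.
    ANCHORS = {
        0: "no concrete facts, generalities only",
        1: "specific facts, but incomplete reasoning",
        2: "specific facts plus clear cause/trade-off reasoning",
    }

    def total_score(per_question_scores):
        # Sum the proctor's per-question rubric scores into one comparable number.
        assert all(s in ANCHORS for s in per_question_scores)
        return sum(per_question_scores)

    print(total_score([2, 1]))  # candidate A -> 3
    print(total_score([1, 1]))  # candidate B -> 2

The comparison across candidates falls out of the rubric, not out of the interviewer's judgment in the room.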


> Lists of facts

This has me mulling whether that might be a better approach to administering law-school exams than the traditional analyze-this-hypothetical-fact-situation approach. (I'm a part-time law professor.)

More generally: I wonder to what extent school examinations can draw useful lessons from job interviews.


How did you feel about your experience using structured interviews for hiring?


Truly structured interviews are better than even the most rigorous traditional interviews. They are also more expensive to design and much more painful to deliver. Some kind of interview is probably necessary for every serious tech hiring process. Organizations should be realistic about the low quality signal they'll get even from structured interviews. Take time away from interviews and feed that time to off-site work-sample testing.


I've tried work-sample testing with middling results:

-The more hoops a candidate has to jump through, the more likely they are to bail out of your recruiting funnel. This is especially bad for college/postgrad recruiting when you aren't the #1 employer in your field. Everyone wants to work for the Googles and Facebooks of the world. It's hard getting someone to spend a couple hours for your startup job.

-People cheat. We usually issue a short coding project, grade for correctness, then do a code review over Skype or face-to-face. Many candidates turn in the exact same responses. I've even seen people cheat and have a friend do the Skype session with a totally different guy flying out. Do you proctor your test in a secure center? Use an online service to lock down their machine and record? Both are pretty invasive. Switching up the questions constantly is tough and makes your signal noisier.

-Industriousness and raw intellect trump skills/knowledge most of the time. Sure there's a baseline level of skill required to train someone quickly enough, like I wouldn't hire someone who didn't know basic data structures, but work-sample tests are often biased to those with a very specific background. I don't want employees who are great at doing what I need today. I want ones who will be great at figuring out what to do years down the line.


First: if you make candidates do work-sample tests, you should reduce the amount of in-person interviewing you do to account for it. Up to a limit, most candidates would prefer your selection/qualification process to happen from their homes rather than from your office. Unfortunately, companies aren't serious enough about their tests to trust them, and do indeed tend to make this just another hoop.

Second, incorporate the work sample tests into your in-person (or even telephone) interviews. Now you have a (hopefully interesting) technical problem they ostensibly just solved to talk about. Your evaluation should be by formal rubric, not interview, but it's easy to see if someone actually did it. We had no problems at all with people cheating (of people we hired with this process, over about 4 years, we fired not a single one).

Finally, I could not be less interested in the kind of amateur psychoanalysis tech interviewers hope they're accomplishing as a side effect of quizzing people about code in their conference rooms.


I'm curious: in your structured interview process, what parts of the interview are set up for the candidate to interview your company? Interviews go both ways. I've turned down more jobs than I've accepted due to the company doing poorly in the interview.


From the hiring side, I agree. There's no other way to smoke out those whose creative writing skills extend to their resumes, and you get a pretty good idea of personality differences between candidates if you use the same pool of questions for a given position.

