The Utter Uselessness of Job Interviews (nytimes.com)
563 points by tomek_zemla 17 days ago | 400 comments



I find it quite problematic that researchers get to talk about their own research and present it as facts without anyone taking a critical look.

Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups) for predicting something as noisy as GPA. It is unclear to me that one would be able to detect effects of a reasonable size.

Furthermore, some claims that make it into the piece are at odds with the data:

> Strikingly, not one interviewer reported noticing that he or she was conducting a random interview. More striking still, the students who conducted random interviews rated the degree to which they “got to know” the interviewee slightly higher on average than those who conducted honest interviews.

Yet Table 3 in the paper shows there is no statistical evidence for this claim: the effects are swamped by the variance.

My point is not that this article is wrong; verifying or debunking the claims would take much more time than my quick glance. But that ought to be the responsibility of the newspaper, not of individual readers.

Politicians don’t get to write about the successes of their own policies. While there is a difference between researchers and politicians, I think we ought to be a bit more critical.


Of course we should be more critical; the methodology here is far from perfect. But right now people place a great deal of faith in interviewing, and this research suggests that faith may not be justified. As you mentioned, this is not the only research that reaches this conclusion. There is also the work of Daniel Kahneman [1], which I find pretty rigorous and which draws the same conclusions about interviewing.

So admittedly the title of this article should be "Maybe interviewing is not that useful" instead of "The utter uselessness of job interviews", but beyond that I find your comment unjustified. In fact, it's quite the opposite: I believe this kind of work has helped make us more critical by questioning some basic assumptions about interviewing that I would never have questioned just a couple of years ago.

> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate)

OK, this is interesting. Where is that mentioned?

[1] Thinking, Fast and Slow. There is a short article which mentions some of the results and has been discussed on HN a couple of times already: http://www.nytimes.com/2011/10/23/magazine/dont-blink-the-ha...


It's easy to present data that shows flaws with today's interviewing.

It's a lot harder to present a better way of predicting candidate performance in the workplace, along with substantial data indicating it's better than today's methods. Corporations would love more effective ways to predict performance before hiring.

Interviewing is terrible, but that doesn't mean there is a better option.


Nonsense. The first step is to acknowledge that interviewing doesn't work very well. You can't keep deluding yourself just because you haven't found a better way. Accept reality.

This is one of the biggest problems I see with business guys today. They want absolute certainty in a world which can't offer it. Start accepting that there is a lot of stuff we don't know and can't know at the moment and we simply have to work towards getting better.

You can't get better if you don't acknowledge that there is a problem in the first place.

These silly "must have 5+ years of experience" and "check all these technology boxes" requirements clearly show the industry is completely lost at the moment and isn't willing to acknowledge it.


Of course people know interviews aren't the greatest, but it's the best tool they have right now. What people are pushing back against is criticism without solutions.

There are a bunch of other testing methods that are effectively illegal in the USA too, such as IQ tests.


When users of code I've written complain that something sucks to use, I don't demand they come up with a better solution. Probably I don't even want to know what they think the answer is—unless they design user interfaces for a living, odds are their answer will be terrible and they won't even be happy with it if I build it.

This is as it should be—I'm paid to solve problems.

Why is this different?

The more I read about interviewing, the more I realize too many people think they have this problem solved—their amateur psychology is impeccable and their technical screens test for exactly the right things, no more and no less. Did they do a bunch of controlled studies to convince themselves of this, or are they mistaking what sounds good, or their intuition about the statistical outcomes of different techniques, for the truth?

Maybe the first step is to collectively realize we have close to no clue what we're doing, and are being asked to solve a hard problem: individually, to talk to someone for an hour and make a hiring recommendation. In aggregate, to make the decision based on a handful of these one-hour conversations.

Maybe the first step is to realize this is a problem worth trying to solve.

Maybe the first step is vocal non-acceptance.


> What people are pushing back against is criticism without solutions.

Which is bullshit. It's perfectly reasonable to criticise something without proposing an alternative. It's especially ridiculous to reject criticism provided without alternatives, when it's literally your job to do the work being criticised.

"Hey the way you're doing this part of your job produces results no better than random selection."

"Bring me solutions, not problems!"


Solution: toss a coin, or take the first n applicants for the trial period. Same effectiveness, much cheaper. No need for the interviewer.


I still don't understand why employers can't simply set up a two-week "trial contract" whereby promising candidates simply work for two weeks so everyone can actually see and judge, with real-world empirical data, how well the person does in the environment at the actual job.

Yes, yes... of course I know this could be gamed as well, but no matter: you can't really argue that this wouldn't be magnitudes better than the typical current/broken interview process.


This will get you the most desperate employees, not the ones you want. I've got a home loan to pay; I'm not switching jobs if you can only guarantee 2 weeks of employment. And if there are multiple offers around, I'll take the 3-month or full-time position, even if I'm halfway through the 2-week contract.

Also, not all jobs/codebases lend themselves to being productive in 2 weeks. I'd argue they should, but they don't.


The concept of a two week trial contract is interesting, but also as flawed as anything else.

I switched roles to a new team, and the first 2 weeks were a trainwreck. They had almost no predictive value for how I would do.

Now, there are many reasons why that was the case, and perhaps those underlying issues should be addressed. But from all the role changes I've had, the first two weeks show how well the group you're joining can onboard more than they show how productive the individual will be in the long term.


You would need a longer trial period; a few months is fine, with a clause that allows earlier termination.


Yes, because it would be magnitudes worse. Who do you give the assignment to? You have several hundred applicants; do all of them get the 2-week assignment? Who watches over them and answers their questions? After spending that much money, is the answer you get any better than a set of interviews? On the other hand, assume I am in a job and want to change positions, for any number of reasons: how many two-week jobs do I have to take? Or should I quit my job first?


In the 2004-2007 timeframe, the company I worked for hired software engineers via a staffing company for three-month contracts. We interviewed the candidates with the intention of making a full-time hire. As the contract term approached, the management team did a 360 review, then decided either to offer a full-time position or simply not renew the contract. This had some downsides, but overall I found it better than the alternative approaches I've tried before or since. It stopped being viable once software engineering became a seller's market.


This seems to work OK for companies working on greenfield stuff, though you'd still probably struggle to entice people to leave their jobs for you. For other companies, where technical debt and poor management are everywhere you look, it doesn't. It gives employees a chance to see what they're really dealing with and to start looking for work two months later.


If someone is currently employed it makes that arrangement difficult.


Indeed, though any kind of arrangement is difficult. You might end up in a dead-end job that doesn't match your skills, or that is otherwise soul-crushing, regardless of the method.


I took a job half a year ago. I'm still "learning the ropes" so to speak. I remember the first two weeks. Nothing would have been gleaned from them.


In most countries that don't have at-will employment (i.e., where you can't just fire them anyway), this is a real thing that actually happens.


Yet, literally every professionally employed person is in their current gig via that process.

It works well enough.


Counterexample: I did not apply to, nor interview for, my current gig, and I'm not cheating by working for myself.

And I bet almost everyone reading this has worked with at least one person they think shouldn't have survived the interview, and that person was making a boatload because they convinced the boss they're brilliant. Meanwhile 90% of their day was spent talking about how great they are, and 10% creating new bugs, and no one dared say anything because the thought of them being more "productive" was horrifying.

'It' mostly successfully matches employees to employers, but the quality of those matches may vary wildly.

Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.


> Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.

End of the day, everything evens out. People add the structure they need when they hire people. If your engineering interview process for some detail oriented gig is buzzword trivia with the CIO, the company will probably tank anyway. Conversely, if you do some nerd-fest whiteboard interview for a CTO in a bigger organization, you're probably not getting the right outcome either.


Did you read the article? People add the structure they think they need to interviews, but they are clearly able to fool themselves into worsening their judgments by requiring steps that not only don't help but actively hurt.

And this is a near universal phenomenon. Almost everyone wants to "get to know the candidate."


That's a tautology, and I'm not sure it tells us anything. If you replaced interviewing with a footrace across hot coals, the same assertion would be (just as vacuously) true.


It's deeper than that.

Everyone hates the process, and companies have invested major dollars and hours trying to improve it. At the end of the day, little has changed since 1917. You either acqui-hire, get a strong referral, or interview a pool of unknown applicants.

The tests and quizzes are little different from how a city hired an accountant in 1917. The old boys' network evolved. Then you're left with the rest.


I strongly disagree. There are all sorts of situations where having a bad option is much worse than having no option, hiring and medicine clearly among them.

> Corporations would love more effective ways to determine effectiveness/performance before hiring.

This is irrelevant to evaluating the current methods. Even if there is no replacement, if they are useless, we should know it, full stop.


That feels a bit like saying we should just stick to bloodletting and leeches because we don't know any more effective way of treating disease. Just because we don't have a superior alternative doesn't mean the current method is effective.


A)

Leeches are often used in literal modern hospitals in the developed world as an effective treatment for certain ills.

B)

If we had no alternative, we should stick with it but look at other things in the meantime. You seem to be advocating doing nothing at all, which is trivially easy to show works for no one.


Yes, leeches have extremely limited use. However, bloodletting and leeches killed far more people than they ever "treated."


We have had a better way for decades; it's banned. What do you think all these algorithm questions are supposed to do?


Any evidence to say that algorithm questions are a better indicator of job performance? Most development work isn't algorithm heavy at all.


I doubt there's much, in general, since a lot of jobs just need a floor. But when it's used as a proxy for an IQ test, the better you do at algorithms, the higher your IQ probably is, and that typically correlates with better job performance, or with being able to transform one's job into one with higher impact. (Even more so if they have high Conscientiousness too, but I don't think algorithms would correlate much with that.) It's also a weak proxy for seriousness. I hate algorithm questions (though fortunately not algorithms), and almost everyone I talk to hates them, but if you're not expecting them and don't at least know the basic ones, you're not serious about applying to a random tech job. (Which is fine; I don't mind that some people will get in a huff and walk out when asked to write a graph-search algorithm or something, as if it's beneath them or useless since the day job never does such things - they just weren't serious about applying to that company.) Some tech companies have gotten rid of them, which is great, but you can't count on that yet as a candidate.


The problem with testing algorithms is that it in no way tests intelligence. I would guess that 9 out of 10 programmers who know an algorithm would not be able to derive it from first principles. So you are just testing esoteric knowledge - it's qualitatively no different than asking someone questions about a specific framework or API.

You could make the argument that algorithms tend to be studied more by smarter people, but if that's what you're going for you may as well ask them about their hobbies, and hire the person that is into playing chess, or doing astronomy (or whatever intellectual pursuit you care to name).

If, on the other hand, you are interested in a person's ability to code, ask them to do so. The last time I had to hire someone, I wrote a small application with one module that was deliberately written in an obfuscated style. I asked candidates to bring that module under control - rewrite it in a readable code style. To do this, successful candidates needed to identify what the current code was doing by examining the public interfaces in a debugger, document what the calls seemed to do, prepare unit tests, and then rewrite the module in a readable style. It took most candidates about a day to complete.

At the end of that, you get to see a candidate's ability to read code, use a debugger, write unit tests, write documentation, and write well-structured code, which is pretty good coverage of the typical tasks in a developer's day. I feel this gives a much more realistic assessment of a candidate's capabilities than asking questions about a more or less randomly chosen algorithm.
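
To make that concrete, here is a tiny hypothetical before/after in the spirit of that exercise (my own illustration, not the actual module):

    class RefactorDemo {
        // Hypothetical "before": deliberately obfuscated, as in the exercise.
        static int x(int[] a) { int r = a[0]; for (int i = 1; i < a.length; r = a[i] > r ? a[i] : r, i++); return r; }

        // Hypothetical "after": what a successful candidate might hand back,
        // with a meaningful name, a guard clause, and an obvious unit-test target.
        /** Returns the largest value in a non-empty array. */
        static int max(int[] values) {
            if (values.length == 0) throw new IllegalArgumentException("empty array");
            int best = values[0];
            for (int v : values) {
                if (v > best) best = v; // keep the running maximum
            }
            return best;
        }
    }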


> It took most candidates about a day to complete.

This is an issue as well. If you aren't Google, a day is too much investment for a single job opportunity, especially if you're already employed.


I don't think someone with an IQ of 80 could program Dijkstra's algorithm given a mathematical description of it with diagrams. I'm even skeptical about binary search: they might understand an intuitive explanation involving a phone book, but I don't think they could program it. And even if they could, I think someone with an IQ of 120 would do it much faster, though both solutions would likely have the integer-overflow bug that was in Java's implementation for a long time. So I think algorithms do test intelligence, just not as well as an actual IQ test - and they can easily be gamed by sheer memorization, whereas good IQ tests can't. I agree that other things, like the ability to play chess, would probably test just as well as algorithms. If the industry switched to testing candidates on chess problems, or on playing a computer self-limited to some specific Elo, you can bet that everyone serious about getting a job in the industry would start playing a lot of chess, and those with higher IQs would on average play better chess.
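
(For reference, a minimal sketch of the overflow I mean - this is the midpoint bug that sat in java.util.Arrays.binarySearch for years:)

    class SafeBinarySearch {
        static int binarySearch(int[] a, int key) {
            int low = 0, high = a.length - 1;
            while (low <= high) {
                // Buggy classic: int mid = (low + high) / 2;
                // With more than ~2^30 elements, low + high can exceed
                // Integer.MAX_VALUE and wrap to a negative index.
                int mid = low + (high - low) / 2; // safe midpoint
                if (a[mid] < key) low = mid + 1;
                else if (a[mid] > key) high = mid - 1;
                else return mid;                  // key found
            }
            return -(low + 1); // not found; encodes insertion point like Arrays.binarySearch
        }
    }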

When I have to give an interview and have to include an algorithms section, I make candidates type code. Whether that's on a phone screen with a shared online text editor or in person with their laptop / an interview laptop, I want them to type stuff, not just rely on whiteboard pseudo-code and diagramming. As a vim user I make allowances for the fact that their editing environment may not be what they're used to, but even if I were forced to use Notepad I could still bang out a function to test the even/oddness of a number (my own fizzbuzz) pretty quickly. So I at least make sure to test coding, even if poorly.
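
(For what it's worth, the even/odd bar I mean is literally this small - my own illustration:)

    class Parity {
        static boolean isEven(int n) {
            return n % 2 == 0; // even numbers leave no remainder mod 2
        }
    }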

I agree work-sample tests are the best, but as another commenter noted, if they take a lot of time for the applicant, you're going to get people who refuse them just as some refuse to play the algorithms game. Especially if people have a GitHub repo, especially if some of the projects they've worked on have had more committers than just themselves, and especially if they're currently employed as a developer at some other company doing general software. Unless you're trying to build a top team, which most projects don't need, you're wasting a lot of time trying to rank beyond "would work out OK" and "would not work out at all". I have a section in my phone screen that tests for regex knowledge. I'm primarily just testing whether they know the concept, or whether, when faced with a problem that regexes can solve (which actually does happen from time to time), they reach for writing some custom parser instead. If they vaguely remember there's a way to specify a pattern and find matches, that's a pass. If they know grep or their language of choice's regex syntax and can give a full solution, great, I'll rank them slightly higher than someone who just knows the concept, but all I really care about is the concept. If they don't know the concept, that's a strong sign (to me) they won't work out.
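
(For illustration, the kind of "full solution" I mean - the task and names here are a made-up example, not from my actual screen:)

    import java.util.regex.Matcher;
    import java.util.regex.Pattern;

    class RegexScreen {
        public static void main(String[] args) {
            // Pull ISO dates out of a log line instead of hand-writing a parser.
            String line = "2017-03-05 ERROR disk full; retried 2017-03-06";
            Pattern date = Pattern.compile("\\d{4}-\\d{2}-\\d{2}");
            Matcher m = date.matcher(line);
            while (m.find()) {
                System.out.println(m.group()); // prints each date found
            }
        }
    }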

I tried a semi-work-sample test with an intern candidate a few months ago, instead of a different test, based on experience with a prior intern who struggled on something I thought was basic, which left me wondering why I hadn't caught it in the phone screen. Basically, I gave them some stripped-down code from several files that looks a lot like what we have in production (JS, using Backbone), explained the overall mapping from bits of code to what could be shown on the screen, and essentially asked them to add a new component (already written) to one part of the screen by modifying or filling in the functions in a few places. It required them to read and understand some alien code, see what they could ignore, understand what was asked, and then do it (initialize something, pass it around, up to I think 3 indirect function calls of nesting, call a couple of things on it). The candidate got through it; I'm not sure the old intern would have...


What is the better way that is banned?


I suspect he is referring to IQ tests. They have a pretty high correlation with job success and are banned.


In my experience, IQ tests are a good indication of your ability to take IQ tests and very little else. Of course, you can differentiate an absolute dunce from someone who's not, but nothing more subtle than that.


http://www1.udel.edu/educ/gottfredson/reprints/1997whygmatte...

Why g Matters: The Complexity of Everyday Life

Personnel selection research provides much evidence that intelligence (g) is an important predictor of performance in training and on the job, especially in higher level work. This article provides evidence that g has pervasive utility in work settings because it is essentially the ability to deal with cognitive complexity, in particular, with complex information processing. The more complex a work task, the greater the advantages that higher g confers in performing it well. Everyday tasks, like job duties, also differ in their level of complexity. The importance of intelligence therefore differs systematically across different arenas of social life as well as economic endeavor. Data from the National Adult Literacy Survey are used to show how higher levels of cognitive ability systematically improve individuals’ odds of dealing successfully with the ordinary demands of modern life (such as banking, using maps and transportation schedules, reading and understanding forms, interpreting news articles). These and other data are summarized to illustrate how the advantages of higher g, even when they are small, cumulate to affect the overall life chances of individuals at different ranges of the IQ bell curve. The article concludes by suggesting ways to reduce the risks for low-IQ individuals of being left behind by an increasingly complex postindustrial economy.


Much like the way that hackerrank tests how good you are at hackerrank tests?


I wonder how much correlation with IQ is allowed before custom algorithms questions qualify as an illegal IQ test.


I don't think the issue is correlation, but rather disparate impact[0].

Algorithmic tests have a pretty low bar to clear in IT jobs. Using them for secretaries or lab technicians is another story.

https://en.wikipedia.org/wiki/Disparate_impact


Employees will test the limits with lawsuits. I know at least one large state lost a class action lawsuit due to racial bias on civil service exams.

The problem with comparing IQ results at hire to employment success is that employment outcome is difficult to define over time. You're also unlikely to get statistically relevant data without focusing on large organizations with standardized HR processes. Most of the research is based on supervisory evaluations, which are not the most reliable indicators of anything for a variety of reasons.

The other thing I find amusing is that business folk who talk about this miss the fact that there are large workforces in the US that have used or still use standardized testing like this to hire and promote. Those are government bureaucracies, which function relatively well but are hardly a model that most folks advocating this would aspire to.


Stem cells and steroids maybe?


> OK, this is interesting. Where is that mentioned?

https://mobile.nytimes.com/2015/08/28/science/many-social-sc... among other places


People have too much faith in the efficacy of interviewing, ergo we should give slipshod journalism and science lacking rigor a free pass?

I'm all for upending the status quo with interviews, but let's not throw out science and reporting just to get there.


My point is not that this work is flawed, or that there should not be an article reporting on this research or on the topic of interviewing. Rather, I think it would be better for a third party to write about the topic in a more objective manner than a professor promoting his own research (and thus with skewed incentives).

In particular, I was disappointed to find a (short) paragraph in the article that I find bogus. That does not mean the article shouldn't have been posted in the first place, but just that this paragraph should have been edited or removed.

I think there is something wrong when I feel I have to look up the actual research paper and check whether the claims made in an article are supported by data and methodology. I should not have to be a skeptic when reading New York Times articles.

To be fair, it is posted in the opinion section, but should we really just take this article as an opinion? That doesn't feel right to me either.

Then, on to your last question and Daniel Kahneman: we could talk about that for a long time, but let me keep it short. The best resource I know (though technical) is the blog of Andrew Gelman ([1][2] turned up in a 5-second Google search, but there is much more on his blog), and Daniel Kahneman himself has "admitted" flaws in his studies [3][4].

[1] http://andrewgelman.com/2014/09/03/disagree-alan-turing-dani... [2] http://andrewgelman.com/2016/06/26/29449/ [3] http://retractionwatch.com/2017/02/20/placed-much-faith-unde... [4] https://replicationindex.wordpress.com/2017/02/02/reconstruc...


I worked in the IT arm of a household-name US non-IT company (just to provide basic context). Eventually we determined that interviewing just wasn't that helpful. We started telling the recruiting company, "Send me the best you've got that's available by Tuesday." Now certainly the recruiter is going to do some selection biasing there, but we found we had just as much success as with our previous process, which had a phone interview, a face-to-face interview, and a test.


The fact that psychology (and other social sciences, and even medicine) has a replication crisis in spades is well known. Just Google it if you haven't been following general science news for the last few years.

Even the "hard" sciences have trouble, because journals prefer publishing new and positive results, rather than replications or negatives.


The claim you mention, about getting to know the person better in random interviews, as well as "People can’t help seeing signals, even in noise," is misleading and unsurprising.

People ask questions in interviews that they want to know the answer to and that could go either way. Such questions all have equivalent expected surprise: either both answers are unsurprising, or one answer is surprising but you think you already know the answer is the other one.

If interviewers were asking questions like "is 2+2=4?", they would have detected the random interviews far more easily, but they wouldn't have been trying very hard to get to know the person.

As for getting to know the person: the more surprising someone's answers are, assuming you believe they are telling the truth, the more that distinguishes them from the "average person who gave the answers I expected", so you say you "got to know" them. This is unsurprising.

This isn't to defend unstructured interviews; other studies have long shown them to be worse than structured interviews and test scores. If I had to guess, the only reason the research in the article got published as novel was the random-interview part.

Edit: Here's a table from a meta-analysis of many studies on the correlation between different factors and job performance: http://imgur.com/a/YRFTh. Basically, unstructured interviews aren't as good as structured ones, but they are better than nothing. Work samples are the best; structured interviews and IQ tests tie for second.

Note that this meta-analysis combines a number of different fields to yield general observations; a specific field may have different results. But in expectation, for a randomly selected field, these are fairly solid results, and I don't expect individual fields to vary from them too much.


> Politicians don’t get to write about the successes of their own policies

Politicians boast about successes constantly. I don't understand what you mean by "get to".


They also run expensive ad campaigns promoting their own policies, at least here in Australia. They don't even have to be successful policies to be able to talk about them...


Politicians _always_ promote the success of their own policies.


I think the parent means, we don't take as objective fact a politician's claims that their policies worked; we interpret it as self-aggrandizement and scrutinize the claims quite heavily. And that—given the push to publish and the fact that null-result studies aren't very publishable—we should likely do the same for research conclusions.


I understood, but I think recent evidence in the US suggests that a great many people absolutely take as objective fact what a politician claims. A similar number probably absolutely believe the opposite with little or no evidence. Net, I think citing politicians was perhaps not the best analogy.


Sorry I was not more clear. What I meant was that the NY Times employs journalists, fact checkers, and editors to validate stories, say on politics, to make sure, to the best of their abilities, that the articles they publish are correct.

Why is that not the case here, where a professor is allowed to sell his own work? It is as if Obama were the NYT reporter covering Obamacare.

That people believe politicians blindly is a topic for another day :)


Maybe, but doesn't a claim like "a 30-minute conversation with someone is a strong indicator of job performance" deserve scrutiny of its own?


Unless you're talking about a very high-level position that either requires substantial leadership traits or very specialized knowledge, interviewing has never been about finding the right person for the job, it's been about finding a right person for the job. The reality is that most jobs can be done successfully by a large number of people. The accuracy of the assessment of the candidate's abilities is, to my mind, a secondary concern in the interview process and yet, just as in this study, is the only part of the interview process studied and critiqued.

But what I find far more important in the interview process is involving current team members in the process of selecting new coworkers. It's one way of getting teams to be bought into a feeling of shared purpose and is the first step in establishing a working relationship. If you don't give at least some of the current team a role in the hiring process, teams will feel imposed upon by those hiring and won't be as understanding about flaws in those added to the teams.

Focusing on selecting the "right candidate" is really myopic in a situation where there are likely many right candidates. We should, instead, be focusing on not selecting a wrong candidate and fostering the right team dynamic. We already know that strong teams significantly outperform strong individual performers who don't cooperate. Yet hiring still seems focused on optimizing for strong individual contributions. And I've yet to see a study that looks at the flaws of the interview process in building ineffective teams.


I think you may be misconstruing the argument here. I agree that any number of people will fit -- but the problem is, what if interviews aren't even selecting "a" right person for the job?


> what if interviews aren't even selecting "a" right person for the job?

We know that interviews are selecting a right person some percentage of the time. That percentage will never be 0 or 100. We will always have to accept some bad hires.

I would rather that successful hire percentage be somewhat lower and keep the team engaged in defining culture and hiring standards than to give up team involvement for a higher right-person rate.

My main point is that the people studying this issue and arguing against the interview process tend to look at less than half of its benefit. If we're going to ditch the interview process, whatever replaces it needs to have the same property of involving the existing team, or it is, to my mind, automatically worse than what we have now. Because while we're all aware of how flawed the interview process can be, it does work to some extent. When I was hiring, only 2 out of the 50 or so people I hired didn't work out. Would I like to have avoided hiring one or both of them? Sure. Am I willing to sacrifice the team-involvement benefits to do so? Absolutely not.


But you have no control group. How do you know your success can be attributed to the interviews?


Of course it does. However, the article, and perhaps the underlying paper, may be as useless as the interviews they criticize.


It's also worth noting that those students are probably very inexperienced interviewers.


Indeed, warrants further study.


I don't see the problem. The issue is hardly that the researchers haven't done a study that is perfect in every possible way, but rather the blind trust in our ability to interview and hire people. This research paper does not come out of the blue. Many organizations and people have observed for quite some time that interviews are broken.

In fact, there is an awful lot of wrong stuff society keeps doing despite many years of evidence suggesting it is stupid. The folly of high CEO salaries, for example, has been demonstrated quite well, yet business keeps a blind faith in them. Getting employees in complicated jobs to perform by rewarding them on extremely narrow metrics has also proven counterproductive, yet the MBA crowd refuses to believe it doesn't work.

Business practices often seem to be more like religion than anything founded on reality. It is a GOOD thing that some researchers are trying to do their bit to correct this flawed picture.


> I find it quite problematic that researchers get to talk about their own research and present it as facts without anyone taking a critical look.

Are you suggesting anything out of the ordinary is going on here? Doesn't one have to present one's work as legitimate and wait for feedback? So long as they are open to that feedback, I don't really get this argument.

> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups) for predicting something as noisy as GPA. It is unclear to me that one would be able to detect effects of a reasonable size.

So present that back to them as a refutation. Are you waiting for someone else to do it? Why are you debating its reliability here?


There are many, many studies showing the same thing; this study is just one example. See, e.g., a meta-analysis of your choice:

http://psycnet.apa.org/journals/apl/79/4/599/

They're predictive, but not very.

Structured interviews are better, but not really by much.

The problem with all of these things is that they work, but not very well, and people make too much of any one of them. An interview is a very small slice of behavior, even if it's structured very well.

People make too much of them.


Psychology is a science like astrology is a science. Sample sizes are the least relevant of the flaws with its studies.


You are absolutely right!


Laszlo Bock (former SVP of People at Google) did a great job summarizing decades of research around structured interviewing in his book 'Work Rules!'

For a quick reference, the two defining criteria for a structured interview are:

1.) They use one or several consistent set(s) of questions, and

2.) There are clear criteria for assessing responses

That second point is really important. You can't just ask every candidate the same set of questions and call the process structured: you need to understand what a "one-star" response vs. a "five-star" response actually looks or sounds like. Training and calibrating all of the interviewers in a large company around a similar rating system is nightmarish, so most companies don't bother.

The book also outlines that pairing a work sample with a structured interview is one of the most accurate methods of hiring.

If anyone is interested in some in-depth structured interview questions or work sample ideas, feel free to email me. I've spent the last few years working on a company in the interviewing space and would love to chat.


Don't take his word as the word of God from heaven. Google's interview process is just as shitty as anyone else's, and shittier than some I've been through. It selects for people who do well on the whiteboard under pressure, who are very often not the best workers overall. It also wastes a ton of time on both the employer's and the candidate's side. Source: interviewed ~100 people in my 6+ years at Google.


Yes, I don't think their process is magic. Google gets a good workforce because they are generous and prestigious, which means a lot of good people apply there. And Google is willing to say no to a lot of people in their search for good people. They reject a lot of candidates who would probably have worked out just fine.


Or, put another way, they choose to accept a high rate of false negatives to avoid false positives.


> Or, put another way, they choose to accept a high rate of false negatives to avoid false positives.

Which is how it is typically presented, because it sounds much better than "reject a lot of candidates who would probably have worked out just fine". It is useful to perceive both the potential value in an approach like this and its shortcomings. Google can absorb the massive expense in man-hours, lost opportunity, etc. that comes with trying to craft genuinely predictive interview processes, but a lot of the companies trying to emulate them can't. Too often, interviewees don't realize a process of this sort is stacked against them, and interviewers don't appreciate the negatives of adopting a still-nascent approach that sounds more reliable simply because it is quantitative - or of assuming that since Google does it, it must work.


Interviewing is hard. I wonder if a number of great candidates just refuse to interview with Google because it's too cumbersome? I know a couple of great folks who dropped out halfway because they couldn't be bothered with Google's lack of organization and lengthy process.

It's not like Google pays the best or still has the best workplace. It's a large company with large-company politics and red tape.


I wonder if a number of great candidates just refuse to interview with Google because it's too cumbersome?

I've met a few such people in this forum. Not many.

I'm not sure how one would even begin getting a rigorous estimate of that number. What is a credible sample of "great candidates" in this industry?


I'm not sure I believe that, actually. As elaborate as the process is, there are still plenty of false positives. I'm not convinced a simpler process would have produced a materially different outcome.


That's only a good tradeoff if the reduction in false positives is worth the false negatives it generates.


It's more that they're optimizing for a very specific set of skills/experience - the ability to answer a particular type of problem (e.g. from "Cracking the Coding Interview") on a whiteboard in under 45 minutes.


The trouble with that is it probably makes false positives more likely to slip through because they have to interview more people to fill a position...


Which is unfortunately wrong (unsafe), as it assumes the noise is random.

That seems to be an unproven assumption, and quite likely a wrong one. For example, if good people turn out to be less interested in "honing their interview skills", that adds parasitic noise to the signal.


bingo


It selects for people who do well on the whiteboard under pressure, who are very often not the best workers overall.

This gets tossed around as a truism. I'm curious, does anyone have any evidence for it? Call me a skeptic, but these kinds of "everyone knows" truths are often wrong.

Google and other such companies have a vested interest in getting hiring right. They also have the wherewithal to conduct studies, collect data, and let the evidence guide their hiring practices. Google in particular has shown a willingness to completely overhaul their practices by eliminating ineffective practices (remember their reputation for "thought puzzle" type questions?).

So I'm curious if you have anything to back up the idea that they're doing it all wrong.


To quote Abraham Lincoln: "Do not trust anything you read on the internet".

I know it from my own experience and that of many others who have been through the gauntlet. Take it for what it's worth, I'm not selling you anything. I don't look impressive on the whiteboard, but I do have a rather impressive track record. Something doesn't line up. :-)

FWIW, as far as I recall there was another experiment at Google where they tried to establish a correlation between interview performance and job performance, and found no meaningful correlation. This, of course, is not fully representative, because it does not include poor whiteboard performers.


Don't take this the wrong way, but the anecdotes of people who didn't make it through "the gauntlet" are quite likely to be biased. Those of people who did make it are as well.

This is not data.


Did I say it was "data"? The closest anyone has come to "data" on this (that I know of) is Google, in that experiment where they just hired people at random. But they decided to ignore the results and stick to the soul crushing 5 hour interviews anyway, so data did not change the relevant people's minds.


Looking up the actual experiment, you're completely misrepresenting the conclusions. Here: https://www.google.com/amp/business.financialpost.com/entrep...

These were their conclusions:

1. The ability to hire well is random. (This refers to individuals, not the system as a whole.)

2. Forget brain-teasers; focus on behavioral questions in interviews rather than hypotheticals.

3. Consistency matters for leaders.

4. Grades don't predict anything about who is going to be a successful employee. (School grades, that is.)

So, stop making stuff up from behind your throwaway account.


Ouch, "making stuff up". That's harsh, my man. Thus far I've made absolutely nothing up in this thread, or indeed in any others under this account. And you're using a PR puff piece written by Google HR to discount years of personal experience that I'm sharing here. You're free to not believe me, but let's not level accusations without evidence, OK?


And yet you fail to provide a non-puff-piece link to the study you're talking about?


> Google in particular has shown a willingness to

Google is collecting and analysing data to improve its hiring process... not to improve the hiring process of the industry at large.

There is an effectively limitless supply of great engineers who will jump through hoops to work for Google.

That's just not true for the vast majority of the industry.


Is it really a truism? If anything, the general industry consensus is the opposite, that Google engineers are brilliant, the cream of the crop. Every big tech company and Google wannabe emulates their interview process. My Quora feed for whatever reason is littered with questions pertaining to how amazing working at Google is. In my experience, the people who question the effectiveness of Google style interviews seem to be in the minority.


> It selects for people who do well on the whiteboard under pressure

If this were true, Google would have crashed and burned a long time ago.

Obviously, their interview process selects for much more versatile engineers than that. Engineers who not only produce reliable and maintainable code, but who can actually come up with products that generate billions of dollars over the years.


A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates. The proof is pretty obvious: what companies pay as much as Google and are as prestigious as Google and have bad engineers?


> what companies pay as much as Google and

Actually, Google pays under the average of top companies because, as a top-tier company, they can afford to. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.

> A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates.

Their pay and prestige will attract even more bad candidates.

How do you separate good from bad candidates?

That's right: a kick-ass interview process.


>Actually, Google pays under the average of top companies because, as a top-tier company, they can afford to. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.

Compared to whom? They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels. You might be correct that certain other companies pay more than Google (Netflix, maybe?), but they're certainly above average.


> They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels.

Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.

If one's goal is to maximize compensation (particularly in the short term), a Google offer is better used to get a higher paying offer at one of their competitors.


> Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.

I don't think that's true. While Google is by all accounts (including my personal experience) unwilling to move significantly on base salary, in my experience (and that of others I've talked to) they'll happily match pretty much any offer with stock.


What they will not do is adjust for cost of living or differences in taxation when comparing an offer in Mountain View with one in a cheaper locale such as Seattle (which is where two of the companies you listed above are headquartered).

Perhaps I dealt with a particularly nasty Google recruiter. I felt like the recruiter had misrepresented the health benefits and relocation package once I got the actual offer letter and related paperwork.


Ah, you're correct. I was specifically told "we don't take cost of living into account when deciding compensation" or similar language (which isn't strictly true either).

That said, from everything I've seen, compensation growth at Google is faster than at the other companies, which means that someone coming in at L>3 will likely end up with greater compensation at Google than elsewhere.

I'm curious as to how they misrepresented things. I was actually pleasantly surprised once I got here by how extensive the benefits were, but I'm always interested in learning more, since while I actually think that four Google interviews are a decent way to judge someone for Google, I really hate their interview/negotiation process.


>If this were true, Google would have crashed and burned a long time ago.

How many of their projects succeed simply because they are Google? Some major ones (like Android) come to mind.


I guess the issue is that even the whiteboard style interviews can be gamed?


How does this work in practice?

If there are consistently used questions and specific criteria for assessing responses, can a candidate just learn the likely questions and what constitutes the "right" answer?


This is a good and important question. Also: while I have a lot of respect for people at Google trying to innovate on hiring, make no mistake: Google's heart is in the right place, but they aren't at the forefront of structured hiring, and their hiring processes are notoriously capricious.

The reality is that generating good questions for a structured interview is difficult. You can't just pose a programming problem. As you've noted, most programming problems have multiple good answers, and differentiating between multiple good answers from different candidates re-introduces subjectivity. Your brain would rather convince you that one valid answer is less good than another, even when it's not, than admit to you that it can't differentiate or can't generate a narrative for you. The part of your brain that generates narratives is incredibly powerful and does not care about how accurate your hiring process ends up being.

What we tried to do was create questions that generated lists of facts. "Spot all the errors in this code" would be an example of this approach (though not one of the three we used). We went into the process wanting to embrace epistemological uncertainty, generating a historical trail of data that we could retrofit to candidate performance.
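
(For illustration, a hypothetical "spot all the errors" snippet in that spirit - not one of the three we used. The appeal is that each bug is a checkable fact rather than a judgment call:)

    class SpotTheErrors {
        // Intended bugs: (1) "i <= items.length" walks one past the end;
        // (2) integer division truncates the average; (3) once the loop is
        // fixed, an empty array divides by zero.
        static double average(int[] items) {
            int sum = 0;
            for (int i = 0; i <= items.length; i++) { // bug 1
                sum += items[i];
            }
            return sum / items.length; // bugs 2 and 3
        }
    }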

In the end, work-sample testing was so much more powerful a predictor for us that we never fully got around to analyzing the data. Sometimes we'd get candidates who clearly generated inferior "lists of facts"; I think there may have been 1-2 instances where that outcome actually overruled work-sample testing delivered prior (out of a few tens of engineering hires and probably ~100 interviews).


Making sure I understand you: When you refer to work sample tests here, that refers to the crypto challenges and things like that that you published at Matasano? And you're saying that was much more predictive than the list of facts methods, right?


That's what people ordinarily assume I mean, but while our work sample challenges were similar to the cryptopals and Microcorruption stuff, they were not the same, or even derived from them. They were designed specifically to qualify candidates, and in fact predated our public challenges.


Just to further clarify, by work sample you do NOT mean an example of previously produced work? This still seems fickle: my dozen years of work output (for example, building and running a site with perfect uptime for millions of users) is not as valid an indicator of my future performance as how I happen to score on some arbitrary timed test?


Absolutely not. Samples of previous work are deceptive at the best of times.


Ahh, interesting. Would you mind sharing an example of how they were different, or how the fact that they were specifically for qualification changed the thought process? I'm working on making tech interviewing better and am fascinated by this stuff.


NDAs? Everything I've done so far is proprietary and judiciously covered by various binding legal agreements.

I wouldn't breach such a contract, nor should you hire someone who would be willing to do so. The sole exception is cases where it would be in the public interest, i.e. whistle-blowing.


> but they aren't at the forefront of structured hiring,

Is anyone though?


Google tries to keep interview questions confidential - that's why candidates sign an NDA - and periodically rotates out questions that have appeared in public. Many engineers are also continually trying to think up new questions as well, usually based on their work.

For most questions, there's no "right" answer, but there are a set of points that the interviewer wants to see you touch on. For example, they might first want to see that you can code up a naive brute-force variant of the algorithm, checking whether you know the programming language claimed and can think through the problem, and ask you the algorithmic complexity. Then they'll want to see if you can get a divide-and-conquer or dynamic programming variant with lower time complexity. Then they might ask "What if it has to be an online algorithm, where new input arrives before the computation finishes?" Then they'll ask "How would you distribute this over 1000 machines, and what are the failure modes?"

At each stage, they're watching how you answer and where you get stuck. If you ask clarifying questions or take time to think before diving into coding, that's a plus. If you have never heard of the problem before (this is frequent - many questions are not in textbooks), they want to see how you reason through it and break it down into subproblems that are similar to textbook problems. If you miss language trivia, most people don't care; when I gave interviews I'd usually volunteer the answer if the candidate missed some API call, and when I was the candidate my interviewers did the same. If you don't know how to solve the problem and can't make any effort to move toward a solution, that's a big negative. Similarly if you don't know what the concept of big-O is or why it's important.
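
(As a hypothetical illustration of that escalation - the real questions are confidential, so let Fibonacci stand in for whatever problem is actually posed:)

    class Escalation {
        // Step 1: naive brute force - exponential time, recomputes subproblems.
        static long fibNaive(int n) {
            if (n < 2) return n;
            return fibNaive(n - 1) + fibNaive(n - 2); // O(2^n)
        }

        // Step 2: dynamic programming - linear time, constant space.
        static long fibDp(int n) {
            long prev = 0, curr = 1;
            for (int i = 0; i < n; i++) {
                long next = prev + curr; // reuse the two previous results
                prev = curr;
                curr = next;
            }
            return prev; // O(n) time, O(1) space
        }
    }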


Maybe, but that doesn't sound like my interview at Google, which was interesting but really rather random. Perhaps most notably, one interviewer became irate that I didn't know the "ps -o" flags off of the top of my head. :-/


According to `man ps` on my Mac, there are nearly seventy of them - how many did he want you to know?!


It's been a long time since I interviewed an ops person. But when I did something like this, I'd ask, "Name as many arguments to ps as you can." (Or ls; also good.) I didn't really care which flags they knew. What I was looking for was a pattern. Anybody good knows some arguments cold. Often they know them so well as part of a phrase that they have to think about what the individual ones mean, which tells me that they've done something enough that it's become a habit. Then they'll know a few others that they use occasionally. And then they'll stretch to name a few more that are obscure for them.

But the real magic comes when they run out. What do they do then? Good people say that they don't know. Often, they will take guesses at a few more based on general principles. E.g., "I'm sure there are arguments for sorting, but I only use top for that." Or, "There must be more ways to filter, but I only do it by user or command." They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things. But what they won't do is bluff or make things up.

And honestly, that's my biggest tip for hiring: find people who like understanding things and know a good amount for their level, but are willing to say when they don't know. People who can't do that are dangerous.


I feel pretty comfortable with my Unix abilities, and have held ops roles in the past, and I would have bombed this question.


I doubt that. The only way to bomb it is to make shit up, and I don't think you're the sort to do that.

I don't score it by how many option-letters people know. I score it by how well they demonstrate familiarity with a basic ops activity, which is finding out what's running.

Even if somebody said they never used ps, or like below, only use one incantation, that isn't a problem, because my follow-up there would be to ask how you'd find out what's using all the RAM or what's using a lot of CPU.

Maybe they're just used to using different tools, in which case I can ask for details for those tools. (And then check manually later to make sure I'm not getting snowed.) But if they've done a bunch of ops stuff, they'll demonstrate the sort of know-from-doing level of familiarity with something.

In theory I could just ask the broader question to start. But some people are good talkers, and I'm looking for evidence that people are good doers. All that said, I've shifted mainly to more experiential interview processes, as actually doing is the best way to see if somebody's a doer.


I would almost assuredly stop the interview at this point and try to figure out what your malfunction is.

To me any question where the obvious answer is "I can trivially look all that up in the man pages" is a gigantic red flag that I don't want to work at that place.

If I'm feeling curious that day, I might begin a conversation about what you think you are testing for with that question, as I've spent a fair bit of my career studying hiring pipelines. But if I hadn't had my coffee yet or were irritated or something I'd probably just ask to talk to someone else.


That's fine. If you go into something assuming that anything you don't understand is a malfunction, and you excuse your rude behavior with your mood and/or caffeine level, you're probably not the kind of person I'd want to spend a lot of time working with.

That said, as I've explained elsewhere in the thread, the point isn't to see how many they know. It's to see that a) they know some portion of it that people doing the work would know, and b) have reactions to the rest that indicate useful work habits and attitudes.

A perfectly fine first answer is "I only use ps -aux. I'd just look in the man page if I needed anything else." Because then we could have a good discussion of how they use that output to figure out things, what sort of things they'd expect to see in the man page, and what other tools they use to see what's going on.


I'm pretty confident both of you are people I'd be happy to work with in the future, so I want to use this as an opportunity to point out that your reaction to how Kasey said he would respond to your interview question practically guarantees that you would fail to acquire Kasey for your team. I know Kasey a little bit better than I know you (he's a Chicago person), so I'll just sum this up as "that's not a point in the win column for this interviewing strategy".

I know you're not asking people recite "ps" flags from memory. The problem is, you're looking for a subjective "X-factor" in how someone answers a trivia question.

I used to really like this approach too. I'd think of things that you'd only know if you'd actually done the kind of work I'd done. I had some favorite interview questions: "what are some of the functions you carry around with you from project to project in your 'libyou.a' library", and "where would you start looking if you had to debug a program that segfaulted in malloc"? I think the logic I was using is the same as the logic you're using here.

Ultimately, I think it's a bad approach. Equivalently strong candidates look at their work through different lenses, and find different things memorable. Far more important to me, though, is that some people really suck at interviewing --- and, what motivates me more, having succumbed to this repeatedly as a hiring manager, and being myself, I think, an example of the phenomenon --- some people just interview way better than they should.


The issues with this question fall into two main categories. The first is that it's a rude question. The basic premise seems to be, "if you are an experienced ops person you have to know something about ps". Which may be true, but by asking it of an experienced operator you are hinting that you don't believe their resume. There is a much more straightforward way to see if they are lying on their resume: call their references and prior employers. For really good operations folks, the answer to this question might be "I haven't needed ps in so long I've forgotten how it works. I've automated away the process-running abstraction and have a library of python/bash/salt/chef/esoteric top commands/etc that I find more useful than remembering ps flags". This question implies their experience in that regard is also suspect.

The other problem with this question is that, by its phrasing, it implies that there is a right answer. If I can name more ps flags than another person, I'm a better candidate. Which, when put that way, I think points out its faults. Maybe you don't intend for that to be true, but you'd have to fight your own cognitive biases pretty hard to not at least bias towards the person who can name 17 flags off the top of their head, or the one who teaches you a new flag combo you didn't know. Even though those things are likely not very good predictors of a good candidate.

You've said what you really want to get to, is can they admit when they don't know something. If that is important to your hiring process (and I'd encourage you to validate that with data), ask them a question no one could know the answer to. Ask the same question to every candidate, and grade it purely pass fail. Did they say "I don't know" or not.


  >> admit when they don't know something... encourage you to validate that with data
Correlating this with employee performance data might not be the only way to validate it.

My experience is that people that won't admit they don't know something are not necessarily bad employees. Often they are capable employees, although this is a trait I rarely see in the best employees. However, I find they are toxic to a good work environment, since they won't listen to people who do know something.

They do seem to get promoted to management at a much higher rate :)


> They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things.

So the process is, in fact, designed to make people feel uncomfortable, for a little bit (or at least you're definitely aware that this is a frequently occurring side effect).

Does that not suggest to you that there may be a negative tradeoff at play here? (To wit: yes, you do manage to efficiently extract a few bits of information from them... but at the cost of having them feel like they're being, well... interrogated. And already "losing points" by that point in the conversation with you.)


I think part of the point may be to select for people who don't consider "being interrogated" to be a contest. There are plenty of folks who are just like "Here's what I know, here's what I don't know, you decide whether you would want me on your team. If you don't, that's fine, I'll find another team to join." Then there are other folks who, when they're asked a question and don't know the answer, feel like their self-worth is under attack. The latter group can be really hard to work with, and can do a lot of damage to a team.

I had a PM friend who would try to push the bounds of a candidate's knowledge until they gave up and said "I don't know". The actual knowledge wasn't the point of the question: it was that they could say "I don't know", because PMs who can't tend to make life miserable for the engineers & other teammates who work with them.


Yes, exactly. I think to be a good developer, you have to be willing to keep learning. And to be a good developer in a team context, you have to be willing to learn from your teammates, ask for help, and share knowledge in a way that's supportive, not prideful.

And I agree with your PM friend; I think willingness to admit ignorance is even more valuable there.


If somebody goes into an interview expecting never to be asked a question they might struggle with, I'm not sure they're somebody I want to hire.

But no, I don't think they walk away feeling interrogated. My style is pretty conversational, and when people don't have all the answers, I definitely work to make them feel comfortable with that. E.g., for this question I'd close with something like, "Of course nobody knows all the arguments to ls. I sure don't. So you did fine."


This is a stupid trivia question. I use ps -ef and leave it at that. In my entire career I've rarely needed anything else and if I did I would look it up but never enough to remember. To base an interview question off of that speaks volumes about you.


Well, yes and no. If somebody just scored it as written, giving points for each argument known, it would be a dumb trivia question. But as I said, that's not what I do.

Try it next time you're mingling with ops people, say in a bar at a conference. Don't look at what they answer. Look at how they answer. Some people are comfortable not knowing. Some people are excited to discover the ones they forgot they knew. Some people get curious and eager to fill the hole they've just noticed in their knowledge. And some people get defensive and peevish.


Some people focus their curiosity on more important things than arguments for a command line utility.


I agree, and it's interesting that you know -e, while I've memorized -axuf; I guess it's more portable?

Again, this is one of those, "Here's what I've got cached, and here's where I'd look up what I don't know offhand" questions.


No, "-axuf" is a weird mixture of BSD and POSIX syntax.

From the ps manpage:

Note that "ps -aux" is distinct from "ps aux". The POSIX and UNIX standards require that "ps -aux" print all processes owned by a user named "x", as well as printing all processes that would be selected by the -a option. If the user named "x" does not exist, this ps may interpret the command as "ps aux" instead and print a warning. This behavior is intended to aid in transitioning old scripts and habits. It is fragile, subject to change, and thus should not be relied upon.


I'd probably be snarky and say "I don't know - but I can google it!"

/actually I'd probably hit up the man page first...


dangerous because they prefix things with sudo?


I imagine there's a correlation there. But for me it's the danger that instead of admitting when they don't know they'll just bluff, going off and building things that look good but aren't. If somebody knows their limits and can admit them, you have to supervise them much less closely than a bluffer.


He probably had at least five or ten in mind. I prefer other methods of extracting this information. Usually 'ps -efly' piped into tools like awk/etc to gather what I want, or on occasion, extracting directly from the /proc filesystem. He did not consider these to be proper alternatives.

Since the 'ps -o' flags are only useful in that one command, I just look them up in the (rare) situation where they seem the best alternative. Having been at this for decades, I try to reserve my limited memory capacity for factoids that will pay off, and being able to process columnar data generically in an instant seems more useful than knowing a special way to do it with that one command.
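For what it's worth, here's a minimal sketch of the /proc route (my own illustration, not anything anyone asked for in the thread; Linux-only, field layout per proc(5)) -- listing the top memory consumers without a single ps flag:

    #!/usr/bin/env python3
    # List the ten biggest memory consumers straight from /proc.
    import os

    def rss_kb(pid):
        # The second field of /proc/<pid>/statm is resident set size, in pages.
        with open(f"/proc/{pid}/statm") as f:
            resident_pages = int(f.read().split()[1])
        return resident_pages * os.sysconf("SC_PAGE_SIZE") // 1024

    def cmdline(pid):
        # argv is NUL-separated; kernel threads have an empty cmdline.
        with open(f"/proc/{pid}/cmdline", "rb") as f:
            return f.read().replace(b"\0", b" ").decode() or f"[pid {pid}]"

    procs = []
    for entry in os.listdir("/proc"):
        if entry.isdigit():
            try:
                procs.append((rss_kb(int(entry)), cmdline(int(entry))))
            except (FileNotFoundError, ProcessLookupError):
                pass  # the process exited while we were iterating

    for kb, cmd in sorted(procs, reverse=True)[:10]:
        print(f"{kb:>10} kB  {cmd[:60]}")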


If the stuff online on Quora etc. is to be believed, the hiring committee will disregard such trivia interviews (and possibly send a note to the interviewer asking him/her to correct their technique).

Not sure how well this works in practice though.


Last time I was job-hunting I experimented with Google to see if their interviews were as bad as everyone said.

They were. I was asked a ton of this kind of Linux/Unix trivia in one session. As well as just random stuff. For example:

https://twitter.com/ubernostrum/status/659182356171874304

https://twitter.com/ubernostrum/status/659182564309995520

https://twitter.com/ubernostrum/status/659182708996706308


The question there was something like "roughly how big is 2^24?". I actually think that's a reasonable enough question, but anyway there's a nice story about this same question in another context. The mathematician Solomon Golomb was (as an undergraduate, I think) taking some sort of biology class, and the lecturer described some process to do with cell division and said "... so the number of ways to do that is 2^24, and we all know what that is, don't we?" (meaning, of course, "and no one knows what that is, but it's OK because I'm about to tell you"). But Golomb happened to have been memorizing numbers of the form n^n, and 2^24 = 8^8, so he immediately called out "Yes, it's 16777216".

As a result of having read this story, I can now always instantly remember what 2^24 is too :-).


Another way to answer this kind of question is to know 2^24 = 2^10 * 2^10 * 2^4 and that 2^10 is 1024 which is roughly 1000.

So 1000 * 1000 * 16 = 16 million would be an easy estimate to make.

If you are designing a system and want to have an idea of how much space or memory an approach will take, being good at this kind of math can be really helpful.
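A trivial sanity check of that estimate (my own sketch):

    # 2^10 = 1024 ~ 10^3, so 2^24 = 2^10 * 2^10 * 2^4 ~ 1000 * 1000 * 16.
    exact = 2 ** 24
    estimate = 1000 * 1000 * 16
    print(exact, estimate)                      # 16777216 16000000
    print(f"{(exact - estimate) / exact:.1%}")  # the estimate is only ~4.6% low

Good enough for capacity planning, and the only thing you ever have to remember is 2^10 ~ 1000.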


The problem is that there are some people who will need to go through that process in their head to produce an estimate of 16 million - and then there are people who have had cause at some point in their life to know of the existence of the concept of '24 bit color' and that the number of colors expressible in 24 bits (8 bits each of R, G and B) is exactly 16,777,216.


Then they should probably ask people to estimate a thing that's even remotely relevant to the job.

(I mean, I get it, this kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.)


> This kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.

To be fair to Google (in regard to this one filter question only), they actually are up front about expecting a significantly higher baseline of general mathematical awareness (even for non-mathy roles) than most other shops. At any rate, this kind of estimation comes up in capacity planning (both in architecture planning, and for algorithms on a single box) all the time. Which after all is what Google does, at nearly every level of their stack, (nearly) all the time.


It "comes up all the time"... and these ballpark guesses are probably(!) completely useless for anything remotely practical. At the scale Google's operating you want a Statistician, not a Guesstimator.

So, no, still not useful questions.


What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

My opinion is that big-O often ends up being used as a premature optimization effort hindering "just get the right answer first". Maybe that brute force method will work in a reasonable clock-time cost, despite having egregious algorithmic cost. You won't know if you're busy overengineering and optimizing before you even have a working solution.


That's a big red flag, at least in a Google setting.

The reason is because when your dataset is in the petabytes, any algorithm bigger than O(N log N) is not going to terminate. You're not going to get any answer at all; your MapReduce is going to sit there burning CPU time for a day, and then you'll kill it, and you won't have any idea what went wrong. (This is learned from experience, if you couldn't tell.)
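To put rough numbers on that (my assumptions, not real Google figures: 10^15 records, ~10^9 simple operations per second per core, 10^4 cores):

    import math

    n = 10 ** 15                      # assumed record count
    ops_per_sec = 10 ** 9 * 10 ** 4   # assumed aggregate throughput

    for name, ops in [("N log N", n * math.log2(n)), ("N^2", n ** 2)]:
        secs = ops / ops_per_sec
        print(f"{name:>7}: ~{secs:.0e} s (~{secs / 86400:.0e} days)")

The N log N pass finishes in under two hours; the N^2 one takes on the order of 10^17 seconds -- billions of years.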

In a startup context, I'd certainly agree with you - that's exactly the approach I'm taking with my startup, where I'm doing a lot of the debugging and code-wrangling on my local laptop after verifying that the data I want exists and calculating its size with a couple cloud-computing passes. It can sometimes be useful to prototype & iterate with a small subset of the data until you've got the algorithms down.

But many of the algorithms Google cares about don't give any useful results on small data sets. I remember running a few collaborative-filtering algorithms and finding that they gave zero answers because within my small sample of 0.01% of the total data, nobody liked anything else that anybody else did.


In my experience as a Google engineer, this is mostly false. None of the work done by me or by the people I know well at Google is on petabyte datasets; most people just aren't on the hot path of indexing/websearch/youtube. In fact, almost none of the stuff I was asked about in my interview has been relevant to my work. I have thought about complex algorithms two or three times in the past year.


Not only that, but (in my experience as a Google engineer) factors relating to the location of data (distribution, disk vs. RAM, etc.) often greatly outweigh algorithmic performance. Constant factors often dominate variable factors.


Exactamundo! Cache/memory/disk locality dominates almost all other factors[1] unless you're doing something very special. (Speaking as a non-Google person who also likes to think that he knows what he's talking about.)

[1] Well, it's really the amazing amount of throughput that modern architectures can achieve. Latency hasn't improved quite as much given the inescapable limitation of the speed of light.


What department? I was in Search and used that knowledge all the time, and also did rotations on GWS, Google+, and GFiber. Certainly I used less of that knowledge for GWS (where everything you need is in-process) and for GFiber (where I was working on marketing & customer signup - the job req for this was actually different, and both interviews & job duties were much more like a traditional SWE job in other companies). But even Google+ needed a lot of algorithmic complexity knowledge; G+'s scaling needs are similar to Twitter's, and we know how that went.

Perhaps it has changed as well; when I left (2014) they had just started encouraging engineers to focus on one particular task, with the architecture already defined, while when I started (2009) there were still a number of problems of the form "here's a feature we want to add; here's the data we have available; how can we build it?"


I'm on Ads and Commerce front-end, and I also know a number of people on Android. These of course are mostly constrained by users' devices, so they don't have the scale you're talking about. But I also think the trend you're talking about of smaller roles has been continuing and has a lot to do with it as well.


Everyone loves to put big-O up on a pedestal; however, many times the constant costs can outweigh the rest of the algorithm. Cache access times in particular can be brutal for non-linear access patterns.


> What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

What if you're having a conversation with someone about local politics and you're convinced the local zoning rules don't encourage growth in the way they think they do. Do you force the subject of conversation so you can make sure they know just how wrong they are? Not if you want them to walk away having a good impression of you.

Instead, if the talk turns to zoning, you put out feelers to see if they'll want to talk about the larger issue, and only engage in the conversation if it seems like the time for it.

The analogy isn't perfect, but if the interviewer wants to talk about the larger issue, you're probably well matched and will have a great interview. If not, and you know enough about runtime complexity to discuss the tradeoffs, then you know enough to just answer the textbook big-O question and move on.


There are situations in which you want to just get a right answer and ship, like at an early-stage startup.

Then there are situations where the code you write must play nicely within a massive, already-complex system, wherein it will work with potentially huge inputs. Like at Google.

So, saying in a Google interview that you don't think Big Oh is super important, and that you prefer shipping whatever correct solution you come up with first and then worrying about efficiency if it ends up being slow, would likely not get you very far.


> What about when you know what the concept of big O is, know why it's often touted as important, and disagree that it's as important as it's touted?

Let's say you created your own company and you're interviewing engineers.

Would you be comfortable hiring someone who doesn't know the difference between O(n^2) and O(2^n)?

Or someone who doesn't even know what these concepts represent?


I've always called this kind of thing "trying to get the sizzle before the steak". You gotta have a steak first; get that, then work on cooking it - then work on cooking it to perfection if that is what is needed or wanted.

There's also the thing I picked up from playing around with graphics demo coding; get the algorithm working first, even if it takes 10 seconds per frame. Then look for the slow parts, concentrating first on the inner loops. Whatever you do, don't try to prematurely optimize the code as you write it, as tempting as it may be. Because almost certainly you'll make things worse than if you waited until after you have a working first pass.

This applies to way more than just fast graphics code, of course.


The questions about big O aren't about premature optimization or actual implementation. They're about understanding exactly what a particular program is doing and how it's doing it.

Any sort of technical solution given at a Google interview would generally have the following questions:

1) What big-O time does this algorithm run in? Why? What are its big-O space requirements?

2) If space were more/less expensive, or time more/less important, how would you change the solution and why?

Understanding those tradeoffs and being able to analyze code at that level is a big part of most software engineering jobs.
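To make question 2 concrete, here's a hypothetical example (mine, not an actual Google question): does any pair in a list sum to a target?

    def has_pair_quadratic(nums, target):
        # O(n^2) time, O(1) extra space: check every pair.
        for i in range(len(nums)):
            for j in range(i + 1, len(nums)):
                if nums[i] + nums[j] == target:
                    return True
        return False

    def has_pair_linear(nums, target):
        # O(n) time, O(n) extra space: trade memory for speed.
        seen = set()
        for x in nums:
            if target - x in seen:
                return True
            seen.add(x)
        return False

Explaining when you'd pick each -- tiny inputs on a memory-starved box versus huge inputs with RAM to spare -- is exactly the discussion those follow-ups are probing for.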


Of course. For example, many of the questions for technical interviews involve knowing which data structure to use in a given setting, and applying it to find a solution.

The interviewee is more than welcome to study data structures and interview questions about data structures. A poor student will just memorize questions and answers by rote without really understanding the data structure in question. The good student will actually learn and understand data structures in the context of the solution.

One would hope that interviewers are able to ask follow-up questions about the solution which distinguish between the two.

EDIT: During the best tech interview I have done, I had no idea how to solve the question asked (that is, I had not studied its solution, despite this being a relatively common question). I was able to ask articulate questions and apply my understanding of data structures as I went along. Memorizing solutions is a game of luck, and not recommended.


Bullsh*t. Seriously, these questions are all tricks. If you memorize the tricks you can win the game. Even the "really easy" questions like reverse a string involve a trick -- swapping from the ends and stopping in the middle. The interview game involves memorizing tricks and identifying which memorized trick applies to the question.
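For reference, the trick in question, sketched in Python (on a list, since Python strings are immutable):

    def reverse_in_place(chars):
        # Swap from both ends and stop in the middle.
        i, j = 0, len(chars) - 1
        while i < j:
            chars[i], chars[j] = chars[j], chars[i]
            i += 1
            j -= 1
        return chars

Know it and you pass; try to derive it cold under interview pressure and you might not.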


I feel like 90% of writing software is identifying which "trick" applies to the problem you're trying to solve and applying it correctly.


Yeah, but you can look them up.


The thing about looking up tricks is you can only look them up if you remember they even exist to look up.


This assumes the trick is easily-indexed-by-name and can be looked up - so you have to know it exists. A lot of the better tricks are more like 'frameworks' whether specific ("Four Russians", "prefix sum") or general ("dynamic programming", "branch-and-bound").

There's a limit to how far you can get just Googling stuff ("how do I reverse a string") for recipe-book solutions to things. I think practically everything in Computer Science could be Googled ("how does merge sort work", "what is an inclusive vs exclusive cache") at some level but this doesn't mean one shouldn't know a great deal of it. At least, if you want one of these jobs...
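To illustrate: "prefix sum" is exactly the kind of framework you can only look up if you already know it exists. The idea, sketched:

    def prefix_sums(a):
        # ps[i] holds sum(a[:i]), so any range sum becomes one subtraction.
        ps = [0]
        for x in a:
            ps.append(ps[-1] + x)
        return ps

    ps = prefix_sums([3, 1, 4, 1, 5])
    assert ps[4] - ps[1] == 1 + 4 + 1  # sum of a[1:4], answered in O(1)

If you've never heard the name, "how do I answer lots of range-sum queries quickly" is a hard thing to phrase into a search box.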


And yet most people are memorizing solutions to hundreds of different questions and getting great jobs because of it. I know this because I have several friends who do this.


Well, at least hope that some of that information sticks.


This is a great point. There are absolutely candidates who are "good interviewers."

I think one of the most helpful methods of separating a good interviewer from a good future employee is taking a structured, situational interview question, i.e. "How would you go about selling a new product to a customer?", and turning it into an actionable work sample that is tailored to your company. "We make this piece of software that does this. Spend the next 10 minutes writing a cold email to a prospect that outlines our offering." As the interviewer, grade the work sample on the structured criteria that matter.


This is the "calibrated interviewer" system. It strikes me as funnily similar to machine learning models, in that it's a black box that somehow statistically demonstrates its effectiveness at a business problem, and you just have to hope there isn't some glaring bias or fatal flaw in the model that only occurs sporadically.


Only if you use a small, static body of questions. But there's nothing wrong with, say, each interviewer coming up with a couple of questions and rotating them in every once in a while.

You do want to try out the question and criteria before depending on it too much, but it's easy enough to try it out on a colleague.


Among the many things wrong with this approach is the fact that you aren't generating a body of questions and responses that you can evaluate against your choices and the resulting performance of candidates you hire, making it impossible to effectively iterate.


Depends on how you think of the interview process. The theory at work here is that if you get a number of domain experts to give an independent evaluation of a candidate and then compare notes, you'll get a decent outcome. Requiring them to use a structured, repeated question with a formal scoring rubric is to subtract certain sorts of bias from each individual evaluation. When hiring somebody doesn't work out, you go back and work on the interviewers and their interviewing skills, not the questions themselves.

I take it you're working from a different theory, where you try to put the expertise in a machinery of questions and answers, relying less (or not at all) on the interviewers themselves. I think that's also a valid approach, one with different limitations.

Personally, as I mentioned elsewhere, I'm not a big fan of interview questions anymore, period. If I want to know if somebody can do the work, now I try to create situations where they just do the work. But if one is trapped in the dominant tech interview paradigm, I personally favor investing in interviewers more than in the questions themselves.


The same way it works for teachers that give tests. Come up with a bunch of questions and only ask a random subset of those questions to each person. Occasionally evaluate the questions and add/remove some.


There are a few questions that you can learn to answer, but it is probably very hard to reach the five-star answer if you do not understand what you learned in detail. An example question would be: what happens when you type www.domain.com into your browser and hit enter? You can answer this in many different ways and at many levels.


That question has also been asked so many times most candidates who have done a few minutes of googling can give you the "right" answer. See here: https://github.com/alex/what-happens-when/blob/master/README...

I'd probably start here to lighten up the mood due to how cliche the question has become:

https://github.com/alex/what-happens-when/issues/231


Candidates can do a few minutes of googling to find this, but will they put in the few months it takes to get to the bottom of every aspect of it?


Probably not if they have actual work to do rather than memorize easily google-able trivia questions.


A book conveniently published in the wake of the PR disaster that was their wage fixing scandal.

Nothing about Google's hiring process can be deemed reputable when they'd go so far as to illegally conspire with other tech giants to prevent potential employees from achieving the best possible outcomes for themselves. Maybe things have changed over there, but given the slap on the wrist they got, I'm skeptical.


> Laszlo Bock (former SVP of People at Google) did a great job summarizing decades of research around structured interviewing

Most importantly, most companies interviewing don't get to pick from the pool of applicants that Google gets to. And even more importantly, most companies don't even get to pick from the pool of applicants that a typical funded (or soon-to-be-funded) startup will get.

While I am sure there are things that can be learned from 'Work Rules' the question is how much of that applies to the vast majority of companies in America.


Google's hiring process is famously painful, and they lose a lot of great candidates that way. I wouldn't trust their insights to build a hiring pipeline.


Google is definitely not a benchmark for interviews. From what I experienced, they must have a huge false-negative rate. You can only afford this if you're Google (or at that level).


Yup, it was pretty common knowledge at Google that if you took all the engineers and ran them through the interview process again, it would have rejected most of them.


The view is: better to lose a good candidate than to hire a bad one.


Google's interview process is the single shining example of the worst interview process I've ever been through. So much so I actually took the interviewer to task for it.

It may be good at screening University graduates but it's pretty awful at anything else. It's also really obvious that the interviewers don't actually know how to interview people and are pretty much cargo-culting the same process that hired them.

Awful. It totally destroyed my perception of the company.

Now granted, this was... 7 years ago now (or more), but from colleagues who have interviewed there (or been hired!), it's not changed fundamentally that much.


Getting candidates to jump through crazy hoops is a sensible strategy for a big organization. It selects for compliance, which is important when your enterprise is too complex, uncertain or fast-moving to operate by consensus.


A fair point, though not really a defense as such ;)



Easy link to the relevant table from the paper: http://imgur.com/a/YRFTh


From my experience, I'd say Google's interviews need to ask harder questions and last longer.

Today they ask fairly easy questions (given a list of integers, find the subsequence that has the following property etc. - basically leetcode medium level) that you can solve fairly easily using basic CS 101 knowledge. However, it ends up (again, IME) being a race against time where you have to whiteboard-code a simple solution in approximately 20 minutes.

I'd much rather have them as questions that take real insight to solve, but have the interviews last 90 minutes or so.


I recently suggested to them not to dumb down their interviews, as they simply make talking to them not worth the time. I remember they used to have much tougher interviews in the past, which were super motivating; now it's standard hackerrank stuff, which is pretty uninteresting. OTOH, each company has its blind spots, so they won't be on top forever; maybe it's better that way, as those who were very capable but rejected could dethrone them at some later time.

To me, their approach lately is like forcing a top tennis player to play with kids (well, you would be surprised how many pros have issues slowing down their game), or a double-diamond-slope skier to ski on blue slopes only (increasing the risk of injury, as their timing/moves are optimized for ultimate performance instead of basics). I've heard of people who solved problems on interviews that nobody had solved before them, but who were rejected because they messed up some basic stuff, and the recruiter communicated back to them that it looked bad in the hiring committee.


But if you were interviewing for a tennis instructor, wouldn't that be a valid result? You probably shouldn't hire pros who aren't good at teaching when the job involves teaching.

Similarly, as a senior engineer at a large company, you need to be adaptable enough to work with people with less/different experience than you. Adjusting your message to the audience is an important skill.


It ensures consistency, but how do you know that whatever you're measuring for is actually correlated with job performance?


They don't. In fact they even did an experiment and admitted some people at random, irrespective of how well they did in the interviews. Those people were found to perform about as well as the legitimate interview "lottery winners" hired during the same time period.

A lot of people (anecdotally, the majority) at Google have the "impostor syndrome", and the news of the experiment did nothing whatsoever to quell the symptoms. Now they don't know if they are, in fact, not impostors, but they do know that on average they perform about as well. :-)


Could the performance of the lottery winners have been "environmental"? That is, they benefited from being surrounded by competent people (which was, in turn, guaranteed by those people having gone through the interview process) and "leveled up" due to that?

In other words, maybe as long as you let in a small number (but only a small number) of non-performers, you're fine (which is bound to happen anyway - I'm sure there is some noise in the interviews).


Yes. Getting hired by Google is only part of the deal. Actually _succeeding_ when you're already there is much more difficult. It's a high pressure environment with a lot of very smart overachievers. Because of this it's sort of a self-fulfilling prophecy, and people who don't measure up also don't feel welcome, as it were. Since performance reviews are largely derived from peer feedback, hiring mistakes tend to be self-correcting. Most of the time, though, I've seen great people leave just because they didn't like the pressure. The amount of pressure depends on the team. The higher the profile -- the more pressure (but also more rewards, greater career potential, etc). But the general bar for what's considered "good work" is pretty high, and more uniform than in any other large company I have ever worked at.

Then there's the issue that by the time you even get an on-site, you're already very much not a random candidate. Recruiters actually do look at your track record, etc. You can bullshit there, but I don't recommend it, since references will be spot checked, and they better line up.

Google interviews are largely a roll of the dice above certain level of basic engineering competence. I.e. if you don't know the basics, you will almost certainly not pass them. But if you're a more senior candidate, Google doesn't really know how to interview you, and their interview process turns into a random number generator biased heavily towards "no hire".


They are no longer among the top choices for top people. Alphabet might be, Google isn't. That's why they have been dumbing down their interviews over the past 8 years, repelling even more of the top people who want to change the world and not be just another cog in the machine.


They certainly still _are_ among the top choices for top people, but they're no longer the _best_ choice for most. I can't in good conscience advise anyone to join any 70K person company. "Cog in a machine" describes it pretty well. Ignore "self driving cars" and "internet balloons" and other BS: there's near zero chance you'll get to work on any of that, particularly if you don't already have a stellar track record at some company Google/Alphabet respects (of which there are very few).


Yes, but even when this is done well, there is still the question of what's measured vs what results in hiring good candidates and not hiring the others. Turning away a huge number of false negatives is also not optimal, especially in a situation where talent is scarce.


this just gives them resource liquidity for operations - the people that take things forward are mostly acquired

if you have to start using structured interview questions to expand a team - you've already lost


I have some major issues with their conclusions... and the title of the article (which is mostly nonsensical clickbait).

The real conclusion should be that "unstructured interviews provide a variable that decreases the average accuracy of predicting GPAs, when combined with (one) other predictive variable(s) (only previous GPA)."

This conclusion seems logical. When combined with an objective predictive measure of a person's ability to maintain a certain GPA (that person's historical ability to maintain a certain GPA), a subjective interview decreases predictive accuracy when predicting specifically a person's ability to maintain a certain GPA.

To then go on to conclude that interviews then provide little, to negative, value in predicting something enormously more subjective (and more complicated), like job performance, is absurd - and borderline bad science.

There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.


> There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.

Well, based on what? Is there any evidence for any of these hunches?


I didn't say it was based on hard data, that's why it's a hypothesis...

But, if you really believe that it's a huge leap to hypothesize that interviews would be a better measure of subjective social skills than metrics like GPA, I don't know what to tell you.


I mean let's go further back. I question the assumption that selecting for "culture fit" does much more than shut people out whose ethnic background or social class is too different.


I mean, that's an extremely narrow reading of "culture fit".

Believe it or not, there are non-racist and non-classist character traits that are also not represented in one's GPA.

But, I guess hiring for one's willingness to work with specific clients, or for one's happiness with the organization's structure (in, for example, a holacracy), wouldn't be valid things to hire for, in your view?


I've been a programmer for a bit over ten years. I've worked at scrappy little startups, midsized companies, now for a tech giant for a few years. The engineers I work with at the tech giant are consistently better engineers than my other coworkers have been, and I credit the very structured interview process. We're trained to ask specific questions, look for specific types of answers, and each interviewer is evaluating different criteria.

Also, it is quite often not the technical questions that end up making us decide not to hire someone. That's just one area. I know it's the part that sticks out, and candidates give a lot of weight to it in their memory of the interview, but you shouldn't just assume it was that your whiteboard code wasn't quite good enough. I actually don't think that's the most common thing we give a "no hire" for.


Which of the companies paid the best/has the best benefits?

If it's the tech giant, it's very possible they attract better candidates because they offer more.


In my experience, the big corps don't tend to offer anywhere near as much compensation, but they do offer better time off, etc.

The best software guys I've ever seen work at those big corps. The big corps are the ones that have the resources to work on the REALLY hard problems, and not write the same CRUD app over and over again. Those same companies also provide tons of educational benefits so that you become an expert in your field.

Things like fighter jet software are fantastically complex. You hear about the dumb mistakes that are made (International Date Line Issues), but you never hear about the thousands of hours of testing every change to code goes through and the insane calculations that are made every second in even standard level flight.

I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want? Even if you don't USE knowledge, you should at least be able to rederive things using your base knowledge. The interviews I've seen at aero companies are damn hard. Tons of grilling on whether you actually understand mechanics and fluid dynamics.


>I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want?

That's a really good question and I had to think about why I don't like interviews.

1. It's an interrogation and the stakes are extremely high. There are so many aspects of it you don't control. Some guy had a shitty morning, or just doesn't like you, dropped your resume on the way back to his office, etc.

2. It's completely phony and it starts with the first question, "So why do you want to work at generic company X?" "I need money," is not an acceptable answer. Now I have to be phony and tell them why their company is awesome (it's not), which makes me feel like a kiss-ass. I have to pretend to be excited about working at a boring-ass company. Just shoot me.

3. We have to do it, unless we have rich parents that died young like Batman.

4. The person or people interviewing you have no notion of what you've accomplished aside from skimming your resume. All the hard work I've done to produce miracles in the last 20 or so years means really nothing.

5. Companies only reward tenure at that company. It signals the start of something new but that's typically a bad thing when it comes to work. Less vacation, less credibility, less influence, etc.

6. You really don't know if the person or people you are interviewing are bozos or not. It doesn't matter, they cut the checks, they have the power.

7. As the article insinuates, they're fairly pointless.


they may also be able to hire more, and let people leave on their own (or let them go later), keeping the 'better' ones on board. Smaller companies can't make the same number of hires in the first place.


The tech giant, for sure.


I say the same thing on many of these threads. Most people rejected at BigCos aren't rejected for failing to pass the technical bar(s); those people get filtered out in the phone screens.

The by far most common reasons interviewers offer for low scores are: didn't listen/didn't acknowledge when they reached the end of their knowledge/didn't test when they got off the rails/just didn't seem like a cooperative coworker.

These are all filters that may not matter at ye olde startup; but in a stable, long-lived team, these are red flags that seem correlated, in my experience, with lowering the morale of a dozen other people. So, even if you're a rock star, you might get rejected.


This is a great comment. Totally agree.

> didn't acknowledge when they reached the end of their knowledge

Oh man, this one. How hard is "I don't know?" You're not supposed to have the entirety of computer science and software engineering knowledge jammed into your head. Let's talk about what you know and what you don't and figure out whether you're a good fit.

We hire lots of people without direct experience in what we do because they seem great to work with and we think they'll learn quickly. Just please don't try to bullshit; it doesn't usually work.


> I actually don't think that's the most common thing we give a "no hire" for.

Can you cite any of the more common things you give no hires for?


Some common ones:

- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.

- Shit talking old coworkers, general attitude that you're great and nothing is your fault.

- Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.

- Not being able to give context about the "why" of what you worked on, what other options you considered, etc.

- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

Obviously most weaknesses can be overlooked if there are serious strengths.

I've done a lot of interviewing, happy to answer questions.


>- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.

>- Shit talking old coworkers, general attitude that you're great and nothing is your fault.

These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.

> - Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.

Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do? Especially when the interviewer cannot possibly check up on that. I'm absolutely not convinced about this one, although I do make a point to talk about my contributions when I interview (which frankly doesn't happen much at all, since I tend to stay in the same job for long periods if possible and try to make things better there).

>- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

This sounds like bullshit too, to be honest. Most people don't get to decide the general point of what they do unless they are the CTO. However I do realise that this is absolutely the kind of stuff interviewers look at and you need to prepare for it.

>- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

Fair.

>Obviously most weaknesses can be overlooked if there are serious strengths.

Which you really won't know until months later.

>I've done a lot of interviewing, happy to answer questions.

I know people who'd do badly on several of these accounts and are completely brilliant at getting the job done.

The problem is that you cannot possibly know how well the ones you rejected would have done.

If you can have a probation period of 3 months or so, that should be the main yardstick. Of course someone's attitude can become shittier over time but no interviewing process can reliably catch that.


> These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.

Nope, I said the engineers I work with now are consistently better. "My current team is very good" is MUCH different from "my last team was very bad." (And I'm not in a job interview.)

> Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do?

No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project. A lot of people get stuck on, "We made a project that sold widgets." I want something more along the lines of, "In our widget selling application, I worked on backend services for payment processing and fraud detection." Some people get really stuck on generalities and won't dive into specifics. It just makes it impossible to evaluate your contribution. Maybe you didn't do anything. I have no idea, you're just not generating useful information for me.

> Most people don't get to decide the general point of what they do unless they are the CTO.

I totally disagree. So many tiny choices that engineers make from day to day have a tangible impact on customers. Even if you're a junior engineer, you have an impact on the latency of service calls you're responsible for, as an example. Do you pick the lightweight framework that loads quickly, or do you need the features of the bigger one? It's that kind of tradeoff that people should consider, and "I saw something shiny" is not a good answer.

> The problem is that you cannot possibly know how well the ones you rejected would have done.

Yep, for sure. I'm positive that I've rejected good candidates. That's the side you want to err on though, especially if you have lots of good candidates.


> No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project... "In our widget selling application, I worked on backend services for payment processing and fraud detection."

If that line is as detailed as you're looking for, then that's reasonable. Although I've seen people get annoyed that I don't remember the specifics of exactly what I implemented and why and a list of the various decisions made on projects that are 2+ years in the past.

At my current company I've worked on 300 tasks over the course of 2 years, across 30+ projects, ranging from simple bug fixes to implementing large swaths of new software for Fortune 100 clients. There are some similar items in there, but most of them were different.

I don't have an amazing memory, so a lot of the old stuff I worked on becomes very vague. Hell, it can take me time to remember what I need to do to work on something I haven't done in the past six months at my current company.

I know those old projects in broad strokes but if you want me to talk about a specific project in detail that happened years ago, then I will struggle to dig up those memories, even after reading a document I made that refreshes things a bit before interviews.


"- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it."

I'd venture that not being able to answer this question and others like it off the top of their head is less an indication that they are unaware of / don't learn from their mistakes than it is an indication of how well they practice for bullshit interview questions and can give you a bullshit answer to your bullshit question that gets past your personal bullshit-o-meter well enough for you not to be offended.

But since you're also probably trying to filter out cynical assholes like me, ¯\_(ツ)_/¯


> not being able to answer this question and others like it off the top of their head...

Well it certainly shows a lack of preparation and experience.

Here's how I approach those questions. I think about each project I've launched, and whether they're relevant. It's an easy thought exercise, and then you have an answer to that question forever. And seriously, a time you've made a mistake should be an easy one, and it's one everyone should have ready. Over your whole career you can't think of a time you've done something that you'd do differently if you had the chance? I bet you can, and if not, it shows a serious lack of introspection.

We want people trying to make themselves better. Thinking about your career and studying for interviews is part of that.

> filter out cynical assholes

Dude, at least 50% of the tech industry is people who would charitably be described as cynical assholes. Part of being professional is being able to turn it off. If you can't give a professional answer in an interview how can the interviewer expect you to give a professional answer in a contentious meeting?


Can confirm. My work life has been smooth and fairly boring, and I have piss-poor autobiographical memory anyway, so I pretty much have to make those up to have anything worth telling. Maybe start from some half-remembered kernel of truth, but it's all BS from there, because it's all I've got. Have to rehearse them so I'm not making shit up on the fly--too much risk of tripping up, or making it too flashy, or accidentally saying something in a way that makes it seem like I'm shifting blame, or taking credit I shouldn't, or anything like that.

Especially the "tell us about a time you had a conflict with someone" question. Ugh. I guess I just need to seek out some assholes to work with because I've got a big fat nothing to talk about on that one. I ought to start writing down the crap I make up for it so I don't have to do it again every time I interview.


They don't have to be big monumental conflicts or fuckups or whatever. And yeah, you should totally have a pre-canned answer. It can be something minor; "I once spent a lot of time implementing feature X, and then I realized I could have just done feature Y in a quarter of the time." Or, "Bob wanted to build X, and I thought Y would be better, so we disagreed about it, and ultimately Z." (And make sure the story doesn't end in, "and I was super wrong.")

I know it's phony, but job interviews are sales pitches. Preparation helps.


> - Shit talking old coworkers, general attitude that you're great and nothing is your fault.

There's a gulf between "nothing is my fault" and "some of my old coworkers were shit and they deserve the shit-talk they get from me".

I don't think it's fair to judge people universally on the basis of whether they shit-talk or not. Being 100% kind and never saying anything bad about people in your past is vastly overrated.

> - Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.

You're polarizing this way too much. If I go after a "cool technology", there are several things MANY interviewers don't take into consideration:

(1) It's only their own interpretation that I chose a "cool" technology over the customer's needs. Cognitive bias and all. Not to mention most people really have no qualifications to even claim this.

(2) In the senior programmer area (where I believe I belong), oftentimes you have to make calls nobody can inspect for a while; you have to trust your experience and intelligence and make a decision quickly. If Ruby on Rails consistently fails you on a single website project of yours, it's very okay to start switching out its slowest parts with Elixir's Phoenix <-- that's a recent example from my workplace. I chose both "cool tech" and "customer needs" together.

(3) Many times there's no immediate benefit to your work. It's easy for a manager to blatantly dismiss a hard mid-term decision implemented by a tech lead as "he's after the cool tech only because he's bored", when only I know that the results from that "cool tech" will start showing a month from now (and it doesn't help at all when I tell them that and they don't believe me).

> - Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯

I admit I've been in the wrong on this one, but I want to give you another perspective. I wanted to make a certain point -- 99% of the time it's "why did I do this" or "why did I fail doing this" or "how did I succeed by doing this" -- and sometimes I digress because there are sub-points and I overdo my attempts at being crystal clear. Granted, that's up to me to perfect as a communication skill, but I've been extremely annoyed by interviewers who can't seem to trace my initial line of thought and try to coax me back in it. Instead they give you a smug expression along the lines of "this guy talks too much" and form a negative impression right there and then. I can see it in their eyes, and quite frankly I lose respect for them immediately as well -- they could handle such a situation much better. Interviews are a two-way process, and both sides screw up in every single interview.

So overall, I believe you're over-generalizing on a few points.


> "some of my old coworkers were shit and they deserve the shit-talk they get from me"

This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.

> It's only their own interpretation that I chose a "cool" technology over the customer's needs.

Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool." I want to hear, "because it performed faster" or "because it had X feature I wanted" or whatever. Tell me WHY it's cool. I just want to hear how you'd explain a choice to me as a coworker, I'm not looking to actually second guess old choices. I'd even take, "Well, we had a really short deadline, and I was familiar with the technology. I decided hitting the date was more important than evaluating all the choices." I want to hear about your thought process; I don't care what choice you made. "Because I thought the industry was moving that direction and wanted to be future-proof and improve our ability to hire" is a good answer too. There are a ton of good answers, but "because it looked cool" is not one of them.

> interviewers who can't seem to trace my initial line of thought and try to coax me back in it

For sure. I try really hard not to talk when a candidate is talking because I know how hard it is to get thrown off in such a stressful situation. Sometimes the candidate has just obviously misunderstood my question though. This is much more important on technical questions -- I believe a good interviewer will treat the candidate like they are a coworker, and basically solve the whiteboard problem in a fairly collaborative fashion. Obviously the person doing the interview will take the lead, but I'm SUPER happy to answer clarifying questions, and if you start to struggle the right thing to do is to ask some small questions and see if the answers help. Just sitting there banging your head against the whiteboard isn't useful.

> So overall, I believe you're over-generalizing on a few points.

For sure, I'm just trying to give some really brief summaries of bad behaviors I've seen. I've also given a "hire" to people who did one of each of the above; like I said, none of it is disqualifying.


Now that you clarified, I actually think we're very much on the same page. ^_^

> This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.

Of course! Don't get me wrong. I am not making it a goal to shit-talk former coworkers in an interview. I try to avoid it, but what am I to do when I have to share that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job? It wasn't my fault at all; I actually sank several weekends trying to upgrade it, but was drowned in mysterious and horrifying errors (an ancient state machine gem with very rigid dependencies and 80+ monkey patches made any upgrade practically impossible) before giving up. Lie that I suck? No, I don't suck, they sucked, and I ain't taking the blame for them. That being said, I have better uses for my leisure time, and my work time as well -- keeping in mind I am not expected to refactor at all. So I sank 35-40 hours of leisure time into that problem and moved on.

> Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool."

You'll hear a very hearty (and non-sarcastic) "I am sorry for that awful experience, man" from me. I never do that. I always explain myself. I went out of my way to rehearse in my spare time, just asking myself the question "why did you pick tech X?" -- I am a firm believer in Einstein's saying "If you can't explain it to a 5-year-old then you don't understand it"!

> Sometimes the candidate has just obviously misunderstood my question though.

Perfectly fair. I never objected to that, and I was also quick to apologize for the misunderstanding, because in an interview every minute counts.

Thank you for being constructive.


> what am I to do when I have to share that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job?

The reason I personally care about the skill of talking about bad coworkers tactfully is that I think it correlates with being able to navigate tricky workplace fuckups without making a big political mess that I have to clean up. The way I would phrase the above situation is:

"I inherited a Rails 4.0 app, and I wanted to upgrade it to Rails 4.1 so that we could take advantage of feature X. This wasn't on our roadmap, so I spent a lot of my own time looking into the feasibility and working on it in my personal time. However, it eventually became clear that too much of the legacy code would have to be totally redone, and there wasn't enough of a business upside."

> I am a firm believer in Einstein's saying "If you can't explain it to a 5-year-old then you don't understand it"

Me too! Love this.


> The way I would phrase the above situation is:

You're a better diplomat than me. I could've phrased it almost the same, but with me the tone and the wording depend a LOT on my current mood -- I have to work on this, that's for damn sure.

It does, however, still leave the question "but whose fault is that legacy code?" open, don't you think? With my wording in the above comment I would've aimed at being crystal clear -- and thus somewhat rude in the process, admittedly.

> without making a big political mess that I have to clean up

Could you please clarify that point? I am curious.


> It does, however, still leave the question "but whose fault is that legacy code?"

Heh, I can do this all day. "We accumulated a lot of tech debt on a previous project because of very short deadlines." Or, "the previous owner learned the technology while building the system -- I'm sure he would have done it differently today." What I'm looking for is that you understand what leads to this sort of situation, and it isn't usually, "the last guy was an idiot." I mean sometimes, sure, but even then it's usually closer to, "the last guy was hired into a role he wasn't ready for and needed a better mentor, or better training."

> Could you please clarify that point? I am curious.

It's really helpful to be able to send an engineer to collaborate with another team without having to worry about whether they'll end up butting heads with someone and making a mess. Tact is important. Sometimes other people are under constraints that you're not aware of, and it's useful to have empathy. Maybe they have super tight deadlines, or maybe they're having to use a technology they've never used before. It's easy to say, "this person is a moron and their code is bad," but if they get the impression that you think that, it can really harm working relationships. At that point, I end up having to step in and smooth relationships over, and it's not a great use of time.

Talking about bad ex-coworkers with tact shows me that you 1) have empathy and 2) will understand how to navigate similar situations if hired.


> Heh, I can do this all day.

I can see that! :D Thanks, you've been very helpful. Believe it or not, I am learning from this interaction.

> Tact is important.

I don't disagree and I'm with you here. But herein lies the dilemma -- I've been tactful and diplomatic way too many times for my taste. I've had my fair share of politics. I am not horrible at it; I simply started lacking any patience for it, and thus my mood started leaning heavily towards being blunt and somewhat inconsiderate. I am not outright offensive, but I am no longer tactful and diplomatic (sigh).

I have started standing up and protesting heavily in the face of bad politics, however. I have less patience now because I expect everybody else to do the same -- and they don't.

That's why I get easily irritated. I absolutely agree with ALL of your points -- the people might have had horrible customers (or team leaders), they might have drained the budget close to the end of the project and had to cut dozens of corners, they might have been gaining experience along the way... and a plethora of other possibilities. I agree.

I do have empathy and tact. But I have lost almost all patience in the last few years.

I appreciate that you might not believe such a polarized message in an interview. But I got slightly carried away sharing. :)


Happy to help!

At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)

I also totally understand getting frustrated. I've been a total dick to my coworkers in the past, for lots of different reasons. I was burnt out, I was young, etc. I've mellowed out a lot over the years. A good chunk of that is also just being on a good team; it's really hard to be the best version of yourself if you're not well supported.


> I've mellowed out a lot over the years.

And here you have myself at 37 being much more impatient compared to 27. :D Truth be told, I am also much more mellow in general, just not in work lately.

> At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)

Double thanks, this is extremely valuable advice!

I appreciate you taking the time.


Thank you for providing the helpful specifics.


Thanks...could you please cite specific examples?


Added some above.


Take a second to read about the experiments this author conducted. They included:

Dummy candidates were mixed into the interview flow and gave randomized answers to questions (interviews were structured to somewhat blind this process), and interviewers lost no confidence from those interviews.

Interviewers, when told about the ruse, were then asked to rank no interview, a randomized interview, and an honest interview. They chose the ranking (1) honest, (2) randomized, (3) no interview. Think about that: they'd prefer to make a prediction from a randomized interview than from no interview at all.

Of course, the correct ranking is probably (1) no interview, (2) a tie between randomized and honest. At least the randomized interview is honest about its nature.


The interview is also selecting for a single thing - GPA. You can be an utter arsehole and have a high GPA. You can have personal hygiene that stinks out any room you're in and have a high GPA. Basically, you can be completely impossible to work with and have a high GPA. The research they've done is suspect, because they weren't interviewing for an ongoing role, but for a single KPI.

Similarly, the people doing the prediction were other students rather than teachers who know better what to look for. The research would better fit HR people hiring for roles they know nothing about than experienced team members hiring for roles similar to what they do.

The research results are massively over-applied, and are in no way whatsoever sufficient to justify the term "utter uselessness of X". Unsurprising, given it is research by a 'professor of marketing'.


Like everyone else, I agree a face to face interview is the best way to verify a candidate doesn't smell bad.

I assume that interview could be pretty short.


Well, I suppose you could fish out the most trivial part of the comment and refute that. Why not set up a battery of short unit tests for the candidates? "Follow me, Jones, and we'll see how you handle the 'coworker with chewing gum' test next..."

The study was done on students, who almost universally have zero experience selecting people to work under them in an industry setting (or at all). Drawing conclusions from this particularly inexperienced subject pool and then extrapolating out is bogus, particularly given the extremely certain language of the article. The subject pool is at an age (18-22) where people are still figuring out what to make of themselves and others; they have extremely little adult experience of the workplace and judgment of character - indeed, at this age, people are notorious for making bad decisions in their personal lives.

When you look at the actual paper they link, out the window goes the declarative language, and instead the article is unusually full of weasel-words ('can', 'may'). There's a major difference between "utterly useless" and the actual conclusion of their paper: "interviewers probably over-value unstructured interviews".


In the experiment the authors ran, interviewers felt more confident after interviewing a "candidate" who had secretly been instructed to randomize their answers. Later, they informed the researchers that so great was the value of the information they extracted from interviews that they'd prefer a randomized interview to no interview at all. I think the authors made their point quite well.


Meeting the "candidate" lets you apply some judgement about them that you can't use from just reading a CV (or just knowing their past GPA). How they look, how they dress, their accent, their body language, etc.

People think they can get valuable information from this, so they want to meet the candidates, even if they are told the answers are random.

You need to prove (and then convince people) that all this extra information (the impression a person makes when you meet them) doesn't improve your prediction of their future GPA over a prediction based only on their past GPA.


It's the whole process that's useless.

Remember a few years ago when this forum was drowning in discussions of how to find and hire "10x" people? 98% of employees were useless in the face of the 10xer.

The reality is, most of the time screening for general aptitude, self-motivation and appropriate education is good enough.

I've probably built a dozen teams where 75% of the people were random people who were there before or freed up from some other project. They all work out. IMO, you're better off hiring for smart-and-gets-things-done and purging the people who don't work out.


This, I think, is a very good point. The scariest part of Laszlo Bock's book is that he is very against the idea that people can improve or be trained to be better. To him there is this idea of predestination: you're either good and always have been, or you're not good, won't get better, and there is nothing Google can do.

I feel like as a company you could exploit a lot of value by just hiring people and training them. Training, by the way, doesn't have to mean at work. Think how much you learned in college lectures vs. reading the actual textbook. It can mean going home and studying, and the incentive would be that you're gaining skills that make you more employable, so this shouldn't count towards work hours (if it were, say, a very proprietary old programming language, I could see this argument falling apart).

I'm not saying this is the best way, but I'm shocked that everyone is "trying to find the best". What kind of world will that leave us in if all companies want to hire the same 3% of people? Businesses move slower, talent is lost, and inefficiencies accrue.


Wow, that's really terrible to hear. And of course, conveniently, it lets the employer off the hook for any notion of team or project fit with the employee. If they weren't performing well, then obviously they never will in any capacity in any context. Awesome.


A lot of the pure evil HR bullshit is the dance around discrimination suits and to maintain the use of alma mater as a discriminator. It's literally in their interest to assume that you're a widget fit for a specific purpose.

If you provide meaningful training, you need to be fair in the application of said training. If you admit that you can train people with common existing skills to do most technology jobs, it's going to be hard to justify your cozy recruiting funnel with a small number of universities, picked on the basis of where a few bigshots in the company went to school.


One wonders what exactly happened to this industry, why instead of training, companies are offloading the work to universities, MOOCs, and boot camps.


All about cost and trying to offload it somewhere else. Sadly it's very predictable, most businesses work that way.

Then again, theoretically this gives a huge edge to the proactive learners.


The questions are calibrated to find junior people straight out of school. They can enter easily and get training afterwards. The training is not offloaded.

However, there is a challenge with experienced people. They simply can't pass the screening after a while and can't get hired. And who's gonna teach the juniors if you don't have seniors?

Personally, I moved to finance. I find that experience, domain knowledge and maturity is valued more there :D


The problem is that GPA itself isn't necessarily a valid data point. It's less fallible than "gut instinct", as the author here seems eager to claim, but personality type can be more important than the ability to memorize facts.

I'd rather hire a programmer who knew less and could get along with others than a master dev who's a total a-hole.


It should probably have been titled "The Utter Uselessness of Unstructured Job Interviews", because that's the kind of interview the author criticizes.

In my personal experience, structured interviews can be very helpful in determining a candidate's abilities.


Are you sure you're using the same definitions as the author? In practice, across a pretty decent sample of large tech companies, I've never seen a truly structured interview outside of Matasano, where I was almost murdered by my employees for instituting them. I think we'd hear far more complaints about them if they were common.

It's possible that what you consider to be a structured interview is in fact what this author (and I) would call unstructured. Specifically: if the interviewer has any discretion about questions at all, the interview is probably fundamentally unstructured.

In a structured interview, the interviewer is less an interrogator than a proctor or a referee. Every candidate gets identical questions. The questions themselves are structured to generate answers that facilitate apples-to-apples comparisons between candidates: they generate lists of facts. The most common interview questions, those of the form "how would you solve this problem", are themselves not well suited to these kinds of interviews. It takes a lot of work to distill a structured interview out of those kinds of problems.


> Lists of facts

This has me mulling whether that might be a better approach to administering law-school exams than the traditional analyze-this-hypothetical-fact-situation approach. (I'm a part-time law professor.)

More generally: I wonder to what extent school examinations can draw useful lessons from job interviews.


How did you feel about your experience using structured interviews for hiring?


Truly structured interviews are better than even the most rigorous traditional interviews. They are also more expensive to design and much more painful to deliver. Some kind of interview is probably necessary for every serious tech hiring process. Organizations should be realistic about the low quality signal they'll get even from structured interviews. Take time away from interviews and feed that time to off-site work-sample testing.


I've tried work-sample testing with middling results:

-The more hoops a candidate has to jump through, the more likely they are to bail out of your recruiting funnel. This is especially bad for college/postgrad recruiting when you aren't the #1 employer in your field. Everyone wants to work for the Googles and Facebooks of the world. It's hard getting someone to spend a couple hours for your startup job.

-People cheat. We usually issue a short coding project, grade for correctness, then do a code review over Skype or face-to-face. Many candidates turn in the exact same responses. I've even seen people cheat by having a friend do the Skype session, with a totally different guy flying out. Do you proctor your test in a secure center? Use an online service to lock down their machine and record? Both are pretty invasive. Switching up the questions constantly is tough and makes your signal noisier.

-Industriousness and raw intellect trump skills/knowledge most of the time. Sure there's a baseline level of skill required to train someone quickly enough, like I wouldn't hire someone who didn't know basic data structures, but work-sample tests are often biased to those with a very specific background. I don't want employees who are great at doing what I need today. I want ones who will be great at figuring out what to do years down the line.


First: if you make candidates do work sample tests, you should reduce the amount of in-person interviewing you do to account for it. Up to a limit, most candidates would prefer your selection/qualification process to happen from their homes than from your office. Unfortunately, companies aren't serious enough about their tests to trust them, and do indeed tend to make this just another hoop.

Second, incorporate the work sample tests into your in-person (or even telephone) interviews. Now you have a (hopefully interesting) technical problem they ostensibly just solved to talk about. Your evaluation should be by formal rubric, not interview, but it's easy to see if someone actually did it. We had no problems at all with people cheating (of people we hired with this process, over about 4 years, we fired not a single one).

Finally, I could not be less interested in the kind of amateur psychoanalysis tech interviewers hope they're accomplishing as a side effect of quizzing people about code in their conference rooms.


I'm curious: in your structured interview process, what parts of the interview are set up for the person interviewing with your company to interview your company? Interviews go both ways. I've turned down more jobs than I've accepted due to the company doing poorly in the interview.


From the hiring side, I agree. There's no other way to smoke out those whose creative writing skills extend to their resumes, and you get a pretty good idea of personality differences between candidates if you use the same pool of questions for a given position.


This discussion misses an important element, the skill of the interviewer. It is unsurprising that unskilled interviewers' assessments are poor predictors of future performance. It would be interesting to measure the accuracy of interviewers who have had years of experience interviewing, hiring, and managing people.

Here's how I think it works. Skilled interviewers are biased toward rejecting candidates based on any negative impression. Structured interviewing has the same effect. It's the precision versus recall tradeoff. For this use case only precision matters. Extremely low recall is fine.

Also, in the GPA prediction example, the interviewer is penalized for predicting a low GPA for a person who performed well. But in hiring, there is no penalty for failing to hire someone who would have performed adequately.

(Yes, I understand there is an implicit assumption in my argument that candidates are not in short supply, but that's usually true, certainly at Google)


> For this use case only precision matters. Extremely low recall is fine.

Only if you have arbitrarily large amounts of time you can spend interviewing. Most of us don't.


> The key psychological insight here is that people have no trouble turning any information into a coherent narrative. This is true when, as in the case of my friend, the information (i.e., her tardiness) is incorrect. And this is true, as in our experiments, when the information is random. People can’t help seeing signals, even in noise.

People see patterns where there are none. I think this is fundamentally why humans fail at statistics. If every fiber of your being wants to see patterns then you will see patterns. Probably why people hallucinate when in sensory deprivation tanks as well. The brain will make up patterns just so it can continue to see them.

The paragraph right after follows up with the statistical failure that pattern seeking leads to:

> They most often ranked no interview last. In other words, a majority felt they would rather base their predictions on an interview they knew to be random than to have to base their predictions on background information alone.

So people would rather do busy work in order to continue to satisfy established pattern seeking habits than figure out a better way.
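
A quick toy simulation (mine, not from the paper) makes the point: streaks that feel meaningful show up constantly in pure noise. With 20 coin flips, a run of 4+ identical outcomes appears in roughly three quarters of sequences.

  import random

  def has_long_run(n=20, run=4):
      # flip n fair coins and check for a streak of `run` identical outcomes
      flips = [random.random() < 0.5 for _ in range(n)]
      streak = best = 1
      for prev, cur in zip(flips, flips[1:]):
          streak = streak + 1 if cur == prev else 1
          best = max(best, streak)
      return best >= run

  trials = 10_000
  print(sum(has_long_run() for _ in range(trials)) / trials)  # roughly 0.77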


Matt Mullenweg advocates auditions/tryouts instead of job interviews.

https://hbr.org/2014/04/the-ceo-of-automattic-on-holding-aud...


That has to be one of the most employee-hostile hiring strategies I've ever read.


"I'll pay you $50/hour to work on a project here, Monday to Wednesday. At the end, you'll either get the job or we'll find a different candidate."

I'd take that over every other bullshit interview process I've gone through in the past, while also feeling satisfied that it's a more accurate assessment of how I would actually perform at the job.


> The most significant shift we’ve made is requiring every final candidate to work with us for three to eight weeks on a contract basis. Candidates do real tasks alongside the people they would actually be working with if they had the job. They can work at night or on weekends, so they don’t have to leave their current jobs; most spend 10 to 20 hours a week working with Automattic, although that’s flexible.

It's far more extensive than three nights.


Ok, yeah, that's pretty ridiculous. In concept, though, I think these kinds of trials can be a win-win.


Honestly I don't find this a realistic option either (although better than the approach in the GP's linked article...). IME at many companies you will be in great shape if you're set up with accounts/licenses/permissions/etc so that you can theoretically do something useful by the start of your second week. Once you have all of that set up, you still typically need a few days or a week+ being brought up to speed on projects, the structure of existing code/systems/etc.

Personally I feel like if you can't come up with an interview process that gives you enough confidence to risk a 3-6 month onboarding period on a new hire, you're doing something wrong.


You're pretty much limited to hiring either the jobless or someone with the ability to work two jobs at once for an extended period of time while you hold your tryout. There are a lot of people who can't devote that kind of time to a second job (some candidates burn a week's vacation!).

And it's not some mere formality, according to their stats they hire about a third of the people they try out.


> or someone with the ability to work two jobs at once for an extended period of time while you hold your tryout.

This is the part that's crazy to me. If you think you can accurately judge someone based on the work they perform after a full normal workday/week, I have some beachfront property on the moon to sell you...

I have a feeling their conversion rate is so low because a lot of people get another offer during their trial and immediately jump ship.


> I have a feeling their conversion rate is so low because a lot of people get another offer during their trial and immediately jump ship.

Or because they're just selecting against good candidates in the first place; if you've got a good job and are performing well elsewhere, you're less likely to jump through that kind of hoop. It's easy to persuade yourself you're hiring the best candidates by being strict about selecting from a pool that is already heavily biased (I'm pretty sure Joel Spolsky wrote something along those lines years ago, so this is hardly a new insight).


I presume they only hire about a third of the people who try out for the same reasons you mention; the only candidates are the jobless or people willing to screw their current employer and moonlight while "working from home".

Sounds like a recipe for a bunch of bad candidates.


And don't forget anti-moonlighting clauses in current employment contracts.


As someone who has worked at every level of IT (startup to Fortune 500 executive), hired thousands of people, and personally interviewed hundreds of candidates of all levels of experience, the conclusion I have come to is that interviews are almost entirely worthless.


It's interesting, because I'd argue that the best companies in tech have interviews that are very structured and predictable.


Is that true? Isn't the generic accusation against AmaGooFaceSoft that your interview performance largely depends on whether you happened to study some interviewer's pet topic (e.g. parallelism, dynamic programming, networking, etc.)?


Having done interviews at a few of them, I'd say that's mostly false, at least for the last 2-3 years.

As far as I can tell, there's always a pre-defined pool of questions. Some of them are pretty 'open ended', but still targeted towards getting a good impression of a certain area of knowledge.

That being said, I had an interview with Google at some point in the past where one of the interviewers almost seemed appalled that I didn't know the exact list of items that can be found in a filesystem superblock. But at all other companies it seemed a bit more sane. I guess it's partially a function of the type of personalities a company is willing to hire :)


Careful. First and most obviously, the "pool of questions" concept is structured only if you have a Google-scale volume of candidates and are reliably assigning specific subsets of those questions to candidates.

But even then, for such a process to be rigorously structured, the questions themselves need to be determined a priori and without reference to the candidate's background or preferences. Otherwise:

* You're subject to the interviewers' and judges' (probably subconscious) biases about the merits of different questions.

* You're more exposed to the candidate's own innate ability to interview well by navigating themselves to more easily-answered questions.


Recently I failed an interview at Facebook because I chose to serialize a binary tree into a list of N items, rather than a list of N items and O(N) sentinels. Most of the interview was spent convincing the interviewer that this could possibly be correct.


I recently failed a programming test after acing all the exercises except one where I needed a text diff and pulled in a library instead of writing my own in less than ten minutes. They said I need to get better at algos, which is fair. But I don't know anyone who can program a decent working text diff in less than ten minutes.


Based on your description, it sounds possible they were just looking for a Levenshtein distance implementation, which is definitely in the Universal Weird Corpus of Interview Questions and for which people who prepare for interviews a ton would have a good shot.
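
For the curious, here's roughly what gets memorized -- a sketch of the standard textbook DP, not any particular company's official answer:

  def levenshtein(a, b):
      # prev[j] holds the edit distance between the processed prefix of a and b[:j]
      prev = list(range(len(b) + 1))
      for i, ca in enumerate(a, 1):
          curr = [i]
          for j, cb in enumerate(b, 1):
              curr.append(min(prev[j] + 1,                # delete from a
                              curr[j - 1] + 1,            # insert into a
                              prev[j - 1] + (ca != cb)))  # substitute (free on match)
          prev = curr
      return prev[-1]

  assert levenshtein("kitten", "sitting") == 3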


I ported a diff algorithm (a simple one at that) just a few months ago because the language I was working in didn't have one. I couldn't do it off the top of my head today, or probably the day after; it was transient knowledge that I didn't even try to remember.


Why did you conclude that you failed because of that? That sounds like a pass with flying colors. Did you specifically get feedback about that segment?


In the few remaining minutes after we agreed that it was possible to correctly deserialize a tree serialized in this way, I did not produce a linear time deserializer.


:-(


Was it a binary tree or a binary search tree? A BST can be uniquely identified by its pre-order (or post-order) traversal alone, whereas a binary tree requires an in-order traversal plus one other -- or sentinels encoding a state machine [PUSH_LEFT, PUSH_RIGHT, POP] that specifies how to move down and up the tree structure.
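
To make that concrete, here's a sketch of the no-sentinel BST case (purely illustrative, assuming distinct keys): serialize to a plain pre-order list of N items, then rebuild in linear time by passing down (lo, hi) bounds.

  class Node:
      def __init__(self, key, left=None, right=None):
          self.key, self.left, self.right = key, left, right

  def serialize(root):
      # plain pre-order: N items, no sentinels
      return [] if root is None else (
          [root.key] + serialize(root.left) + serialize(root.right))

  def deserialize(preorder):
      it = iter(preorder)
      peek = [next(it, None)]  # one-item lookahead

      def build(lo, hi):
          key = peek[0]
          if key is None or not lo < key < hi:
              return None
          peek[0] = next(it, None)  # each key is consumed exactly once -> O(N)
          node = Node(key)
          node.left = build(lo, key)
          node.right = build(key, hi)
          return node

      return build(float("-inf"), float("inf"))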


It was a BST.


n sentinels? What do you mean?

Do you mean characters to indicate a null child?


This accusation is almost always from people who haven't studied that particular pet topic. In my experience, the pet topics you list - parallelism, dynamic programming, networking, etc. - are needed quite frequently for those jobs at AmaGooFaceSoft. In other words, those candidates would be poor fits for those jobs, which is exactly what an interview is trying to test.

I think the Google hiring process is fucked up in a lot of ways, but I don't think reliance on whiteboarding or the selection of interview questions are two of them.


> This accusation is almost always from people who haven't studied that particular pet topic

Well sure, but you only have so much time to study for these interviews.

So you make sure you review your basic data structures, algorithms, and complexity analysis -- the things that will most likely show up.

Then you have to consider the more advanced topics. Realistically, you're not going to have an expert-level understanding of all of them. None of them are extremely difficult concepts to understand, necessarily, but you need the kind of mastery that lets you digest such a problem quickly and under pressure (since it's a short, timed interview).

The most effective way to study, then, is to simply run through a battery of dynamic programming questions and hope you get lucky and the interviewer asks you a variation on one you studied.

I've had more than a few AmaGooFaceSoft interviews ask me slight variations on the change counting question. They're really testing my ability to recall the solution to that problem, not any sort of deep understanding of dynamic programming.
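
For reference, the memorized solution in question looks something like this (the standard count-the-ways DP -- a sketch, not any company's rubric; the variations are small tweaks on it):

  def count_ways(amount, coins):
      ways = [1] + [0] * amount          # ways[0] = 1: the empty combination
      for coin in coins:                 # looping coins on the outside counts
          for total in range(coin, amount + 1):  # combinations, not permutations
              ways[total] += ways[total - coin]
      return ways[amount]

  assert count_ways(10, [1, 2, 5]) == 10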

On the flip side, the interviews I've failed almost always involved some pet topic that I wouldn't dream of studying. I once (no joke, and not THAT long ago) had an interviewer at AmaGooFaceSoft ask me to do a bunch of calculus questions involving power series (turn a function into a power series, determine the radius of convergence, etc.). Why would I study that for an interview?


> Well sure, but you only have so much time to study for these interviews.

Considering the payoff (potentially a $40,000 to $100,000 raise on a 40-hour work week), I think the amount of time needed to study for these types of interviews is trivial. Interviewing skills can be used at many well-paying companies because this type of interview has been standardized, so it's not like you're only studying specific knowledge for one company.


Are those topics really needed for (what you're implying) is the majority of jobs at those companies? I'm not sure if there's any way to really test this.


Yes. This is the reality of distributed computing: most of the problems you want to solve do not have off-the-shelf libraries. Rather, you need to know an algorithm, and various algorithmic-design techniques, well enough that you can decompose it into steps and then recompose steps so that they can be partitioned among many different computers, with appropriate failure modes if machines or network connections go down. That requires intimate knowledge of both the algorithm and of the types of problems you run into in a distributed setting.

MapReduce, Pregel, Bigtable, Flume, etc. are building blocks: they solve some of the distribution problems, but you still need to understand, step by step, how the algorithms that run on top of them work in order to build on them.
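
A toy illustration of that decompose/recompose idea -- word count expressed as a per-shard map step plus an associative reduce step, so the work could in principle be partitioned across machines (illustrative only; real systems layer failure handling on top):

  from collections import Counter
  from functools import reduce

  def map_phase(chunk):        # runs independently on each shard
      return Counter(chunk.split())

  def reduce_phase(a, b):      # associative, so partial results can be
      return a + b             # combined in any order, on any machine

  shards = ["the cat sat", "the cat ran", "a dog ran"]
  total = reduce(reduce_phase, map(map_phase, shards), Counter())
  assert total["the"] == 2 and total["ran"] == 2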


Is that actually a majority of jobs at Google? I know something about the work that several of my friends working as SREs and SWEs do and it doesn't sound like it involves a lot of distributed systems programming.

Let's say you work on some part of Android. Obviously you need to interact with things like Google's build system which are distributed, but are you really implementing some distributed computation in the course of your every week, or even every month?

I get that Google wants to test during the interview for suitability over a large space of possible specific roles, but I seriously doubt that "distributed systems stuff" would be in the top 10 domains of programmer knowledge useful in those roles. Is it more useful than knowing how to work with version control well? Everyone at Google has to do that, but they don't test it during the interview. Is it more useful than being able to read and write idiomatic and readable Java? They don't really substantially test that during the interview either.

(On the other hand, the things that spawned this conversation were "dynamic programming, parallelism, and networking" and the latter two are much more obviously generally important things.)


It was at the time I was there (2009-2014). There's another Googler above who says it's not like that anymore, which is possible, but these were things you needed to know across Search, GMail, YouTube, Plus, Docs, and Infrastructure while I was there.


It entirely depends on what project you work on. When I interned, I had to deal with a lot of custom data structures, but since I started full time, I haven't (although my next project might change that?)


People always talk about the whiteboard interview, and assume it has the greatest weight, but the other questions are just as important. There are plenty of things being measured, and technical ability is often not the one that disqualifies a candidate. The other questions do tend to come from a standardized bank of questions too; the whole thing is very structured.


This. I wonder how many people "think" they flubbed an interview strictly because they got some detail wrong on the whiteboard when it was really far more related to a negative impression of grit, communication, motivation, personality, or work-history.


And we will never know because the vast majority of companies refuse to give even the vaguest of feedback on the reason they are not moving forward. Not even “it was your technical knowledge” or “it was communication skills”


I think it's a lot! If you had a reasonably decent answer to the whiteboard questions it's often fine if you don't actually get it right.

There are a lot of other ways to screw up an interview.


As I said upthread, I think this is not true, at least not in the sense that the author is using. Could you describe some of the structured processes you've seen?


When interviewing for smaller orgs you do have to answer the question: can I work with this person everyday? Subjective and arguably harder to predict than technical performance, and oftentimes more important.


Known useless indicators:

* Resumes

* Skills tests (HackerRank)

* Whiteboard interviews

* Unstructured interviews

* Employee referrals

No wonder headhunters have such a good business. Not that they're more discriminating, but they can pretend to be the solution to an intractable problem.


> No wonder headhunters have such a good business. Not that they're more discriminating, but they can pretend to be the solution to an intractable problem.

Do companies view them this way though? I've always thought that hiring external recruiters to bring in a bunch of candidates is done to outsource the scummy task of bothering people via phone, linkedin etc.


I agree with most, but referrals are one of the most important indicators: "A's hire A's, B's hire C's". A known issue, however, is reduced workforce diversity given the heavy use of referral methods.


All my referral tells you is that I have a friend who needs a job and that I want $5000. Whether my friend is any good or not is something you'll find out after I cash my check.


The traditional recruiting and hiring process is broken. I say this as a former technical recruiter. I wrote about the problems of recruiting for a closed-source startup here:

https://www.linkedin.com/in/chrisvnicholson/recent-activity/...

I mention closed-source for a reason. For technical hiring, there is nothing better than open source. Open-source projects allow engineers and their potential employers to collaborate in depth over time. The company can experience whether the engineer is competent, reliable and friendly. The engineer can judge the team's merits in the same way. And they can both decide whether the fit is right.

Closed-source and/or non-engineering jobs are the opposite. You get a resume, a Github repo if you're lucky, and a half-day's worth of interviews and tests. Then you roll the dice on that imperfect information.

This is one reason why a lot of recruiting and hiring happens through the networks of people that a company can tap into. It may seem corrupt or nepotistic, but the advantage of those referrals is that someone with more information than you is willing to stake their reputation on a candidate's performance.

Large companies with lots of historical data have the opportunity to train algorithms to learn how job applications and long-term performance/flight risk/etc. actually correlate. From what I can tell, most haven't.


Hiring candidates through interviews or through a known-contacts list is just about taking a chance. There are several unknowns which only come out in the course of time.

I agree with vonnik that running the project as open source definitely gives an edge in the interviewing process. The potential employee has already seen your code (or you can make sure that he has) and knows what he/she is getting into. I would seek out links to their contributions to other projects, which would help me evaluate their code.

Of course, less than 5% of job applicants today have had the opportunity to develop for open source. (Either the folks contributing to open source love their jobs so much that they hardly look around, or they are already being offered jobs without looking.)


It's not surprising. Employee selection is basically voodoo. Outcomes don't get fed back into redesign of the process, and the process is far more based in tradition than data. When the process gets challenged it's ripe to fall apart.


The sample size is too small to provide any useful feedback, and humans are too complex to be mapped/binned the way interviews attempt to do. Not only that, but things that happen after hiring may have a bigger influence on the success or failure of the hire than the factors the interview is looking for -- e.g. who you end up working with, internal politics, the specifics of the job.

Kahneman's work on the illusion of validity comes to mind. It kind of boils down to: our hiring process must be right because we're doing it.


But surely structured interviews just test a candidate's ability to improvise plausible stories? Whether they are truthful or not is a different matter..


Structured interviews improve the ability to systematically test the relationship between policy and results. Real experiments are then possible.

As opposed to a situation where your interviewers are asking off-the-cuff questions, and you're uncertain whether the questions have a systematic bias toward a bad direction -- racism, sexism, or just selecting for something idiosyncratic that adds unnecessary constraint to your candidacy pool. You also now don't know whether it's your policy that's systematically bad (or good!), or that your interviewers are systematically bad.


And yet unstructured interviewing basically works and the world keeps spinning. If 'random' truly was a better result (as the article suggests), then in a hiring round for a programmer last year, we might have discarded our candidate that won and is awesome for the one that couldn't conceptually 'get' FizzBuzz.

Or put another way: if 'random' were better than 'unstructured', you'd never have a round of hiring end with no hire because all the candidates were unsuitable -- one would simply have been chosen at random.


Better to give someone a job related task perhaps?


Being able to improvise plausible stories is an important aspect of job performance, both internally and externally.


One aspect amongst many?


The best part about structured interviews is that they give you historical data on the interview's performance.


As someone with many interviews coming up in the near future, this scares me. It's easy to get in a self-conscious feedback loop when you know every behavior, response, and gesture is being fed into a fundamentally irrational character-judging process.

The best interview I've ever been on was one for a young startup. They gave essentially a homework problem, a day to solve it, and then in the interview we talked about the problem and my solution. The worst interview I've been on was sitting in front of multiple engineers as each one threw out a random CS question (from seemingly the entire space of CS) and asked me to talk intelligently about it. When I seemed unsure of myself, they glanced around nervously and disapprovingly.

Interviews are the worst. I've spent my time trying to bolster my OSS projects, so that I can point to them as evidence of my competence, but I can't help but prepare for the worst anyways.


"...fundamentally irrational character-judging process" is really the wrong impression to go into interview with as a candidate. Seriously, you will handicap yourself if you see it this way.

Competency is only one component of a hiring decision. After some base threshold of competency, the question then becomes whether or not you, the candidate, are going to be someone that the team WANTS to work with.

Bolstering your OSS projects is fine but you'll get a better return on investment for your time to practice "behavioral interview" questions. This is absolutely the hardest type of interview but in the hands of experienced hiring managers it works better than any other technique, IMHO.

If you can find an experienced mentor who will do mock behavioral interviews with you and give you honest constructive feedback, that will boost your interview skills more than anything else.


Good advice, thanks. My experience so far has been difficulty appearing confident on the technical portions of the interviews. Other devs seem to sense blood in the water when nerves and shyness have you fumbling through whiteboarding obscure topics.


Relax. Look, I've spent time at small startups in the tech sector, then at a tech giant, then in finance (where I went on to found a small shop of my own). I can safely say that it's pretty random and all you can do is increase your odds by studying.

At the end of the day, it's pretty well known that at any big tech company a second pass through the interview process would easily cut out a lot of the people there. Could be a bad day, a question you don't remember, etc.

I personally now only ask the homework type of questions, making sure to give the person plenty of time. It works really well, but obviously someone won't like it (though I've yet to run into the mythical person turned off by having to do a simple coding project that HN keeps talking about...)


They're useless if it's an attempt to prove you know more than them about some algorithm that's been implemented 150 times (every job interview in California). I'd rather work with someone pleasant, hard-working, and concerned with everyone's well-being.


Hell, all I want is someone that could easily learn and understand an algo they're not already familiar with if needed. I remember being asked to basically implement the Day-Stout-Warren algorithm in an interview a few years ago and wondering what that person really learned about me from that memorization exercise.


Articles like this - and the comments that follow - always overlook the primary value of job interviews, which to me is answering the question: "Do I want to work for this company?"


They're not ignoring that value; that's simply not what they're writing about. I don't think anyone is advocating for sight-unseen hiring. The problem is with using interviews as a candidate qualification mechanism, where they simply do not work.


And articles like this are almost universally written from the perspective of someone being interviewed, rather than the one interviewing.


The article that I just read following the submitted link was from the perspective of neither, but of someone looking at the hiring process as an observer. The anecdotes are anecdotes and they don't matter except to lighten up a newspaper article for the masses, so we can ignore them - I'm talking about what was actually studied.


Sure, because poor management, crippling technical debt, and a toxic work environment are totally plainly presented to you in an interview. /s


Even a couple of hours staying in the office and talking to people might not reveal this. Management issues and technical debt are not visible until you work for the company.


Not plainly, no, but easily inferred.


You should publish because I know I and a lot of other people would be incredibly interested in divining this information during a job interview.


The best I've come up with is a few proxies, like their deployment process. Someone doing continuous or frequent deployments probably has pretty decent code quality. A company that takes several weeks, several rounds of QA, and a dozen release documents signed in triplicate has that process because they've been burned and need scapegoats when the inevitable happens.

The best part is that most people at companies like this aren't aware that it's not normal, so they'll be open about it.


Yeah, that's basically all it takes. Talk to engineers about their everyday processes. (If the company doesn't let you talk to their engineers during the interview process, or if they're unwilling to discuss, consider that a huge red flag.)


All this article says is that past performance, in terms of GPA, is a better predictor of future performance than a 30-minute interview. This seems like a basic truism to me, and it's the main lesson that tech hiring processes can take away from this article.

In my experience (having hired > 100 engineers), one of the basic problems with tech hiring as a whole is that it misunderstands the point of a technical interview. Organizations and hiring managers see the interview process as a way of improving the brand of the engineering organization -- "We have super high standards, and to prove this our interview process is really hard; therefore, if you think you meet these standards you should apply." This leads to the current trend of super academic/puzzle/esoteric-technology-based interviews. Applicants leave those interviews saying it was super hard, reinforcing the brand messaging (classic marketing).

Rather, in my experience, the best results come from viewing the hiring/interviewing process for what it is - an attempt to predict future performance (and specifically performance at your organization) using a variety of techniques, of which interviewing is one. In this context of attempting to predict future performance, interviews are not a great tool - better to look at specific past performance.

Past performance is always the best predictor of future performance and the point of a technical interview, in my mind, is to critically inspect that past performance to understand how closely it relates to the future performance that your organization needs.


My basic problem with interviewing is you are observing behavior in one sort of situation, and on that basis trying to predict behavior in a very different sort of situation, namely job performance, which is actually a whole bundle of different types of situations.

It seems like it would be much better to instead put the job prospect in situations that model the sorts that would come up at work.


I wonder if we can extrapolate to marriages, and to how arranged marriages (at least in India) have a higher success rate. Usually the parents on either side decide on a match based on family background, financial stability, educational background, etc., rather than letting the to-be-married decide.


What is 'success'?

My evidence is anecdotal, but I've spoken with multiple Indian women who are in arranged marriages, have zero real love in the relationship, and aren't particularly happy, but the idea of divorce is entirely unthinkable to them. The lack of divorce probably makes this a 'success' by most metrics, but doesn't seem particularly successful to me.


Perhaps, but then one would have to define a "successful" marriage, which is very subjective (even if only Indian families are scrutinized).


> In one experiment, we had student subjects interview other students and then predict their grade point averages for the following semester.

Not sure how using inexperienced interviewers proves anything. Would have been more interesting to have lecturers interview the students.


There's a reference in the article that covers that kind of scenario.

  The additional 50 students that the school interviewed but
  initially rejected, did just as well as their other
  classmates in terms of attrition, academic performance,
  clinical performance, and honors earned.


I am a recruiter. Recently, I started working with a new employer. We could not get anyone through their interview process. Eventually I asked the HR person to clarify precisely what was being asked in these interviews.

She said that, essentially, the interviews were ad-hoc, with the interviewer just coming up with whatever questions they thought relevant based on the resume - often asking the candidate to go through their career history.

I explained that the only effective approach I have found with recruiting is to have a set of pre-defined questions, and each question is specifically designed to give insight into how the candidate meets the pre-defined job requirements. Very much like software development, where test cases are related to software requirements.

I explained also that it is not critical to stick precisely to these questions, but that should mostly be the case - interviews are human interactions and some flexibility is required depending on circumstance.

The HR person then explained this to the hiring managers at the company, and worked with the hiring managers to define interview questions that give insight into the job requirements.

The next two people interviewed got the jobs, after months of no one getting through the interviews.

In the early days of software development, the business was often dissatisfied with software delivered because it simply did not meet the requirements of the business. So the software development process matured and came up with the idea of tests that can be mapped back to the requirements via a requirements traceability matrix. Thus the business has a requirement, the developers write code to meet the requirement, and a test is designed to verify that the software meets the defined requirement.

Recruiting currently has no such general understanding in place of the relationship between job position requirements and the definition of quantifiable questions that identify to what extent a given job candidate meets a requirement.

Once you get your head around the idea that recruiting should be very similar to software development in this regard, then it is easy to see that ad-hoc interviews do nothing to verify in any organised way to what extent a candidate meets the requirements of a given job opening.
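
To sketch the analogy in code (all requirement and question names invented for illustration): every question exists only because it traces back to a stated job requirement, and scores roll back up to those requirements, just as test cases trace to software requirements.

  REQUIREMENTS = {
      "REQ-1": "Designs maintainable service APIs",
      "REQ-2": "Debugs production incidents methodically",
  }

  QUESTIONS = [
      {"id": "Q-1", "req": "REQ-1",
       "prompt": "Walk me through versioning a breaking API change."},
      {"id": "Q-2", "req": "REQ-2",
       "prompt": "Describe, step by step, an incident you diagnosed."},
  ]

  def scores_by_requirement(scores):   # scores: {"Q-1": 0-4, ...}
      per_req = {}
      for q in QUESTIONS:
          per_req.setdefault(q["req"], []).append(scores[q["id"]])
      return {req: sum(s) / len(s) for req, s in per_req.items()}

  print(scores_by_requirement({"Q-1": 3, "Q-2": 4}))  # {'REQ-1': 3.0, 'REQ-2': 4.0}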


> I explained that the only effective approach I have found with recruiting is to have a set of pre-defined questions, and each question is specifically designed to give insight into how the candidate meets the pre-defined job requirements. Very much like software development, where test cases are related to software requirements.

This creates a system that can be gamed, though. The interviewees can pass this information back to the recruiter, and the recruiter can coach future interviewees. The recruiter has a vested interest in placing people.

It would be better to have a thousand questions that are randomly selected.


You hear lots of these stories about stupid interviews. You rarely hear the stories about the horrible, terrible employees that weren't weeded out, got hired and did great harm to the company and their coworkers.

Interviews can be good and bad; I'd venture to say that many a horrible hire has been avoided by having an interview at all. Thus, don't make perfect the enemy of good, and try to improve on good.

The set of potential bad hires is vast compared to the good hires, and that ratio is only remedied by good filtering before and during the interview.


> You rarely hear the stories about the horrible, terrible employees that weren't weeded out, got hired and did great harm to the company and their coworkers.

Actually, stories about horrible co-workers that weren't weeded out by interview or any other hiring process are quite common workplace stories in every field, including tech.


We have entire websites dedicated to them: http://thedailywtf.com/ . If they did great harm to the company, though, that's the company's fault for keeping them and/or not monitoring them.


No interview will tell you the future, so to my mind, the only thing the interview can tell you immediately is how much someone knows and whether their personality will mesh with the company's culture.

In order to ascertain this, I propose job-hiring hackathons. Have the company hold a mini-hackathon, once every 2 weeks or once a month, where all job applicants must show up and work on projects (corporate employees' presence can be optional). Just watch them complete the projects and hire the best candidates.


One thing interviews can't select for: creativity and motivation. And in tech those two criteria are the most vital, especially motivation. I can easily fill in the skills gap in someone who's motivated. I can't do anything with someone who doesn't give a shit, even if they're the second coming of Albert Einstein. So folks, please, don't apply for jobs you don't really care about. Save yourself and your prospective employer time, aggravation, and the opportunity cost.


I find it unfair that you're downvoted. I mostly agree with you.

However, "don't apply for jobs you don't really care about" is a very 50/50 advice. Right now I have zero money reserves. If I somehow got fired tomorrow, I'll be in the red even after 2 days of unemployment. So sometimes you have to make a hard choice.

That being said, it's good to be open about this once you get your act together months later and decide what to do about your current employer -- over lunch, for example.


Well, don't wait to get fired then. Find a better paying job you like and go for it. It doesn't seem like you have much to lose anyway, and the best way to increase your paycheck is by moving around and not letting employers take you for granted. Just don't sell your soul for a buck in the process. This game is a marathon, and grinding it out never really works in the long term.


I fully agree, and that's exactly what I am doing -- even if an offer from another company ends up only being a leverage to force a raise in my current company (see below for clarification). However, I am taking it slow and I am patient (even though NOT being able to randomly go to the cinema or a restaurant with my girlfriend is getting on the nerves of both of us lately; money is tight and I'm very unhappy with my current compensation) because I don't want to replace one problem with the same problem in another company. So I am picky, I am clear in my requirements, I don't accept terms I know will make me hate the job, and I am perfecting my negotiating skills during this entire process.

CLARIFICATION on the leverage remark: it's my opinion that 99% of the time, leveraging an offer from another company that wants to give you more money to make your old company give you more money is a huge mistake. Most businessmen HATE being strong-armed, or, to use milder language, hate being shown that their employees have power over them, and this makes them hate you even if they very much need you in a business sense. They end up actively looking for a way to get rid of you, even if it costs them more money and/or stress in the long term. I've witnessed it.

SOURCE: 4 of my stupider younger acquaintances from 7-12 years back. And an observation from my first job. After I "strong-armed" my first employer into doubling my then pretty measly salary, he went on a hunt to replace me (even though it took him around a year to really do it), but I was smart enough to detect the signs and resigned long before he had the chance. No regrets.


I believe a lot of places hire candidates through consensus, meaning some members of the team accept or reject a potential candidate. When enough accept the candidate, the hiring is done.

Is there any company that tracks who rejects a particular candidate during an interview process, and how often that negative feedback turned out to be true? I guess with the turnover rate at today's tech places, such tracking of an interviewer's record is not really possible?

I always wonder about this.


Structured interviews are better than unstructured ones, but in my experience they are really a Trojan horse for the idea that interviews in all forms are largely worthless (as predictors for good hiring).

Once you start collecting data on your hiring pipeline, work-sample hiring becomes so obviously better that it makes little sense to spend the time to do the hard work of building a good structured interview process.


> So great is people’s confidence in their ability to glean valuable information from a face to face conversation that they feel they can do so even if they know they are not being dealt with squarely. But they are wrong.

If people utterly refuse to learn from proven mistakes, then all hope is lost. Einstein was right, human stupidity is infinite.


We had 2 slackers on the team. One jumped directly to Google.

The other jumped around for a few years, got laid off by some company, and recently joined FB.


One of the most incompetent developers I've ever known got hired at Facebook, then Apple. Every line of code the guy wrote was just... exceptionally bad and poorly thought out.

I guess he was good at whiteboard exercises, though?


Odd - I was trained to do competence-based interviews 20 years ago; apparently this is now rediscovered knowledge or something!


The problem is that most people have no training in how exactly to interview someone. It's like asking people with no training in building rockets to get you to the Moon and then deciding that, because they failed, rockets are "utterly useless".


I'm sure it has been discussed before, but what factors push people away from, say, two-day internships? Is it because these things are so short that they too can eventually be gamed? Or is it because you need staff of the same specialty dedicating their resources to a potentially short-lived investment?


If you care about your company's culture, a person's humbleness, art of concise debating, etc - more important parameters than sheer GPA or coding skills imho - you can never do away with in-person interviews.

Calling them "utterly useless" is utter click-bait.


It certainly does feel like job interviews are a coin flip to me after having done many interviews.


> Alternatively, you can use interviews to test job-related skills, rather than idly chatting or asking personal questions.

Alternatively? How is this not the focus of an interview?

Sure, confidence and social skills are important, but obviously they cannot predict a person's actual ability.


The issue with Silicon Valley interviews is that the field is leading a new paradigm of management styles that deal with knowledge and creative work.

This shift MUST be accepted by everybody or ostracism is risked. (Like Trump supporters.)

But paradigm shifts take time, and the majority of managers still want cogs. Instead of filtering for cogs, though, they have to dress the filters up as filtering for an "I give smart people freedom" team, and the convoluted mental gymnastics needed for this creates shitty interview processes.

All "well this technique worked for us" stories are mostly Not useful because tehy are just N=1 stories about managers using their preferred filters.

The issue isn't with the filters themselves (all sorts exist) but with a culture that obligates everyone to put on false facades.

People who just want to be paid and can work well need to pretend to be passionate. Managers who want well-paid cogs need to pretend to promote individualistic thinking etc...


More accurately, the article is about the use of unstructured free-form interviews.


For some time now we have been moving towards word of mouth and peer recommendations as the preferred way to hire new personnel. Cold applications are for outsiders and as such a very different market, with all the strings and the bulls.it attached.


Well, the worst interview I've ever had was one where the interviewer wouldn't deviate from the script, even after he recognized that the questions made no sense for someone with my background (I was self-taught, and had never managed my own memory or written a sort algorithm; they were also irrelevant to the position in question).

It was painfully awkward.

It was also a fantastic way to accidentally discriminate against women and older candidates.

I'm not saying anyone should conduct an interview completely by the seat of their pants, but please don't encourage this foolish consistency.


Does it bother anyone else that the example given in the article (showing up 25 minutes late) is judging interviewing by its worst rather than its best?


I'm coming late to this discussion, but there is one point I'd like to make. Our "interviews" in the world of software engineering are immensely different from "interviews" in the standard sense of the word.

I've worked in different fields, and I talk to people who work in other fields. Most of those fields work in the way described in this article - interviews are question-and-answer sessions, where people are evaluated by a number of highly subjective criteria. "Tell me about your fundraising experience." "How do you deal with difficult clients or coworkers?" That kind of thing.

Software interviews are exams. They're not "more like" exams, they are flat-out exams. There is very little banter. The closest I've come in Google and Netflix interviews has been the more open-ended system-design style question they often put in there, but even that has an academic-test quality to it.

It's pretty much 5 hours of technical exam. "How do you find all matching subtrees in a binary tree" might be a question - and you really are expected to get it written at the whiteboard. "Find all permutations of a set." "Find all square sub-matrices in an NxM matrix." The "top" companies are good at modifying the question so that you must know how to do this but can't just regurgitate it.
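
To make that concrete, here's a rough Python sketch (one of many acceptable versions) of what "find all permutations of a set" looks like when you're expected to produce it at the whiteboard:

    def permutations(items):
        # Base case: the empty collection has one permutation, the empty tuple.
        if not items:
            return [()]
        result = []
        for i, x in enumerate(items):
            # Fix x in front, then recurse on everything else.
            rest = items[:i] + items[i + 1:]
            result.extend((x,) + p for p in permutations(rest))
        return result

    print(permutations((1, 2, 3)))  # 6 tuples, from (1, 2, 3) to (3, 2, 1)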

Alternatively, you may do a "take home" exam. Most recently, I did a mini rails project. I actually liked my result; I kind of enjoyed writing it. However, it was a no-hire; one reason given was that my routing was non-standard. True. I hadn't really thought about it - it was a take-home, so I mainly focused on the UI and code, and just chucked in a couple of named routes for demo and testing purposes. The other reason was that there was some duplicate code (I disagreed and had a reason for this, but there is no chance to defend your code: you write it and send it in, and they say "no hire").

I have no idea if it was a real piece of crap and they were just being nice. It had 100% test coverage, used git for version control, and implemented a few features. Unfortunately, like I said, I never got a chance to defend the code.

Our processes in high tech are badly broken. I'm probably done interviewing; my next job will have to be one that doesn't involve a software interview. The routing and duplicate code, along with a Google interview, pretty much sealed the deal for me.

My advice to people is (this isn't my idea) be an X that programs, not an X programmer. Coding is an amazing tool for a job, but avoid making it your job. For instance, I actually know a fundraiser who does a lot of data science, and he's a rock star in his field, but I guarantee you nobody asks him to reverse a binary tree in an interview!

Best of luck out there. Our interviewing processes are their own special version of horridness, just not the uselessness described here.


I have an interview with Apple for a mechanical engineer position. I will report back once it's done.


Please do. I'll be watching for your response to this message.


How do you know that the person you'll be marrying won't cheat on you and won't leave you in hard times? If you apply the methods we use today for interviews, you'll end up with a 50/50 chance at best, a coin toss.

Yet why do some marriages last forever (till death do us part) while others fail miserably or crumble even after 20 years?

The search for the global optimum cannot be performed by asking a set of questions. I argue that it cannot be done consciously. It's a gut/instinct thing. If you have a mechanical approach, anyone can game the system and get a job because humans can be like chameleons to present themselves as the right candidate, and they can study for the interview. The only way IMO is to have that 3rd eye or whatever you call it... instinct, gut feeling, etc.

The problem with this conclusion is that instinct and sexism/racism are often conflated.

No good answer.


This article is critiquing the unstructured get-to-know-you interview. It is not critiquing highly structured interviews with clear goals, nor technical challenges, nor reference checks or past performance.

In your example, how would you structure a series of questions and procedures to limit the risk of marrying someone who will abandon you or cheat on you? I think you can apply good interview process techniques to this quite well!

1. Have they been divorced or cheated on someone before? People who have been divorced before are much more likely to divorce again. The 50% divorce rate in the US is slightly misleading, as many of the divorces are concentrated in repeat divorcees. In fact, among younger (early to mid twenties) first-time marriages, the divorce rate plummets to something like 15-20%.

2. Have their parents been divorced or cheated? Children of divorced parents are much more likely to get divorced themselves.

3. Have they been physically, emotionally, or sexually abused as a child? People with traumatic early childhood experiences are much more likely to develop trust issues with long-term partners, especially if they never had extensive counseling.

4. Do they have a good relationship with their family? People who have a difficult or unstable relationship with multiple family members are more likely to see tumultuous relationships as a norm.

This is all equivalent to reference checks in a job.

Then a long dating / engagement period is necessary. How do they treat you during this period? Do they cheat? Are they abusive? Do they leave you during a period of difficulty? Do you have the same religious views? Do you split housework evenly? Do you both want kids? How do you view money? The three most common reasons for a fight among couples are 1) money, 2) housework, 3) free time (and how to spend it).

People who are otherwise happy and well-adjusted adults who get married and then divorce bitterly after 10 years are not the norm. Most divorces can be predicted. And most divorces happen before 2 years of marriage. If you are aware of the warning signs and are not blinded by a "gut instinct", I think you can definitely minimize the potential for marrying a snake-in-the-grass.


The trouble with this advice is that, while it's accurate if your one and only goal is to predict the likelihood of someone divorcing or cheating on you, it seems profoundly unfair.

To examine this "unfairness", let's imagine it at its most extreme: a society in which divorcees are so stigmatized that it's practically impossible to ever re-marry; in which children of divorcees are likewise stigmatized; in which victims of child-abuse are further victimized by a society that considers them potential "snakes-in-the-grass". Do we really want to live in such a society?

Obviously I don't think you were advocating this. But it's a thought experiment which demonstrates a classic class of problem: what's good for the individual isn't always good for society, especially taken to extremes.


Isn't that true for any system you use to predict? If you rely on "gut instinct", then you filter out people who aren't good at fooling gut instinct. Ugly people, short people, people with poor social skills, autistics, etc. That's hardly fair either.

If you want to make the most accurate predictions possible, you absolutely should not use gut instinct. If you want to be fair, then have a lottery or select randomly. You can't have both. There's nothing remotely fair about gut instinct. See, e.g. judges giving unattractive people twice the sentences of attractive ones. I can provide tons more examples of stuff like that. Gut instinct should be illegal.


The other problem is that it assumes that divorce is a bad outcome to be avoided, while a long marriage must be successful and happy; neither are axiomatic.


I'm not sure I see your point. What GP is proposing is a risk assessment strategy. If your partner ticks all of those boxes (for example), then you'd better think long and hard about whether the relationship is viable without being blinded by "love".

Anyone from any background can either rise above their circumstances or fail regardless of the help they get. This should never prevent us from thinking clearly and logically about our future with these people based on their backgrounds or past actions.


Seems like point 1 is possibly just confounded by age. They've not divorced yet partly because they've not been married long enough yet.


Point 1 is saying that if you're younger when you get married and it's your first marriage, the marriage is less likely to end in divorce. This is the opposite of what you'd expect if getting divorced was just a matter of time, as younger people have more potential time in which to get divorced.


The trick is asking the right questions.

I know a couple that divorced because one of them didn't want to be monogamous anymore. They tried to make it work, first one way, then the other, but in the end they couldn't. Do you think the other one is more likely to cheat or file for divorce than someone who's never been married?

Children of divorced parents are more likely to get divorced, but I've never seen that statistic controlled for personal and cultural attitudes about divorce.

What I've read suggests that whether someone has experienced a healthy relationship is more predictive of relationship stability than whether they've experienced trauma.

A flawed heuristic may be better than no heuristic, but too much confidence in a flawed heuristic can backfire.

(Incidentally, most divorces happen after 8 years for both first and second marriages.[1] The 50% figure applies to first marriages as well.[2] That doesn't tell the whole story, because young adults now are divorcing less than young adults a generation ago. On the other hand, we don't know if that will remain true; divorce later in life has increased.)

[1] https://www.census.gov/prod/2011pubs/p70-125.pdf

[2] https://familyinequality.wordpress.com/2016/06/08/life-table...


> Yet why do some marriages last forever (till death do us part) while others fail miserably or crumble even after 20 years?

Well, you don't exactly "date" for years before getting a job. You're acting like people get married on a hunch, or that people don't work hard to present themselves as "marriageable." (Incidentally, it's also not clear to me that a 20-year marriage is a failure, and certainly not in this analogy)

In most of the places I've worked where there are short-term contracts before full-time hiring, the employer has a far better idea of the skills and quality of the candidate.


> In most of the places I've worked where there are short-term contracts before full-time hiring, the employer has a far better idea of the skills and quality of the candidate.

At the place I work we offer paid internships to college students who haven't graduated yet. By the end of a summer's worth of work, we have a really good idea as to which interns we would like to hire full time, and we give them an offer on the spot (contingent on graduation), no interview needed.


Contingent on graduation for ethical reasons, or because you still see the degree as a relevant signal on top of the insight gained from working with the candidate?


  instinct and sexism/racism are often conflated.

What's perceived as instinct is often the outcome of social conditioning, which gives outcomes such as "People See Black Men as Larger, More Threatening Than Same-Sized White Men" [1]

[1] http://www.apa.org/news/press/releases/2017/03/black-men-thr...


Could also be due to the disproportionately high rates of crime among black men. That should never be an excuse for prejudice, however.

https://ucr.fbi.gov/crime-in-the-u.s/2012/crime-in-the-u.s.-... https://infogr.am/Black-34991937313


I think instead of "instinct", "intuition" would be a better word. It is a gut feeling, but one developed from environment rather than innate.


If you ask someone to predict a coin toss, and use the methods we use today for interviews, you'll end up with a 50/50 chance at best. Yet some people guess the coin toss correctly. It must be some unmeasured "gut" that's doing a good job for those people, right?


I think there absolutely are people who can guess correctly, but how do you tell who they are and how many of them do you think are actually out there? I'd guess not many at all.


There are a huge number of coin tosses to guess when evaluating someone for marriage potential. Maybe fewer for a job candidate. But the candidate is not the coin being tossed. We're talking about the ability to estimate probabilities for a whole range of stuff that, combined, gives us a measure of confidence.


> I argue that it cannot be done consciously. It's a gut/instinct thing. If you have a mechanical approach, anyone can game the system and get a job because humans can be like chameleons to present themselves as the right candidate, and they can study for the interview.

Not sure why you think people can't game your gut instinct as well. There are a whole lot of bullshit artists out there, and bullshitting may or may not be a requisite skill for the job they're applying for.


It goes both ways. Companies are often not trustworthy and will fire people for their own reasons. They may not pay very well. There are so many things. "I am not Mr. Right, I am Mr. Right Now" accurately describes the business arrangement. It's a one-or-a-few-nights stand, not a lifetime commitment.


> It's a gut/instinct thing. If you have a mechanical approach, anyone can game the system and get a job because humans can be like chameleons to present themselves as the right candidate, and they can study for the interview. The only way IMO is to have that 3rd eye or whatever you call it... instinct, gut feeling, etc.

This is ridiculous. Mechanical approaches can be gamed, but so can gut instinct. You just have to look at the White House to see that. A lot of gut-instinct voters got hoodwinked by a skilled self-promoter. And he's a gut-instinct guy himself, so he's getting led around by whoever's got his ear.

Gut instinct, though, is worse in several ways. One is that it's not transmissible. As a company grows, how do you scale? Another is that it's not necessarily repeatable. Was that person bad, or is your gut off because you're tired, depressed, or upset? A third is that it can't be consciously improved. If your mechanical process has a flaw, your team can discuss it and come up with solutions. But if your gut judgment isn't good enough, what can you do?

I am a big fan of gut instinct as a component of a hiring process. Often, thanks to experience, we perceive things we have a hard time articulating, and it's wise to pay attention to that. But I think it's a giant mistake to treat our subconscious as some sort of mystic oracle that we must worship and never question.


> I argue that it cannot be done consciously. It's a gut/instinct thing. If you have a mechanical approach, anyone can game the system and get a job because humans can be like chameleons to present themselves as the right candidate, and they can study for the interview.

If it were only a gut/instinct thing, you would have no basis to reject a 'wrong action' other than it feeling bad to the gut.

At some point, the 'set' of circumstances will or will not intersect the 'set' of actions such that it is clear the person acted or didn't act 'correctly' according to one's hierarchy of values.

Ethics, philosophy, and theology exist for a reason - the fact that our society tends to ignore this, presuming rather than investigating the subtleties within a relative-individualist framework, doesn't make that framework correct, or even a very coherent set of doctrines with which to gauge things.

But yes, a 30-minute interview, or even a 1-week trial period, is not a very good means by which to judge character, since the potential reward is great enough, and the period to observe inconsistencies short enough, to allow a deviant personality to 'fool people' for the trial period.


Interviews happen at scale, dating doesn't. Once you find someone, the getting-to-know-you period before marriage is months / years of one-on-one time where you really can get to know them.

It's not the unstructuredness or the judging humans part that's the problem, it's the fact that this is all done at scale by people with more important things to be doing.

If companies all did like Apple in the early days and treated every hire as a bet-the-company decision, bad hires would go away.


Maybe the answer then is to push the decision down to someone for whom the hiring is a big decision, and who does not have to work at scale. An individual group manager has a lot of incentive to make a good hiring decision if the person being hired will be working in his group. And the manager does not have to hire so many people that he needs to consider it a large-scale project and adopt corresponding expediencies. Things can remain personal.

And if the organization for some reason needs to work at scale, add an initial lightweight sanity-checking and routing to the correct group. But keep the main hiring decision with the group the engineer would be working in.


I see some faulty reasoning there. Humans have billions of years of evolution in them that contributes to their mate-selection instinct. Notice that monkeys and birds also have good mate selection; it isn't a big-brained thing. Not sure how that has anything to do with selecting good engineers.


Mate selection in evolution is not at all geared at the same things. If anything, I think instincts from that actually cause a lot of the failed relationships rather than improving them, since evolution doesn't really care about you having a long happy marriage.


It cares about you staying together long enough to raise kids. Remember also that half of marriages end in divorce, so it's not that great.


Have good instincts or have a system (e.g. pass on the first ~37% of the candidates and then hire the next one better than all of them, fire early and often, etc).
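
That cutoff comes from the classic secretary problem, where skipping the first n/e (roughly 37%) of candidates and then taking the first one better than everyone seen so far is provably optimal for landing the single best candidate. A quick simulation sketch in Python (pool size and trial count are made up for illustration):

    import math
    import random

    def secretary_trial(n, skip_fraction):
        # Candidates arrive in random order; a higher score is better.
        scores = random.sample(range(n), n)
        k = int(n * skip_fraction)
        best_seen = max(scores[:k]) if k else -1
        for s in scores[k:]:
            if s > best_seen:
                return s == n - 1  # hired; was it the overall best?
        return False  # the best was in the skipped prefix; no one got hired

    n, trials = 100, 20000
    for frac in (0.25, 1 / math.e, 0.50):
        wins = sum(secretary_trial(n, frac) for _ in range(trials))
        print(f"skip {frac:.0%}: best hire {wins / trials:.1%} of the time")

The ~37% cutoff lands the best candidate about 37% of the time, which is as good as this kind of blind sequential hiring gets.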


The biggest problem with instinct/intuition is that you need to have enough relevant experience for it to work. Your intuition ends up being completely wrong when your brain doesn't have enough data to work with. When it does have enough data, it's an extremely powerful tool.

I think it would be cool if a company like Google published anonymized data about the correlation between interview "scores" and some post-hire metrics. My guess is the correlation would be poor to non-existent. Picking those metrics is difficult; perhaps an obvious one that is hard to game is how long the employee stayed with the company. Another interesting one would be anonymous peer evaluations, where it's guaranteed no one would see the data points, so it wouldn't suffer from the problems 360 reviews have.
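
If such data ever appeared, checking the guess would be straightforward (the numbers below are invented purely for illustration):

    import numpy as np

    # Hypothetical data: interview score vs. months of tenure for eight hires.
    interview_scores = np.array([3.2, 4.1, 2.8, 3.9, 4.5, 3.0, 3.7, 4.0])
    tenure_months    = np.array([14, 30, 9, 22, 8, 25, 11, 18])

    # Pearson correlation between the two series; a value near 0 would
    # match the "poor to non-existent" guess.
    r = np.corrcoef(interview_scores, tenure_months)[0, 1]
    print(f"Pearson r = {r:.2f}")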

I think the best you can do is some sort of mix between a more structured portion trying to gauge where the candidate is in terms of knowledge and intelligence, and a less structured portion where you try to get more of a "feel" for the candidate's personality. You have to realize that even under the best of circumstances it's not perfect. Rejecting someone who can't code at all for a software position should be relatively easy. Ability to tackle bigger things can be gauged by looking into previous projects and through references. I think there's a big region, though, where the outcome is difficult to determine.

I think back to the very first time I was involved in a hiring decision. The guy was very smart, technically capable, engineering degree; PhD material. Seemed pleasant enough. Got hired, and IMO definitely the kind of person a company like Google would hire with their process. He lost interest fairly quickly on the job. Working with him I found he had some very odd, not to say crazy, political opinions. Everything was too boring for him, so he didn't really get that much done. Couldn't really work independently at all. He left the job, left the country, and I think he ended up being a cab driver. Not sure.

Yet another example is someone fresh out of school with a CS masters degree who, despite all the help of the team, could simply not wrap his head around the project and become productive. On paper all the right credentials, but it was his first real-world job and he couldn't cope for some reason. Ended up leaving.

I've seen a few CS-background people just not find their place. I'm sure we've all seen situations where we wonder how some person got hired and how come they're still there. Over time I think I learned to do a better job hiring/interviewing and have had a stream of pretty successful hiring decisions.

It turns out that it takes many months on the job to really know the fit. Even if the person is capable, the specific job or team may not work out. Some people are very good at making it look like they're accomplishing things when they're not, really. There's no way any company has a secret sauce for only hiring "great" people, and even great people will do poorly under certain circumstances; conversely, not-so-great people can be very successful under the right circumstances. Some people can grow really quickly while others can't.


You're claiming that sexism/racism is not instinct and there is plenty of evidence that it is.


I think one issue is that interviews are often confused about whether they are checking credentials or checking for fit. That is, are they looking for people who will thrive in the environment, or are they checking that someone has the skills they claim?

Google's interviews may well boil down to "prove you're smart enough to work here at this smart company with smart people like me."

Other examples:

"Oversell yourself so you pressure yourself into high standards and thus end up working late often out of guilt"

"The job is easy but with a lot of specific untransferable domain knowleedge. Prove that you're a company man. "

"Most of us aren't sure what we are doing, but there's value in our companies general direction. Can you seek out help and thrive in such unknown conditions?"

These simplifications are for the most part "good" and are actually reflective of how the manager sees the world. Google is unique in that the direct manager doesn't have as much control, but the managerial spirit is a belief in giving smart people freedom.

That gets you in, but your immediate manager may end up not believing in this. Michael O. Church's story comes to mind. In some cases, the manager having no say in the process is bad, as there isn't a good fit.

A manager who believes in treating people like slaves and people who want to be treated like slaves for a specified amount of money are actually a good fit for each other. Someone who wants to be a cog but is given freedom is paralyzed by indecision, and vice versa: the person is stifled.

It's hard to see this in the US because there's a strong bias for the free/smart paradigm and all companies have to outwardly present this shared value. In China, though, "I'm just a code monkey" is said a lot, because despite having little to no say in hours worked or projects assigned, software pays much, much higher than other jobs. It's a deal they accept because there aren't any better ones. Or more specifically, because there are too many other people who will take the deal.

Only when a majority of people demand a base standard of life can you prevent a race to the bottom inflicted by employees themselves.

The key is to see through the game that companies are required to play (at least in the US), track down the exact team you want to be a part of, figure out the actual culture (having a taxonomy beforehand is useful), and then decide if it's for you (given your BATNAs).

Because of this game, all marketing about being great places to work is mostly BS, because that's the ONLY thing companies can say. I say mostly because the marketing is also the result of a real cultural shift in realizing how to effectively manage knowledge workers.

So Silicon Valley does in fact have a more progressive attitude to management styles, unlike the east coast, which is more about playbooks, but it's far, far less than what you would think. The majority of managers still subconsciously reject new team-management styles, and shitty interviews are a result of having preferred filters but NEEDING to dress them up in all sorts of convolution.


Such careless claims, written up as articles and thrown in people's faces, just give everyone a chance to debunk NY Times articles.


Once we get to the point where most people have several jobs with separate contracts, interviews become superfluous because you can just hire someone for a few hours at a time and then fire them. The only reason that doesn't work today is we're still clinging to the idea that you only work for one entity at a time. Never mind that most people already manage at least a couple bosses within the same company.


I've done a bunch of consulting and contract work, and that can be fine. But it's expensive, because to be continually finding new work, you must continuously invest in marketing, and every time you do something new there's a lot of switching cost. A company also needs a lot more supervisory capacity to be monitoring people who come and go, so it's not cheap for companies, either.

It's also limited. I can be much more productive when I do one thing full time than a bunch of things part time. Depth takes time, as does keeping up with changes. You just can't get as far with fractional attention as with serious focus.

And uncertainty is also uncomfortable. Many people just want to settle into a solid, reliable situation and do the work. They want to be able to plan their future with some confidence. Even if they can make more money juggling a variety of things, they'd rather make less and have less chaos.

So I don't think we'll ever get to the point you suggest.


You both have a good point.

I too quit freelancing for a stable remote job, but looking back at it 18 months later, I found out that I sucked a lot at managing my freelancing gigs.

So IMO, with some good contract and budget management -- and confidence, and actually having a choice -- you can reap most of the benefits of freelancing with almost none of the drawbacks, if you can take the thought of switching customers every 3-6 months.

I understand not everybody can pull this off -- I am not sure I am yet ready to do this that well. But I've seen people doing it very successfully and almost stress-free.


You just need an agent. Agents have not been "eaten by software" yet, so they have no way to take clients making less than 6 figures of revenue. When that software comes, the long tail of contractors will get access to agent services, and we'll go from 80/20 salary/contractor to 20/80.


It took months to become familiar with the existing tech at the company I'm currently working for. I'm still learning their internals.


> not one interviewer reported noticing that he or she was conducting a random interview. More striking still, the students who conducted random interviews rated the degree to which they “got to know” the interviewee slightly higher on average

Yeah, well, when you're asking questions of someone who looks thoughtful very briefly and then answers almost immediately, it makes sense that you might feel you "got to know" them better than you would someone actually considering their answer.

That might be introducing a confound or two that the authors then proceed to completely ignore, and even draw conclusions past, lest someone accidentally reach other conclusions.



