Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate), and, as is often the case, the sample size here is quite small (76 students, split across 3 groups) for predicting something as noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.
Furthermore, some claims that make it into the piece are at odds with the data:
> Strikingly, not one interviewer reported noticing that he or she was conducting a random interview. More striking still, the students who conducted random interviews rated the degree to which they “got to know” the interviewee slightly higher on average than those who conducted honest interviews.
Yet Table 3 in the paper shows there is no statistical evidence for this claim; the effects are swamped by the variance.
My point is not that this article is wrong; verifying/debunking the claims would take much more time than my quick glance. But that ought to be the responsibility of the newspaper, and not individual readers.
Politicians don’t get to write about the successes of their own policies. While there is a difference between researchers and politicians, I think we ought to be a bit more critical.
So obviously, the title of this article should be "Maybe interviewing is not that useful" instead of "The utter uselessness of job interviews", but beyond that I find your comment unjustified.
In fact, it's quite the opposite: I believe this type of work has contributed to making us more critical by questioning some basic facts about interviewing that I would never have questioned just a couple of years ago.
> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate)
OK, this is interesting. Where is it mentioned?
Thinking, Fast and Slow. There is this short article which mentions some of the results and has been discussed on HN a couple of times already.
It's a lot harder to present a better way of predicting candidate performance in the workplace, along with substantial data that indicates it's better than today's methods. Corporations would love more effective ways to determine effectiveness/performance before hiring.
Interviewing is terrible, but that doesn't mean there is a better option.
This is one of the biggest problems I see with business guys today. They want absolute certainty in a world which can't offer it. Start accepting that there is a lot of stuff we don't know and can't know at the moment and we simply have to work towards getting better.
You can't get better if you don't acknowledge that there is a problem in the first place.
These silly "must have 5+ years of experience" and "check all these technology boxes" requirements clearly show the industry is completely lost at the moment and isn't willing to acknowledge it.
There are a bunch of other testing methods that are illegal in the USA too, such as IQ tests.
This is as it should be—I'm paid to solve problems.
Why is this different?
The more I read about interviewing, the more I realize too many people think they have this problem solved: their amateur psychology is impeccable and their technical screens test for exactly the right things, no more and no less. Did they do a bunch of controlled studies to convince themselves of this, or are they mistaking sounding good, or intuition about the statistical outcomes of different techniques, for truth?
Maybe the first step is to collectively realize we have close to no clue what we're doing, and are being asked to solve a hard problem: individually, to talk to someone for an hour and make a hiring recommendation. In aggregate, to make the decision based on a handful of these one-hour conversations.
Maybe the first step is to realize this is a problem worth trying to solve.
Maybe the first step is vocal non-acceptance.
Which is bullshit. It's perfectly reasonable to criticise something without proposing an alternative. It's especially ridiculous to reject criticism provided without alternatives, when it's literally your job to do the work being criticised.
"Hey the way you're doing this part of your job produces results no better than random selection."
"Bring me solutions, not problems!"
Yes, yes... of course I know this could be gamed as well, but no matter: you can't really argue that this wouldn't be magnitudes better than the typical current/broken interview process.
Also, not all jobs/codebases lend themselves to being productive in 2 weeks, I'd argue they should be, but they aren't.
I switched roles to a new team, and the first 2 weeks were a trainwreck. They had almost no predictive value for how I would do.
Now, there are many reasons why that was the case, and perhaps those underlying issues should be addressed, but from all the role changes I've had, the first two weeks shows how well the group you're going into can onboard, more than it shows how productive the individual will be in the long term.
It works well enough.
And I bet almost everyone reading this has worked with at least one person they think shouldn't have survived the interview, and that person was making a boatload because they convinced the boss they're brilliant. Meanwhile 90% of their day was spent talking about how great they are, and 10% creating new bugs, and no one dared say anything because the thought of them being more "productive" was horrifying.
'It' mostly successfully matches employees to employers, but the quality of those matches may vary wildly.
Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.
End of the day, everything evens out. People add the structure they need when they hire people. If your engineering interview process for some detail oriented gig is buzzword trivia with the CIO, the company will probably tank anyway. Conversely, if you do some nerd-fest whiteboard interview for a CTO in a bigger organization, you're probably not getting the right outcome either.
And this is a near universal phenomenon. Almost everyone wants to "get to know the candidate."
Everyone hates the process, and companies have invested major dollars and hours trying to improve it. At the end of the day, little has changed since 1917. You either acqui-hire, get a strong referral, or interview a pool of unknown applicants.
The tests and quizzes are little different to how a city hired an accountant in 1917. The old boys network evolved. Then you're left with the rest.
> Corporations would love more effective ways to determine effectiveness/performance before hiring.
This is irrelevant to evaluating the current methods. Even if there is no replacement, if they are useless, we should know it, full stop.
Leeches are often used in literal modern hospitals in the developed world as an effective treatment for certain ills.
If we had no alternative, we should stick with it, but look at other things in the meantime. You seem to be advocating doing nothing at all, which is trivially easy to show works for no one.
You could make the argument that algorithms tend to be studied more by smarter people, but if that's what you're going for you may as well ask them about their hobbies, and hire the person that is into playing chess, or doing astronomy (or whatever intellectual pursuit you care to name).
If on the other hand you are interested in a person's ability to code, ask them to do so. The last time I had to hire someone, I wrote a small application with one module that was deliberately written in an obfuscated style. I asked candidates to bring that module under control - rewrite it in a readable code style. To do this, successful candidates needed to identify what the current code was doing by examining the public interfaces in a debugger, documenting what the calls seemed to do, prepare unit tests, and then rewrite the module in a readable style. It took about a day for most candidates to do.
At the end of that, you get to see a candidate's ability to read code, use a debugger, write unit tests, write documentation, and write well structured code, which is a pretty good coverage of the typical tasks in a developer's day. I feel this gives a much more realistic assessment of a candidate's capabilities that asking questions about a more or less randomly chosen algorithm.
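A hypothetical miniature of this kind of exercise (not the actual module described above, which isn't shown): a deliberately obfuscated function, followed by the kind of readable rewrite a successful candidate might produce. All names here are invented for illustration.

```python
# Hypothetical "before": terse names, no docs, behavior must be reverse-engineered.
def f(a, b=None):
    b = b or {}
    for x in a:
        b[x[0]] = b.get(x[0], 0) + x[1]
    return sorted(b.items(), key=lambda p: -p[1])


# Hypothetical "after": same behavior, but the intent is now legible and testable.
from collections import defaultdict

def total_scores_by_key(pairs):
    """Sum the value of each (key, value) pair per key, returning
    (key, total) tuples sorted by total in descending order."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return sorted(totals.items(), key=lambda item: item[1], reverse=True)
```

The unit tests a candidate writes along the way double as the documentation of what the original code "seemed to do."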
This is an issue as well. If you aren't google then a day is too much investment for a single job opportunity, especially if you're already employed.
When I have to give an interview and have to include an algorithms section I make candidates type code. Whether that's on a phone screen with a shared online text editor or in person with their laptop / an interview laptop, I want them to type stuff, not just rely on whiteboard pseudo-code and diagramming. As a vim user I discount that their editing environment may not be what they're used to but even if I was forced to use Notepad I could still bang out a function to test the even/oddness of a number (my own fizzbuzz) pretty quickly. So I at least make sure to test coding, even if poorly.
I agree work-sample tests are the best, but as another commenter noted if they take a lot of time for the applicant you're going to get people who refuse that just as some refuse to play the algorithms game. Especially if people have a github repo, especially if some of the projects they've worked on have had more than themselves as commiters, especially if they're currently employed as a developer at some other company that does general software. Unless you're trying to build a top team, which most projects don't need, you're wasting a lot of time trying to rank beyond "would work out ok" and "would not work out at all". I have a section in my phone screen that tests for regex knowledge, I'm primarily just testing to see if they know the concept or if when faced with a problem that regexes can solve (which actually does happen from time to time) they reach for writing some custom parser or not. If they vaguely remember there's a way to specify a pattern and find matches, that's a Pass. If they know grep/their language of choice's regex syntax and can give a full solution, great, I'll rank them slightly higher than someone who just knows the concept, but all I really care about is the concept. If they don't know the concept, that's a strong sign (to me) they won't work out.
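To make the regex point concrete, here's a sketch of the distinction being screened for, using an invented task (pull an ISO-style date out of a log line). The log line and both solutions are hypothetical, not from the actual phone screen.

```python
import re

log_line = "2017-06-12 ERROR disk full on /dev/sda1"

# The "knows the concept" answer: one pattern, done.
date = re.search(r"\d{4}-\d{2}-\d{2}", log_line).group()

# The "reaches for a custom parser" answer: more code, more edge cases to get wrong.
def find_date(text):
    for i in range(len(text) - 9):
        chunk = text[i:i + 10]
        if (chunk[4] == "-" and chunk[7] == "-"
                and all(c.isdigit() for c in chunk[:4] + chunk[5:7] + chunk[8:10])):
            return chunk
    return None
```

Both work, but the second answer is the signal the parent comment is watching for: not remembering that a pattern language exists for exactly this job.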
I tried to do a semi work sample test with an intern candidate a few months ago instead of a different test, based on experience with a prior intern who struggled on something I thought was basic and left me wondering why I didn't catch that in the phone screen. Basically I gave them some stripped down code from several files that looks a lot like what we have in production (JS, using Backbone) explained the overall mapping from bits of code to what could be shown on the screen, and essentially asked them to add a new component (already written) to one part of the screen by modifying/filling-in-the-functions in a few places. It required them to read and understand some alien code, see what they can ignore, understand what was asked, and then do it (initialize something, pass it around, up to I think 3 indirect function calls of nesting, call a couple things on it). The candidate got through it, I'm not sure the old intern would have...
Why g Matters: The Complexity of Everyday Life
Personnel selection research provides much evidence that intelligence (g) is an important predictor of performance in training and on the job, especially in higher level work. This article provides evidence that g has pervasive utility in work settings because it is essentially the ability to deal with cognitive complexity, in particular, with complex information processing. The more complex a work task, the greater the advantages that higher g confers in performing it well. Everyday tasks, like job duties, also differ in their level of complexity. The importance of intelligence therefore differs systematically across different arenas of social life as well as economic endeavor. Data from the National Adult Literacy Survey are used to show how higher levels of cognitive ability systematically improve individuals’ odds of dealing successfully with the ordinary demands of modern life (such as banking, using maps and transportation schedules, reading and understanding forms, interpreting news articles). These and other data are summarized to illustrate how the advantages of higher g, even when they are small, cumulate to affect the overall life chances of individuals at different ranges of the IQ bell curve. The article concludes by suggesting ways to reduce the risks for low-IQ individuals of being left behind by an increasingly complex postindustrial economy.
Algorithmic tests have a pretty low bar to clear in IT jobs. Using them for secretaries or lab technicians is another story.
The problem with comparing IQ results at hire to employment success is that employment outcome is difficult to define over time. You're also unlikely to get statistically relevant data without focusing on large organizations with standardized HR processes. Most of the research is based on supervisory evaluations, which are not the most reliable indicators of anything for a variety of reasons.
The other thing I find amusing is that business folk who talk about this miss the fact that there are large workforces in the US that either have used or still use standardized testing like this to hire and promote. Those are government bureaucracies, which function relatively well, but are hardly a model that most folks advocating this would aspire to.
https://mobile.nytimes.com/2015/08/28/science/many-social-sc... among other places
I'm all for upending the status quo with interviews, but let's not throw out science and reporting just to get there.
In particular, I was disappointed to find a (short) paragraph in the article that I find bogus. That does not mean the article shouldn't have been posted in the first place, but just that this paragraph should have been edited or removed.
I think there is something wrong when I feel like I have to look up the actual research paper and check whether the claims made in an article are supported by data and methodology. I should not have to be a skeptic when reading New York Times articles.
To be fair, it is posted in the opinion section, but should we really just take this article as an opinion? That doesn't feel right to me either.
Then, onto your last question and Daniel Kahneman: we can talk about that for a long time, but let me keep it short. The best place I know (though technical) is the blog by Andrew Gelman (this turned up in a 5-second Google, but there is way more on his blog), and Daniel Kahneman himself has "admitted" flaws in his studies.
Even the "hard" sciences have trouble, because journals prefer publishing new and positive results, rather than replications or negatives.
People ask questions in interviews that they want to know the answer to and that could go either way. All questions have equivalent expected surprise: either both answers are unsurprising, or one answer is surprising but you think you already know the answer is the other one.
If interviewers were asking questions like "is 2+2=4" they would have detected random interviewers way easier, but they wouldn't be trying very hard to get to know the person.
As for getting to know the person, the more surprising someone's answers are, if you believe they are telling the truth, the more distinguishes them from "average person who gave answers I expected", so you say you "got to know" them. This is unsurprising.
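One way to make "expected surprise" precise, a sketch that assumes we model it as the Shannon entropy of the interviewer's prior over the two answers (an assumption of mine, not something the study defines):

```python
import math

def expected_surprise(p_yes):
    """Entropy (in bits) of a yes/no question, given the interviewer's
    prior probability p_yes that the answer will be 'yes'."""
    entropy = 0.0
    for p in (p_yes, 1.0 - p_yes):
        if p > 0:
            entropy -= p * math.log2(p)
    return entropy

# "Is 2+2=4?" -- the interviewer is nearly certain, so expected surprise
# is close to zero, while a genuine coin-flip question maximizes it.
near_certain = expected_surprise(0.999)
coin_flip = expected_surprise(0.5)
```

Under this model, questions the interviewer thinks they can already answer carry little expected surprise, which is why random answers slip by so easily.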
This isn't to defend unstructured interviews, other studies for a long time have shown them to be worse than structured interviews and test scores. If I had to guess the only reason the research in the article got published as novel was the random interview part.
Edit: Here's a table from a meta-analysis of lots of studies on correlation between different factors and job performance: http://imgur.com/a/YRFTh. Basically unstructured interviews aren't as good as structured ones, but they are better than nothing. Work samples are the best, structured interviews and IQ tests tie for second.
Note that this meta-analysis is combining a bunch of different fields to yield general observations, a specific field may have different results. But in expectation for a randomly selected field these are fairly solid results, and I don't expect fields vary from them too much.
Politicians boast about successes constantly. I don't understand what you mean by "get to".
Why is that not the case here, where a professor is allowed to sell his own work? It is as if Obama were the NYT reporter covering Obamacare.
That people believe politicians blindly is a topic for another day :)
But what I find far more important in the interview process is involving current team members in the process of selecting new coworkers. It's one way of getting teams to be bought into a feeling of shared purpose and is the first step in establishing a working relationship. If you don't give at least some of the current team a role in the hiring process, teams will feel imposed upon by those hiring and won't be as understanding about flaws in those added to the teams.
Focusing on selecting the "right candidate" is really myopic in a situation where there are likely many right candidates. We should, instead, be focusing on not selecting a wrong candidate and fostering the right team dynamic. We already know that strong teams significantly outperform strong individual performers who don't cooperate. Yet hiring still seems focused on optimizing for strong individual contributions. And I've yet to see a study that looks at the flaws of the interview process in building ineffective teams.
We know that interviews are selecting a right person some percentage of the time. That percentage will never be 0 or 100. We will always have to accept some bad hires.
I would rather that successful hire percentage be somewhat lower and keep the team engaged in defining culture and hiring standards than to give up team involvement for a higher right-person rate.
My main point is that the people studying this issue and arguing against the interview process tend to only look at less than half of the benefit of the interview process. If we're going to ditch the interview process, whatever replaces it needs to have the same property of involving the existing team or it is, to my mind, automatically worse than what we have now. Because while we're all aware of how flawed the interview process can be, it does work to some extent. When I was hiring, only 2 out of the 50 or so that I hired didn't work out. Would I like to have avoided hiring one or both of them? Sure. Am I willing to sacrifice the team involvement benefits to got to do so? Absolutely not.
In fact there is an awful lot of wrong stuff society keeps doing despite evidence over many years suggesting it is stupid. The stupidity of high CEO salaries, for example, has been demonstrated quite well, yet business keeps a blind faith in it. Rewarding employees in complicated jobs on extremely narrow metrics has also proven counterproductive, yet the MBA crowd refuses to believe it doesn't work.
Business practices often seem to be more like religion than founded on reality. It is a GOOD thing that some researchers are trying to do their bit to correct this flawed picture.
Are you suggesting anything out of the ordinary is going on here? Doesn't one have to present their work as legitimate and wait for feedback? So long as they are open to that feedback, I don't really get this argument.
> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups), predicting something relatively noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.
So present that back to them as a refutation? Are you waiting for someone else to do it? Why are you debating the reliability here?
They're predictive, but not very.
Structured interviews are better, but not really by much.
The problem with all of these things is they work, but not very well, and people make too much of any one thing. An interview is a very small slice of behavior, even if it's structured very well.
People make too much of them.
For a quick reference, the two defining criteria for a structured interview are:
1.) They use one or several consistent set(s) of questions, and
2.) There are clear criteria for assessing responses
That second point is really important. Asking every candidate the same set of questions is not, by itself, a structured process: you also need to understand what a "one-star" response vs. a "five-star" response actually looks or sounds like. Training and calibrating all of the interviewers in a large company around a similar rating system is nightmarish, so most companies don't bother.
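A minimal sketch of what "clear criteria for assessing responses" might look like in practice. The rubric, behavior labels, and question topic below are all invented for illustration, not taken from the book:

```python
# Hypothetical anchored rating scale for one structured-interview question,
# e.g. "Walk me through how you debugged a recent production issue."
RUBRIC = {
    1: "Cannot describe a concrete approach; guesses or bluffs.",
    3: "Describes a reasonable approach (reproduce, isolate, bisect) "
       "but skips verifying the fix.",
    5: "Systematic approach plus verification and a regression test, "
       "grounded in a concrete past example.",
}

def score_response(observed_behaviors):
    """Map interviewer-observed behaviors to the highest anchor met,
    so two calibrated interviewers land on the same number."""
    if {"verified_fix", "concrete_example"} <= observed_behaviors:
        return 5
    if "systematic_approach" in observed_behaviors:
        return 3
    return 1
```

The point of the anchors is exactly the calibration problem mentioned above: without them, "same questions" still yields incomparable gut-feel ratings.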
The book also outlines that pairing a work sample with a structured interview is one of the most accurate methods of hiring.
If anyone is interested in some in-depth structured interview questions or work sample ideas, feel free to email me. I've spent the last few years working on a company in the interviewing space and would love to chat.
Which is how it is typically presented, because it sounds much better than "reject a lot of candidates who would probably have worked out just fine". It is useful to perceive both the potential value in an approach like this and the shortcomings. Google can absorb the massive expense in man hours, lost opportunity, etc. that comes with trying to craft genuinely predictive interview processes, but a lot of the companies trying to emulate them can't. Too often, interviewees don't realize a process of this sort is stacked against them, and interviewers don't appreciate the negatives of adopting a still-nascent approach that sounds more reliable simply because it is quantitative, or assume that since Google does it, it must work.
It's not like Google pays the best or still has the best workplace. It's a large company with large-company politics and red tape.
I've met a few such people in this forum. Not many.
I'm not sure how one would even begin getting a rigorous estimate of that number. What is a credible sample of "great candidates" in this industry?
That seems to be an unproven assumption, and quite likely to be a wrong assumption. For example if good people turn out to be less interested in "honing their interview skills", adding parasitic noise to the signal.
This gets tossed around as a truism. I'm curious, does anyone have any evidence for it? Call me a skeptic, but these kinds of "everyone knows" truths are often wrong.
Google and other such companies have a vested interest in getting hiring right. They also have the wherewithal to conduct studies, collect data, and let the evidence guide their hiring practices. Google in particular has shown a willingness to completely overhaul their practices by eliminating ineffective practices (remember their reputation for "thought puzzle" type questions?).
So I'm curious if you have anything to back up the idea that they're doing it all wrong.
I know it from my own experience and that of many others who have been through the gauntlet. Take it for what it's worth, I'm not selling you anything. I don't look impressive on the whiteboard, but I do have a rather impressive track record. Something doesn't line up. :-)
FWIW, as far as I recall there was another experiment at Google where they tried to establish correlation between interview performance and job performance, and as far as I recall, there was no meaningful correlation. This, of course, is not fully representative, because it does not include poor whiteboard performers.
This is not data.
These were their conclusions:
1. The ability to hire well is random. This is referring to individuals, not the system as a whole.
2. Forget brain-teasers. Focus on behavioral questions in interviews, rather than hypotheticals
3. Consistency matters for leaders
4. Grades don’t predict anything about who is going to be a successful employee. School grades, that is.
So, stop making stuff up from behind your throwaway account.
Google is collecting and analysing data to improve its hiring process... not to improve the hiring process of the industry at large.
There is an effectively limitless supply of great engineers who will jump through hoops to work for Google.
That's just not true for the vast majority of the industry.
If this were true, Google would have crashed and burned a long time ago.
Obviously, their interview process selects for much more versatile engineers than that. Engineers who not only produce reliable and maintainable code, but who can actually come up with products that generate billions of dollars over the years.
Actually, Google pays under the average of top companies because, as a top-tier company, they can afford to. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.
> A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates.
Their pay and prestige will attract even more bad candidates.
How do you separate good from bad candidates?
That's right: a kick-ass interview process.
Compared to who? They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels. You might be correct that certain other companies pay more than Google (Netflix maybe?), but they're certainly above average.
Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.
If one's goal is to maximize compensation (particularly in the short term), a Google offer is better used to get a higher paying offer at one of their competitors.
I don't think that's true. While google is by all accounts (including in my personal experience) unwilling to move significantly on base salary, they'll happily match pretty much any offer with stock from my experience (and the experience of others I've talked to).
Perhaps I dealt with a particularly nasty Google recruiter. I felt like the recruiter had misrepresented the health benefits and relocation package once I got the actual offer letter and related paperwork.
That said, from what I've seen, compensation growth at Google is faster than at the other companies, which means that someone coming in at L>3 will likely end up with greater compensation at Google than elsewhere.
I'm curious as to how they misrepresented things. I was actually pleasantly surprised once I got here by how extensive the benefits were, but I'm always interested in learning more, since while I actually think that 4 google interviews is a decent way to judge someone for google, I really hate their interview/negotiation process.
How many of their projects succeed simply because they are google? Some major ones (like android) come to mind.
If there are consistently used questions and specific criteria for assessing responses, can a candidate just learn the likely questions and what constitutes the "right" answer?
The reality is that generating good questions for a structured interview is difficult. You can't just pose a programming problem. As you've noted, most programming problems have multiple good answers. Differentiating between multiple good answers from different candidates re-introduces subjectivity. Your brain would rather convince you that one valid answer is less good than another, even when it's not, than admit to you that it can't differentiate or generate a narrative for you. The part of your brain that generates narratives is incredibly powerful and does not care about how accurate your hiring process ends up being.
What we tried to do was create questions that generated lists of facts. "Spot all the errors in this code" would be an example of this approach (but none of the three we used). We went into the process wanting to embrace epistemological uncertainty, generating a historical trail of data that we could retrofit to candidate performance.
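A hypothetical miniature of a "spot all the errors in this code" question (explicitly not one of the three the parent used); each defect is one checkable fact for the candidate's list:

```python
# Hypothetical buggy snippet handed to candidates.

def average(numbers):
    total = 0
    for n in numbers:
        total += n
    # Fact 1: raises ZeroDivisionError on an empty list.
    return total / len(numbers)

def add_item(item, items=[]):
    # Fact 2: the mutable default argument is shared across calls,
    # so items from earlier calls leak into later ones.
    items.append(item)
    return items
```

Each finding is either on the answer key or it isn't, which is what makes this style of question generate a comparable "list of facts" rather than a judgment call.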
In the end, work sample testing was so much more powerful a predictor for ourselves that we never fully got around to analyzing the data. Sometimes we'd get candidates that clearly generated inferior "lists of facts"; I think there may have been 1-2 instances where that outcome actually overruled work-sample testing delivered prior (out of a few tens of engineering hires and probably ~100 interviews).
I wouldn't breach such a contract nor should you hire someone that would be willing to do so. The sole exception being in the cases where it would be the best for the public interest, i.e. whistle-blowing.
Is anyone though?
For most questions, there's no "right" answer, but there are a set of points that the interviewer wants to see you touch on. For example, they might first want to see that you can code up a naive brute-force variant of the algorithm, checking whether you know the programming language claimed and can think through the problem, and ask you the algorithmic complexity. Then they'll want to see if you can get a divide-and-conquer or dynamic programming variant with lower time complexity. Then they might ask "What if it has to be an online algorithm, where new input arrives before the computation finishes?" Then they'll ask "How would you distribute this over 1000 machines, and what are the failure modes?"
At each stage, they're watching how you answer, and where you get stuck. If you ask clarifying questions or spend time to think before diving into coding, that's a plus. If you have never heard of the problem before (this is frequent - many questions are not in textbooks), they want to see how you would reason through it, and break it down into subproblems that are similar to textbook problems. If you miss language trivia, most people don't care; when I did interviews I'd usually volunteer the answer if they missed some API call, and when I interviewed my interviewers did the same. If you don't know how to solve the problem and can't make any effort to move forward through a solution, that's a big negative. Similarly if you don't know what the concept of big-O is or why it's important.
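As a sketch of that brute-force-then-improve ladder, take maximum subarray sum as a hypothetical question (chosen by me for illustration, not named in the thread):

```python
def max_subarray_brute(xs):
    """O(n^2) first pass: try every contiguous slice."""
    best = xs[0]
    for i in range(len(xs)):
        total = 0
        for j in range(i, len(xs)):
            total += xs[j]
            best = max(best, total)
    return best

def max_subarray_online(xs):
    """O(n) follow-up (Kadane's algorithm). It also answers the
    'online' twist, since it consumes one element at a time."""
    best = current = xs[0]
    for x in xs[1:]:
        current = max(x, current + x)
        best = max(best, current)
    return best
```

The distributed-systems follow-up then probes whether the candidate sees that per-machine partial results (best, prefix, suffix, total sums) can be merged, and what happens when a machine fails mid-computation.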
But the real magic comes when they run out. What do they do then? Good people say that they don't know. Often, they will take guesses at a few more based on general principles. E.g., "I'm sure there are arguments for sorting, but I only use top for that." Or, "There must be more ways to filter, but I only do it by user or command." They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things. But what they won't do is bluff or make things up.
And honestly, that's my biggest tip for hiring: find people who like understanding things and know a good amount for their level, but are willing to say when they don't know. People who can't do that are dangerous.
I don't score it by how many option-letters people know. I score it by how well they demonstrate familiarity with a basic ops activity, which is finding out what's running.
Even if somebody said they never used ps, or like below, only use one incantation, that isn't a problem, because my follow-up there would be to ask how you'd find out what's using all the RAM or what's using a lot of CPU.
Maybe they're just used to using different tools, in which case I can ask for details for those tools. (And then check manually later to make sure I'm not getting snowed.) But if they've done a bunch of ops stuff, they'll demonstrate the sort of know-from-doing level of familiarity with something.
In theory I could just ask the broader question to start. But some people are good talkers, and I'm looking for evidence that people are good doers. All that said, I've shifted mainly to more experiential interview processes, as actually doing is the best way to see if somebody's a doer.
To me any question where the obvious answer is "I can trivially look all that up in the man pages" is a gigantic red flag that I don't want to work at that place.
If I'm feeling curious that day, I might begin a conversation about what you think you are testing for with that question, as I've spent a fair bit of my career studying hiring pipelines. But if I hadn't had my coffee yet or were irritated or something I'd probably just ask to talk to someone else.
That said, as I've explained elsewhere in the thread, the point isn't to see how many they know. It's to see that a) they know some portion of it that people doing the work would know, and b) have reactions to the rest that indicate useful work habits and attitudes.
A perfectly fine first answer is "I only use ps -aux. I'd just look in the man page if I needed anything else." Because then we could have a good discussion of how they use that output to figure things out, what sort of things they'd expect to see in the man page, and what other tools they use to see what's going on.
I know you're not asking people to recite "ps" flags from memory. The problem is, you're looking for a subjective "X-factor" in how someone answers a trivia question.
I used to really like this approach too. I'd think of things that you'd only know if you'd actually done the kind of work I'd done. I had some favorite interview questions: "what are some of the functions you carry around with you from project to project in your 'libyou.a' library", and "where would you start looking if you had to debug a program that segfaulted in malloc"? I think the logic I was using is the same as the logic you're using here.
Ultimately, I think it's a bad approach. Equally strong candidates look at their work through different lenses and find different things memorable. Far more important to me, though, is that some people really suck at interviewing --- and, what motivates me more, some people just interview way better than they should. I've succumbed to that repeatedly as a hiring manager, and I think I'm an example of the phenomenon myself.
The other problem with this question is that, by its phrasing, it implies that there is a right answer: if I can name more ps flags than another candidate, I'm a better candidate. Which, when put that way, I think points out its faults. Maybe you don't intend for that to be true, but you'd have to fight your own cognitive biases pretty hard not to at least bias towards the person who can name 17 flags off the top of their head, or the one who teaches you a new flag combo you didn't know. Even though those things are likely not very good predictors of a good candidate.
You've said what you really want to get at is whether they can admit when they don't know something. If that is important to your hiring process (and I'd encourage you to validate that with data), ask them a question no one could know the answer to. Ask the same question of every candidate, and grade it purely pass/fail: did they say "I don't know" or not?
>> admit when they don't know something... encourage you to validate that with data
My experience is that people that won't admit they don't know something are not necessarily bad employees. Often they are capable employees, although this is a trait I rarely see in the best employees. However, I find they are toxic to a good work environment, since they won't listen to people who do know something.
They do seem to get promoted to management at a much higher rate :)
So the process is, in fact, designed to make people feel uncomfortable, for a little bit (or at least you're definitely aware that this is a frequently occurring side effect).
Does that not suggest to you that there may be a negative tradeoff at play here? (To wit: yes, you do manage to efficiently extract a few bits of information from them... but at the cost of having them feel like they're being, well... interrogated. And already "losing points" by that point in the conversation with you).
I had a PM friend who would try to push the bounds of a candidate's knowledge until they gave up and said "I don't know". The actual knowledge wasn't the point of the question: it was that they could say "I don't know", because PMs who can't tend to make life miserable for the engineers & other teammates who work with them.
And I agree with your PM friend; I think willingness to admit ignorance is even more valuable there.
But no, I don't think they walk away feeling interrogated. My style is pretty conversational, and when people don't have all the answers, I definitely work to make them feel comfortable with that. E.g., for this question I'd close with something like, "Of course nobody knows all the arguments to ls. I sure don't. So you did fine."
Try it next time you're mingling with ops people, say in a bar at a conference. Don't look at what they answer; look at how they answer. Some people are comfortable not knowing. Some people are excited to discover the ones they forgot they knew. Some people get curious and eager to fill the hole they've just noticed in their knowledge. And some people get defensive and peevish.
Again, this is one of those, "Here's what I've got cached, and here's where I'd look up what I don't know offhand" questions.
From the ps manpage:
Note that "ps -aux" is distinct from "ps aux". The POSIX and UNIX standards require that "ps -aux" print all processes owned by a user named "x", as well as printing all processes that would be selected by the -a option. If the user named "x" does not exist, this ps may interpret the command as "ps aux" instead and print a warning. This behavior is intended to aid in transitioning old scripts and habits. It is fragile, subject to change, and thus should not be relied upon.
/actually I'd probably hit up the man page first...
Since the 'ps -o' flags are only useful in that one command, I just look them up in the (rare) situation where they seem the best alternative. Having been at this for decades, I try to reserve my limited memory capacity for factoids that will pay off, and being able to process columnar data generically at an instant seems more useful than knowing a special way to do it with that one command.
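To sketch what I mean by processing columnar data generically: instead of memorizing `ps -o` selections, pipe the default output through the usual text tools. (The column numbers below assume the standard `ps aux` layout, where %MEM is field 4 and the command is field 11.)

```shell
# Five biggest memory consumers, without any ps-specific sort flag
ps aux | awk 'NR > 1 { print $4, $11 }' | sort -rn | head -5

# The same trick works on any whitespace-separated columns, not just ps
printf 'a 1\nb 3\nc 2\n' | sort -k2 -rn | head -1   # prints: b 3
```

One awk/sort idiom covers dozens of commands' output, which is the payoff of remembering the generic tool instead of each command's flags.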
Not sure how well this works in practice though.
They were. I was asked a ton of this kind of Linux/Unix trivia in one session. As well as just random stuff. For example:
As a result of having read this story, I can now always instantly remember what 2^24 is too :-).
So 1000 * 1000 * 16 = 16 million would be an easy estimate to make.
If you are designing a system and want to have an idea of how much space or memory an approach will take, being good at this kind of math can be really helpful.
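The powers-of-two shortcut is easy to sanity-check in a shell, since 2^10 = 1024 ≈ 1000:

```shell
# 2^24 = 2^10 * 2^10 * 2^4, so roughly 1000 * 1000 * 16
echo $((1 << 24))           # exact: 16777216, i.e. ~16.8 million
echo $((1000 * 1000 * 16))  # the quick estimate: 16000000
```

The estimate is low by about 5%, which is almost always fine for capacity planning.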
(I mean, I get it, this kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.)
To be fair to Google (in regard to this one filter question only), they actually are up front about expecting a significantly higher baseline of general mathematical awareness (even for non-mathy roles) than most other shops. At any rate, this kind of estimation comes up in capacity planning (both in architecture planning, and for algorithms on a single box) all the time. Which after all is what Google does, at nearly every level of their stack, (nearly) all the time.
So, no, still not useful questions.
My opinion is that big-O often ends up being used as a premature optimization effort hindering "just get the right answer first". Maybe that brute force method will run at a reasonable clock-time cost, despite having egregious algorithmic cost. You won't know if you're busy overengineering and optimizing before you even have a working solution.
The reason is because when your dataset is in the petabytes, any algorithm bigger than O(N log N) is not going to terminate. You're not going to get any answer at all; your MapReduce is going to sit there burning CPU time for a day, and then you'll kill it, and you won't have any idea what went wrong. (This is learned from experience, if you couldn't tell.)
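Back-of-the-envelope, with awk as a calculator (taking 1e15 records as a stand-in for "petabytes"):

```shell
# Rough operation counts at n = 1e15
awk 'BEGIN {
    n = 1e15
    printf "n log2 n: %.1e ops\n", n * log(n) / log(2)  # ~5e16: feasible on a big cluster
    printf "n^2:      %.1e ops\n", n * n                # 1e30: will never terminate
}'
```

Thirteen orders of magnitude is the difference between an expensive MapReduce and one you kill after a day.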
In a startup context, I'd certainly agree with you - that's exactly the approach I'm taking with my startup, where I'm doing a lot of the debugging and code-wrangling on my local laptop after verifying that the data I want exists and calculating its size with a couple cloud-computing passes. It can sometimes be useful to prototype & iterate with a small subset of the data until you've got the algorithms down.
But many of the algorithms Google cares about don't give any useful results on small data sets. I remember running a few collaborative-filtering algorithms and finding that they gave zero answers because within my small sample of 0.01% of the total data, nobody liked anything else that anybody else did.
 Well, it's really the amazing amount of throughput that modern architectures can achieve. Latency hasn't improved quite as much given the inescapable limitation of the speed of light.
Perhaps it has changed as well; when I left (2014) they had just started encouraging engineers to focus on one particular task, with the architecture already defined, while when I started (2009) there were still a number of problems of the form "here's a feature we want to add; here's the data we have available; how can we build it?"
What if you're having a conversation with someone about local politics and you're convinced the local zoning rules don't encourage growth in the way they think they do. Do you force the subject of conversation so you can make sure they know just how wrong they are? Not if you want them to walk away having a good impression of you.
Instead if the talk turns to zoning you put out feelers to see if they'll want to talk about the larger issue, and only engage in the conversation if it seems like the time for it.
The analogy isn't perfect, but if the interviewer wants to talk about the larger issue, you're probably well matched and will have a great interview. If not, and you know enough about runtime complexity to discuss the tradeoffs, then you know enough to just answer the textbook big-O question and move on.
Then there are situations where the code you write must play nicely within a massive, already-complex system, wherein it will work with potentially huge inputs. Like at Google.
So, saying in a Google interview that you don't think Big Oh is super important, and that you prefer shipping whatever correct solution you come up with first and then worrying about efficiency if it ends up being slow, would likely not get you very far.
Let's say you created your own company and you're interviewing engineers.
Would you be comfortable hiring someone who doesn't know the difference between O(n^2) and O(2^n)?
Or someone who doesn't even know what these concepts represent?
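The difference is easy to make vivid with a quick shell loop (small n only; 2^n overflows 64-bit arithmetic past n = 62):

```shell
# n^2 grows politely; 2^n explodes
for n in 10 20 30; do
    echo "n=$n  n^2=$((n * n))  2^n=$((1 << n))"
done
# At n=30: n^2 is 900, while 2^n is already 1073741824
```

By n = 60 the quadratic sits at 3600 while the exponential is past 10^18, which is why mixing the two up is disqualifying.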
There's also the thing I picked up from playing around with graphics demo coding; get the algorithm working first, even if it takes 10 seconds per frame. Then look for the slow parts, concentrating first on the inner loops. Whatever you do, don't try to prematurely optimize the code as you write it, as tempting as it may be. Because almost certainly you'll make things worse than if you waited until after you have a working first pass.
This applies to way more than just fast graphics code, of course.
Any sort of technical solution given at a Google interview would generally have the following questions:
1) What big-O time does this algorithm run in? Why? What are its big-O space requirements?
2) If space were more/less expensive, or time more/less important, how would you change the solution and why?
Understanding those tradeoffs and being able to analyze code at that level is a big part of most software engineering jobs.
The interviewee is more than welcome to study data structures and interview questions about data structures. A poor student will just memorize questions and answers by rote without really understanding the data structure in question. The good student will actually learn and understand data structures in the context of the solution.
One would hope that interviewers are able to ask follow-up questions about the solution which distinguish between the two.
EDIT: During the best tech interview I have done, I had no idea how to solve the question asked (that is, I had not studied its solution despite this being a relatively common question). I was able to ask articulate questions and invoked understanding of data structures as I went along. Memorizing solutions is a game of luck, and not recommendable.
There's a limit to how far you can get just Googling stuff ("how do I reverse a string") for recipe-book solutions to things. I think practically everything in Computer Science could be Googled ("how does merge sort work", "what is an inclusive vs exclusive cache") at some level but this doesn't mean one shouldn't know a great deal of it. At least, if you want one of these jobs...
I think one of the most helpful methods of determining a good interviewer vs. a good future employee is taking a structured, situational interview question, i.e. "How would you go about selling a new product to a customer?" and turning it into an actionable work sample that is tailored to your company. "We make this piece of software that does this. Spend the next 10 minutes writing a cold email to a prospect that outlines our offering." As the interviewer, grade the work sample on structured criteria that is important.
You do want to try out the question and criteria before depending on it too much, but it's easy enough to try it out on a colleague.
I take it you're working from a different theory, where you try to put the expertise in a machinery of questions and answers, relying less (or not at all) on the interviewers themselves. I think that's also a valid approach, one with different limitations.
Personally, as I mentioned elsewhere, I'm not a big fan of interview questions anymore period. If I want to know if somebody can do the work, now I try to create situations where they just do the work. But if one is trapped in the dominant tech interview paradigm, I personally favor investing in interviewers more than the questions themselves.
I'd probably start here to lighten up the mood due to how cliche the question has become:
Nothing about Google's hiring process can be deemed reputable when they'd go so far as to illegally conspire with other tech giants to prevent potential employees from achieving the best possible outcomes for themselves. Maybe things have changed over there, but given the slap on the wrist they got, I'm skeptical.
Most importantly most companies interviewing don't get to pick from the pool of applicants that google gets to. And even more importantly most companies don't even get to pick from the pool of applicants that a typical funded (or soon to be funded) startup will get.
While I am sure there are things that can be learned from 'Work Rules' the question is how much of that applies to the vast majority of companies in America.
It may be good at screening University graduates but it's pretty awful at anything else. It's also really obvious that the interviewers don't actually know how to interview people and are pretty much cargo-culting the same process that hired them.
Awful. It totally destroyed my perception of the company.
Now granted, this was... 7 years ago now (or more), but from colleagues who have interviewed there (or been hired!), it hasn't changed fundamentally that much.
Today they ask fairly easy questions (given a list of integers, find the subsequence that has the following property etc. - basically leetcode medium level) that you can solve fairly easily using basic CS 101 knowledge. However, it ends up (again, IME) being a race against time where you have to whiteboard code a simple solution in 20 minutes approximately.
I'd much rather have them as questions that take real insight to solve, but have the interviews last 90 minutes or so.
To me their approach lately is like forcing a top tennis player to play with kids (well, you'd be surprised how many pros have trouble slowing down their game), or making a double-diamond skier ski blue slopes only (increasing their risk of injury, since their timing and moves are optimized for ultimate performance rather than basics). I've heard of people who solved problems in interviews that nobody had solved before them, but who were rejected for messing up some basic stuff; the recruiter's feedback was that it looked bad to the hiring committee.
Similarly, as a senior engineer at a large company, you need to be adaptable enough to work with people with less/different experience than you. Adjusting your message to the audience is an important skill.
A lot of people (anecdotally, the majority) at Google have the "impostor syndrome", and the news of the experiment did nothing whatsoever to quell the symptoms. Now they don't know if they are, in fact, not impostors, but they do know that on average they perform about as well. :-)
In other words, maybe as long as you let in a small number (but only a small number) of non-performers, you're fine (which is bound to happen anyway - I'm sure there is some noise in the interviews).
Then there's the issue that by the time you even get an on-site, you're already very much not a random candidate. Recruiters actually do look at your track record, etc. You can bullshit there, but I don't recommend it, since references will be spot checked, and they better line up.
Google interviews are largely a roll of the dice above certain level of basic engineering competence. I.e. if you don't know the basics, you will almost certainly not pass them. But if you're a more senior candidate, Google doesn't really know how to interview you, and their interview process turns into a random number generator biased heavily towards "no hire".
if you have to start using structured interview questions to expand a team - you've already lost
The real conclusion should be that "unstructured interviews provide a variable that decreases the average accuracy of predicting GPAs, when combined with (one) other predictive variable(s) (only previous GPA)."
This conclusion seems logical. When combined with an objective predictive measure of a person's ability to maintain a certain GPA (that person's historical ability to maintain a certain GPA), a subjective interview decreases predictive accuracy when predicting specifically a person's ability to maintain a certain GPA.
To then go on to conclude that interviews then provide little, to negative, value in predicting something enormously more subjective (and more complicated), like job performance, is absurd - and borderline bad science.
There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.
Well, based on what? Is there any evidence for any of these hunches?
But, if you really believe that it's a huge leap to hypothesize that interviews would be a better measure of subjective social skills than metrics like GPA, I don't know what to tell you.
Believe it or not, there are non-racist and non-classist character traits that are also not represented in one's GPA.
But, I guess hiring for one's willingness to work with specific clients, or for one's happiness with the organization's structure (in, for example, a holacracy) wouldn't be valid things to hire around to you?
Also, it is quite often not the technical questions that end up making us decide not to hire someone. That's just one area. I know it's the part that sticks out, and candidates give a lot of weight to it in their memory of the interview, but you shouldn't just assume it was that your whiteboard code wasn't quite good enough. I actually don't think that's the most common thing we give a "no hire" for.
If it's the tech giant, it's very possible they attract better candidates because they offer more.
The best sofware guys I've ever seen work at those big corps. The big corps are the ones that have the resources to work on the REALLY hard problems and not writing the same CRUD app over and over again. Those same companies also provide tons of educational benefits so that you become an expert in your field.
Things like fighter jet software are fantastically complex. You hear about the dumb mistakes that are made (International Date Line Issues), but you never hear about the thousands of hours of testing every change to code goes through and the insane calculations that are made every second in even standard level flight.
I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want? Even if you don't USE the knowledge, you should at least be able to rederive things from your base knowledge. The interviews I've seen at aero companies are damn hard: tons of grilling on whether you actually understand mechanics and fluid dynamics.
That's a really good question and I had to think about why I don't like interviews.
1. It's an interrogation and the stakes are extremely high. There are so many aspects of it you don't control. Some guy had a shitty morning, or just doesn't like you, dropped your resume on the way back to his office, etc.
2. It's completely phony and it starts with the first question, "So why do you want to work at generic company X?" "I need money" is not an acceptable answer. Now I have to be phony and tell them why their company is awesome (it's not), which makes me feel like a kiss-ass. I have to pretend to be excited about working at a boring-ass company. Just shoot me.
3. We have to do it, unless we have rich parents that died young like Batman.
4. The person or people that are interviewing you have no notion of what you've accomplished aside from skimming your resume. All the hard work I've done to produce miracles in the last 20 or so years means, really nothing.
5. Companies only reward tenure at that company. It signals the start of something new but that's typically a bad thing when it comes to work. Less vacation, less credibility, less influence, etc.
6. You really don't know if the person or people you are interviewing are bozos or not. It doesn't matter, they cut the checks, they have the power.
7. As the article insinuates, they're fairly pointless.
By far the most common reasons interviewers offer for low scores are: didn't listen/didn't acknowledge when they reached the end of their knowledge/didn't test when they got off the rails/just didn't seem like a cooperative coworker.
These are all filters that may not matter at ye olde startup; but in a stable, long-lived team, these are red flags that in my experience correlate with lowering the morale of a dozen other people. So even if you're a rock star, you might get rejected.
> didn't acknowledge when they reached the end of their knowledge
Oh man, this one. How hard is "I don't know?" You're not supposed to have the entirety of computer science and software engineering knowledge jammed into your head. Let's talk about what you know and what you don't and figure out whether you're a good fit.
We hire lots of people without direct experience in what we do because they seem great to work with and we think they'll learn quickly. Just please don't try to bullshit; it doesn't usually work.
Can you cite any of the more common things you give no hires for?
- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.
- Shit talking old coworkers, general attitude that you're great and nothing is your fault.
- Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.
- Not being able to give context about the "why" of what you worked on, what other options you considered, etc.
- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
Obviously most weaknesses can be overlooked if there are serious strengths.
I've done a lot of interviewing, happy to answer questions.
>- Shit talking old coworkers, general attitude that you're great and nothing is your fault.
These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.
> - Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.
Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do? Especially when the interviewer cannot possibly check up on that. I'm absolutely not convinced about this one, although I do make a point to talk about my contributions when I interview (which frankly doesn't happen much at all, since I tend to stay in the same job for long periods if possible and try to make things better there).
>- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
This sounds like bullshit too, to be honest. Most people don't get to decide the general point of what they do unless they are the CTO. However I do realise that this is absolutely the kind of stuff interviewers look at and you need to prepare for it.
>- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
>Obviously most weaknesses can be overlooked if there are serious strengths.
Which you really won't know until months later.
>I've done a lot of interviewing, happy to answer questions.
I know people who'd do badly on several of these accounts and are completely brilliant at getting the job done.
The problem is that you cannot possibly know how well the ones you rejected would have done.
If you can have a probation period of 3 months or so, that should be the main yardstick. Of course someone's attitude can become shittier over time but no interviewing process can reliably catch that.
Nope, I said the engineers I work with now are consistently better. "My current team is very good" is MUCH different from "my last team was very bad." (And I'm not in a job interview.)
> Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do?
No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project. A lot of people get stuck on, "We made a project that sold widgets." I want something more along the lines of, "In our widget selling application, I worked on backend services for payment processing and fraud detection." Some people get really stuck on generalities and won't dive into specifics. It just makes it impossible to evaluate your contribution. Maybe you didn't do anything. I have no idea, you're just not generating useful information for me.
> Most people don't get to decide the general point of what they do unless they are the CTO.
I totally disagree. So many tiny choices that engineers make from day to day have a tangible impact on customers. Even if you're a junior engineer, you have an impact on the latency of service calls you're responsible for, as an example. Do you pick the lightweight framework that loads quickly, or do you need the features of the bigger one? It's that kind of tradeoff that people should consider, and "I saw something shiny" is not a good answer.
> The problem is that you cannot possibly know how well the ones you rejected would have done.
Yep, for sure. I'm positive that I've rejected good candidates. That's the side you want to err on though, especially if you have lots of good candidates.
If that line is as detailed as you're looking for, then that's reasonable. Although I've seen people get annoyed that I don't remember the specifics of exactly what I implemented and why and a list of the various decisions made on projects that are 2+ years in the past.
At my current company I've worked on 300 tasks over the course of 2 years, across 30+ projects, ranging from simple bug fixes to implementing large swaths of new software for Fortune 100 clients. There are some similar items in there, but most of them were different.
I don't have an amazing memory, so a lot of the old stuff I worked on becomes very vague. Hell, it can take me time to remember what I need to do to work on something I haven't done in the past six months at my current company.
I know those old projects in broad strokes but if you want me to talk about a specific project in detail that happened years ago, then I will struggle to dig up those memories, even after reading a document I made that refreshes things a bit before interviews.
I'd venture that not being able to answer this question, and others like it, off the top of their head is less an indication that they're unaware of (or don't learn from) their mistakes than an indication of how little they've practiced for bullshit interview questions: practiced giving a bullshit answer to your bullshit question that gets past your personal bullshit-o-meter well enough for you not to be offended.
But since you're also probably trying to filter out cynical assholes like me, ¯\_(ツ)_/¯
Well it certainly shows a lack of preparation and experience.
Here's how I approach those questions. I think about each project I've launched, and whether they're relevant. It's an easy thought exercise, and then you have an answer to that question forever. And seriously, time you've made a mistake should be an easy one, and it's one everyone should have ready. Over your whole career you can't think of a time you've done something that you'd do differently if you had a chance? I bet you can, and if not, it shows a serious lack of introspection.
We want people trying to make themselves better. Thinking about your career and studying for interviews is part of that.
> filter out cynical assholes
Dude, at least 50% of the tech industry is people who would charitably be described as cynical assholes. Part of being professional is being able to turn it off. If you can't give a professional answer in an interview how can the interviewer expect you to give a professional answer in a contentious meeting?
Especially the "tell us about a time you had a conflict with someone" question. Ugh. I guess I just need to seek out some assholes to work with because I've got a big fat nothing to talk about on that one. I ought to start writing down the crap I make up for it so I don't have to do it again every time I interview.
I know it's phony, but job interviews are sales pitches. Preparation helps.
There's a gulf between "nothing is my fault" and "some of my old coworkers were shit and they deserve the shit-talk they get from me".
I don't think it's fair to judge people universally on the basis of whether they shit-talk or not. Being 100% kind and never saying anything bad about people in your past is vastly overrated.
> - Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
You're polarizing this way too much. If I go after a "cool technology", there are several things MANY interviewers don't take into consideration:
(1) It's only their own interpretation that I chose a "cool" technology over the customer's needs. Cognitive bias and all. Not to mention most people really have no qualifications to even make that claim.
(2) In the senior programmer area (where I believe I belong), you oftentimes have to make calls nobody can inspect for a while; you have to trust your experience and intelligence and make a decision quickly. If Ruby on Rails consistently fails you on a single website project of yours, it's very okay to start switching out its slowest parts with Elixir's Phoenix <-- that's a recent example from my workplace. I chose both "cool tech" and "customer needs" together.
(3) Many times there's no immediate benefit to your work. It's easy for a manager to blatantly reject a hard mid-term decision implemented by a tech lead as "he's after the cool tech only because he's bored", and only I know in my head that the results of that "cool tech" will start showing a month from now (it also doesn't help at all when I tell them that and they don't believe me).
> - Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
I admit I've been in the wrong on this one, but I want to give you another perspective. I wanted to make a certain point -- 99% of the time it's "why did I do this" or "why did I fail at this" or "how did I succeed by doing this" -- and sometimes I digress because there are sub-points and I overdo my attempts at being crystal clear. Granted, that's a communication skill that's up to me to perfect, but I've been extremely annoyed by interviewers who can't seem to trace my initial line of thought and try to coax me back into it. Instead they give you a smug expression along the lines of "this guy talks too much" and form a negative impression right there and then. I can see it in their eyes and, quite frankly, I lose respect for them immediately as well -- they could handle such a situation much better. Interviews are a two-way process and both sides screw up in every single interview.
So overall, I believe you're over-generalizing on a few points.
This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.
> It's only their own interpretation I chose a "cool" technology over the customer needs.
Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool." I want to hear, "because it performed faster" or "because it had X feature I wanted" or whatever. Tell me WHY it's cool. I just want to hear how you'd explain a choice to me as a coworker, I'm not looking to actually second guess old choices. I'd even take, "Well, we had a really short deadline, and I was familiar with the technology. I decided hitting the date was more important than evaluating all the choices." I want to hear about your thought process; I don't care what choice you made. "Because I thought the industry was moving that direction and wanted to be future-proof and improve our ability to hire" is a good answer too. There are a ton of good answers, but "because it looked cool" is not one of them.
> interviewers who can't seem to trace my initial line of thought and try to coax me back in it
For sure. I try really hard not to talk when a candidate is talking because I know how hard it is to get thrown off in such a stressful situation. Sometimes the candidate has just obviously misunderstood my question though. This is much more important on technical questions -- I believe a good interviewer will treat the candidate like they are a coworker, and basically solve the whiteboard problem in a fairly collaborative fashion. Obviously the person doing the interview will take the lead, but I'm SUPER happy to answer clarifying questions, and if you start to struggle the right thing to do is to ask some small questions and see if the answers help. Just sitting there banging your head against the whiteboard isn't useful.
> So overall, I believe you're over-generalizing on a few points.
For sure, I'm just trying to give some really brief summaries of bad behaviors I've seen. I've also given a "hire" to people who did one of each of the above; like I said, none of it is disqualifying.
> This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.
Of course! Don't get me wrong, I am not making it a goal to shit-talk former coworkers in an interview. I try to avoid it, but what am I to do when I have to explain that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job? It wasn't my fault at all; I actually sank several weekends trying to upgrade it but was drowned in mysterious and horrifying errors (an ancient state machine gem with very rigid dependencies and 80+ monkey patches practically made any upgrade impossible) before giving up. Lie that I suck? No, I don't suck, they sucked, and I ain't taking the blame for them. That being said, I have better uses of my leisure time, and my work time as well -- having in mind I am not expected to refactor at all. So I sank 35-40 hours of leisure time into that problem and moved on.
> Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool."
You'll hear a very hearty (and non-sarcastic) "I am sorry for that awful experience, man" from me. I never do that. I always explain myself. I went out of my way to rehearse in my spare time, just asking myself the question "why did you pick tech X?" -- I am a firm believer in Einstein's saying, "If you can't explain it to a 5-year-old then you don't understand it"!
> Sometimes the candidate has just obviously misunderstood my question though.
Perfectly fair. I never objected to that, and I've also made a point of apologizing for the misunderstanding, because in an interview every minute counts.
Thank you for being constructive.
The reason I personally care about the skill of talking about bad coworkers tactfully is that I think it correlates with being able to navigate tricky workplace fuckups without making a big political mess that I have to clean up. The way I would phrase the above situation is:
"I inherited a Rails 4.0 app, and I wanted to upgrade it to Rails 4.1 so that we could take advantage of feature X. This wasn't on our roadmap, so I spent a lot of my own time looking into the feasibility and working on it in my personal time. However, it eventually became clear that too much of the legacy code would have to be totally redone, and there wasn't enough of a business upside."
> I am a firm believer of Einstein's saying "If you can't explain it to a 5-year old then you don't understand it"
Me too! Love this.
You're a better diplomat than me. I could've phrased it almost the same, but with me the tone and the wording depends a LOT on my current mood -- I have to work on this, that's for damn sure.
It does however still leave the question "but whose fault that legacy code is?" open, don't you think? With my wording in the above comment I would've aimed at being crystal clear -- and thus somewhat rude in the process, admittedly.
> without making a big political mess that I have to clean up
Could you please clarify on that point? I am curious.
Heh, I can do this all day. "We accumulated a lot of tech debt on a previous project because of very short deadlines." Or, "the previous owner learned the technology while building the system -- I'm sure he would have done it differently today." What I'm looking for is that you understand what leads to this sort of situation, and it isn't usually, "the last guy was an idiot." I mean sometimes, sure, but even then it's usually closer to, "the last guy was hired into a role he wasn't ready for and needed a better mentor, or better training."
> Could you please clarify on that point? I am curious.
It's really helpful to be able to send an engineer to collaborate with another team without having to worry about whether they'll end up butting heads with someone and making a mess. Tact is important. Sometimes other people are under constraints that you're not aware of, and it's useful to have empathy. Maybe they have super tight deadlines, or maybe they're having to use a technology they've never used before. It's easy to say, "this person is a moron and their code is bad," but if they get the impression that you think that, it can really harm working relationships. At that point, I end up having to step in and smooth the relationship over, and it's not a great use of time.
Talking about bad ex-coworkers with tact shows me that you 1) have empathy and 2) will understand how to navigate similar situations if hired.
I can see that! :D Thanks, you've been very helpful. Believe it or not, I am learning from this interaction.
> Tact is important.
I don't disagree and I'm with you here. But herein lies the dilemma -- I've been tactful and diplomatic way too many times for my taste. I've had my fair share of politics. I am not horrible at it; I simply started lacking any patience for it, and thus my mood started leaning heavily towards being blunt and somewhat inconsiderate. I am not outright offensive, but I am no longer tactful and diplomatic (sigh).
I did start standing up and heavily protesting in the face of bad politics, however. I have less patience now because I expect everybody else to do the same -- and they don't.
That's why I get easily irritated. I absolutely agree with ALL of your points -- the people might have had horrible customers (or team leaders), they might have drained the budget close to the end of the project so they probably had to cut tens of corners, they might have been gaining experience along the way... and a plethora of other possibilities. I agree.
I do have empathy and tact. But I have lost almost all patience in the last few years.
I appreciate that you might not believe such a polarized message in an interview. But I got slightly carried away sharing. :)
At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)
I also totally understand getting frustrated. I've been a total dick to my coworkers in the past, for lots of different reasons. I was burnt out, I was young, etc. I've mellowed out a lot over the years. A good chunk of that is also just being on a good team; it's really hard to be the best version of yourself if you're not well supported.
And here you have myself at 37 being much more impatient compared to 27. :D Truth be told, I am also much more mellow in general, just not in work lately.
> At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)
Double thanks, this is extremely valuable advice!
I appreciate you taking the time.
Dummy candidates were mixed into the interview flow and gave randomized answers to questions (interviews were structured to somewhat blind this process), and interviewers lost no confidence from those interviews.
Interviewers, when told about the ruse, were then asked to rank between no interview, randomized interview, and honest interview. They chose a ranking (1) honest, (2) randomized, (3) no interview. Think about that: they'd prefer a randomized interview to make a prediction with over no interview at all.
Of course, the correct ranking is probably (1) no interview, (2) a tie between randomized and honest. At least the randomized interview is honest about its nature.
Similarly, the people doing the prediction were other students rather than teachers who know better what to look for. The research would better fit HR people hiring for roles they know nothing about than experienced team members hiring for roles similar to what they do.
The research results are massively over-applied, and are in no way whatsoever sufficient to justify the term "utter uselessness of X". Unsurprising, given it is research by a 'professor of marketing'.
I assume that interview could be pretty short.
The study was done on students, who almost universally have zero experience selecting people to work under them in an industry setting (or at all). Drawing conclusions from this particularly inexperienced subject pool and then extrapolating out is bogus, particularly given the extremely certain language of the article. The subject pool is at an age (18-22) where people are still figuring out what to make of themselves and others; they have extremely little adult experience of the workplace and judgment of character - indeed, at this age, people are notorious for making bad decisions in their personal lives.
When you look at the actual paper they link, out the window goes the declarative language, and instead the article is unusually full of weasel-words ('can', 'may'). There's a major difference between "utterly useless" and the actual conclusion of their paper: "interviewers probably over-value unstructured interviews".
People think they can get valuable information from this, so they want to meet the candidates, even if they are told the answers are random.
You need to prove (and then convince people) that all this extra information (the impression a person makes when you meet them) doesn't improve your prediction of their future GPA over a prediction based only on their past GPA.
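The core of that claim is easy to illustrate with a toy simulation (all numbers here are made up for illustration, not taken from the study): if the interview "impression" carries no real information, then giving it any weight can only make the GPA prediction worse.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Toy model (my assumption, not the paper's data): future GPA is
# past GPA plus noise, and a randomized interview produces an
# "impression" that is pure noise with no predictive value.
past_gpa = rng.normal(3.0, 0.5, n)
future_gpa = past_gpa + rng.normal(0.0, 0.3, n)
impression = rng.normal(0.0, 0.5, n)  # the randomized "interview"

# Predictor 1: past GPA alone.
pred_baseline = past_gpa

# Predictor 2: an interviewer who lets the random impression nudge
# the estimate (the 0.3 weight is an arbitrary illustration).
pred_interview = past_gpa + 0.3 * impression

def mse(pred):
    """Mean squared error against the realized future GPA."""
    return float(np.mean((pred - future_gpa) ** 2))

print(f"past GPA alone:     {mse(pred_baseline):.4f}")
print(f"plus random signal: {mse(pred_interview):.4f}")
```

Any nonzero weight on an uninformative signal raises the expected squared error; the empirical question the paper raises is whether real interview impressions behave more like signal or more like this noise.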
Remember a few years ago when this forum was drowning in talk about how to find and hire "10x" people? 98% of employees were useless in the face of the 10xer.
The reality is, most of the time screening for general aptitude, self-motivation and appropriate education is good enough.
I've probably built a dozen teams where 75% of the people were random people who were there before or freed up from some other project. They all work out. IMO, you're better off hiring for "smart and gets things done" and purging the people who don't work out.
I feel like, as a company, you could capture a lot of value by just hiring people and training them. Training, btw, doesn't have to mean at work. Think how much you learned in college lectures vs. reading the actual textbook. It means going home and studying, and the incentive should be that you're gaining things that make you more employable, so this shouldn't count towards work hours (if it were, say, a very proprietary old programming language, I could see this argument falling apart).
Not saying this is the best way, but I'm shocked that everyone is "trying to find the best". What kind of world will that leave us in if all companies want to hire the same 3% of people? Businesses move slower, talent is lost, and inefficiencies accrue.
If you provide meaningful training, you need to be fair in the application of said training. If you admit that you can train people with common existing skills to do most technology jobs, it's going to be hard to justify your cozy recruiting funnel with a small number of universities, picked on the basis of where a few bigshots in the company went to school.
Then again, theoretically this gives a huge edge to the proactive learners.
However, there is a challenge with having experienced people: after a while they simply can't pass the screening and can't get hired. And who's gonna teach the juniors if you don't have seniors???
Personally, I moved to finance. I find that experience, domain knowledge and maturity is valued more there :D
I'd rather hire a programmer who knew less and could get along with others than a master dev who's a total a-hole.
In my personal experience, structured interviews can be very helpful in determining a candidate's abilities.
It's possible that what you consider to be a structured interview is in fact what this author (and I) would call unstructured. Specifically: if the interviewer has any discretion about questions at all, the interview is probably fundamentally unstructured.
In a structured interview, the interviewer is less an interrogator than a proctor or a referee. Every candidate gets identical questions. The questions themselves are structured to generate answers that facilitate apples-to-apples comparisons between candidates: they generate lists of facts. The most common interview questions, those of the form "how would you solve this problem", are themselves not well suited to these kinds of interviews. It takes a lot of work to distill a structured interview out of those kinds of problems.
This has me mulling whether that might be a better approach to administering law-school exams than the traditional analyze-this-hypothetical-fact-situation approach. (I'm a part-time law professor.)
More generally: I wonder to what extent school examinations can draw useful lessons from job interviews.
-The more hoops a candidate has to jump through, the more likely they are to bail out of your recruiting funnel. This is especially bad for college/postgrad recruiting when you aren't the #1 employer in your field. Everyone wants to work for the Googles and Facebooks of the world. It's hard getting someone to spend a couple hours for your startup job.
-People cheat. We usually issue a short coding project, grade for correctness, then do a code review over Skype or face-to-face. Many candidates turn in the exact same responses. I've even seen people cheat by having a friend do the Skype session and then a totally different guy fly out. Do you proctor your test in a secure center? Use an online service to lock down their machine and record? Both are pretty invasive. Switching up the questions constantly is tough and makes your signal noisier.
-Industriousness and raw intellect trump skills/knowledge most of the time. Sure there's a baseline level of skill required to train someone quickly enough, like I wouldn't hire someone who didn't know basic data structures, but work-sample tests are often biased to those with a very specific background. I don't want employees who are great at doing what I need today. I want ones who will be great at figuring out what to do years down the line.
Second, incorporate the work sample tests into your in-person (or even telephone) interviews. Now you have a (hopefully interesting) technical problem they ostensibly just solved to talk about. Your evaluation should be by formal rubric, not interview, but it's easy to see if someone actually did it. We had no problems at all with people cheating (of people we hired with this process, over about 4 years, we fired not a single one).
Finally, I could not be less interested in the kind of amateur psychoanalysis tech interviewers hope they're accomplishing as a side effect of quizzing people about code in their conference rooms.