Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups), predicting something as relatively noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.
Furthermore, some claims that make it into the piece are at odds with the data:
> Strikingly, not one interviewer reported noticing that he or she was conducting a random interview. More striking still, the students who conducted random interviews rated the degree to which they “got to know” the interviewee slightly higher on average than those who conducted honest interviews.
Yet Table 3 in the paper shows that there is no statistical evidence for this claim: the effects are swamped by the variance.
My point is not that this article is wrong; verifying/debunking the claims would take much more time than my quick glance. But that ought to be the responsibility of the newspaper, and not individual readers.
Politicians don’t get to write about the successes of their own policies. While there is a difference between researchers and politicians, I think we ought to be a bit more critical.
So obviously, the title of this article should be "Maybe interviewing is not that useful" instead of "The utter uselessness of job interviews", but besides this I find your comment unjustified.
In fact, it's quite the opposite: I believe this type of work has contributed to making us more critical by questioning some basic facts about interviewing that I would have never questioned just a couple of years ago.
> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate)
ok, this is interesting, where is it mentioned?
Thinking, Fast and Slow. There is this short article which mentions some of the results and has been discussed on HN a couple of times already.
It's a lot harder to present a better way of predicting candidate performance in the workplace, along with substantial data that indicates it's better than today's methods. Corporations would love more effective ways to determine effectiveness/performance before hiring.
Interviewing is terrible, but that doesn't mean there is a better option.
This is one of the biggest problems I see with business guys today. They want absolute certainty in a world which can't offer it. Start accepting that there is a lot of stuff we don't know and can't know at the moment and we simply have to work towards getting better.
You can't get better if you don't acknowledge that there is a problem in the first place.
These silly "must have 5+ years of experience" and "check all these boxes of technologies" requirements clearly show the industry is completely lost at the moment and isn't willing to acknowledge it.
There are a bunch of other testing methods that are illegal in the USA too, such as IQ tests.
This is as it should be—I'm paid to solve problems.
Why is this different?
The more I read about interviewing, the more I realize too many people think they have this problem solved—their amateur psychology is impeccable and their technical screens test for exactly the right things, no more and no less. Did they do a bunch of controlled studies to convince themselves of this, or are they taking sounding good, or intuition about the statistical outcomes of different techniques, to be equivalent to truth?
Maybe the first step is to collectively realize we have close to no clue what we're doing, and are being asked to solve a hard problem: individually, to talk to someone for an hour and make a hiring recommendation. In aggregate, to make the decision based on a handful of these one-hour conversations.
Maybe the first step is to realize this is a problem worth trying to solve.
Maybe the first step is vocal non-acceptance.
Which is bullshit. It's perfectly reasonable to criticise something without proposing an alternative. It's especially ridiculous to reject criticism provided without alternatives, when it's literally your job to do the work being criticised.
"Hey the way you're doing this part of your job produces results no better than random selection."
"Bring me solutions, not problems!"
Yes, yes... of course I know this could be gamed as well, but no matter: you can't really argue that this wouldn't be magnitudes better than the typical current/broken interview process.
Also, not all jobs/codebases lend themselves to being productive in 2 weeks, I'd argue they should be, but they aren't.
I switched roles to a new team, and the first 2 weeks were a trainwreck. They had almost no predictive value for how I would do.
Now, there are many reasons why that was the case, and perhaps those underlying issues should be addressed, but from all the role changes I've had, the first two weeks shows how well the group you're going into can onboard, more than it shows how productive the individual will be in the long term.
It works well enough.
And I bet almost everyone reading this has worked with at least one person they think shouldn't have survived the interview, and that person was making a boatload because they convinced the boss they're brilliant. Meanwhile 90% of their day was spent talking about how great they are, and 10% creating new bugs, and no one dared say anything because the thought of them being more "productive" was horrifying.
'It' mostly successfully matches employees to employers, but the quality of those matches may vary wildly.
Interview processes also vary wildly—you can't really say that it works without defining 'it.' Are we talking multiple technical screens that require writing code or a single fluffy buzzword-laden conversation with a C-level? Both have failure modes, but those failure modes sure are different.
End of the day, everything evens out. People add the structure they need when they hire people. If your engineering interview process for some detail oriented gig is buzzword trivia with the CIO, the company will probably tank anyway. Conversely, if you do some nerd-fest whiteboard interview for a CTO in a bigger organization, you're probably not getting the right outcome either.
And this is a near universal phenomenon. Almost everyone wants to "get to know the candidate."
Everyone hates the process, and companies have invested major dollars and hours trying to improve it. At the end of the day, little has changed since 1917. You either acqui-hire, get a strong referral, or interview a pool of unknown applicants.
The tests and quizzes are little different to how a city hired an accountant in 1917. The old boys network evolved. Then you're left with the rest.
> Corporations would love more effective ways to determine effectiveness/performance before hiring.
This is irrelevant to evaluating the current methods. Even if there is no replacement, if they are useless, we should know it, full stop.
Leeches are often used in literal modern hospitals in the developed world as an effective treatment for certain ills.
If we had no alternative, we should stick with it, but look at other things in the meantime. You seem to be advocating doing nothing at all, which is trivially easy to show works for no one.
You could make the argument that algorithms tend to be studied more by smarter people, but if that's what you're going for you may as well ask them about their hobbies, and hire the person that is into playing chess, or doing astronomy (or whatever intellectual pursuit you care to name).
If on the other hand you are interested in a person's ability to code, ask them to do so. The last time I had to hire someone, I wrote a small application with one module that was deliberately written in an obfuscated style. I asked candidates to bring that module under control - rewrite it in a readable code style. To do this, successful candidates needed to identify what the current code was doing by examining the public interfaces in a debugger, documenting what the calls seemed to do, prepare unit tests, and then rewrite the module in a readable style. It took about a day for most candidates to do.
At the end of that, you get to see a candidate's ability to read code, use a debugger, write unit tests, write documentation, and write well-structured code, which is a pretty good coverage of the typical tasks in a developer's day. I feel this gives a much more realistic assessment of a candidate's capabilities than asking questions about a more or less randomly chosen algorithm.
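For illustration, a toy version of that kind of exercise might look like this. Both the obfuscated function and its cleanup are invented for this sketch, not the actual module from the interview:

```python
# Hypothetical "bring this under control" exercise: the candidate gets
# the obfuscated function below, works out what it does, writes a test,
# and rewrites it readably.
def f(a, b, c):
    r = []
    i = 0
    while i < len(a):
        if (a[i] % b == 0) != (a[i] % c == 0):
            r = r + [a[i]]
        i = i + 1
    return r

def multiples_of_exactly_one(values, b, c):
    """Readable rewrite: keep values divisible by exactly one of b or c."""
    return [v for v in values if (v % b == 0) != (v % c == 0)]

# A unit test pinning down the behavior before and after the rewrite:
assert f([3, 5, 15, 9], 3, 5) == multiples_of_exactly_one([3, 5, 15, 9], 3, 5) == [3, 5, 9]
```

The point of the exercise is exactly this loop: characterize behavior first, lock it in with tests, then restructure.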
This is an issue as well. If you aren't google then a day is too much investment for a single job opportunity, especially if you're already employed.
When I have to give an interview and have to include an algorithms section I make candidates type code. Whether that's on a phone screen with a shared online text editor or in person with their laptop / an interview laptop, I want them to type stuff, not just rely on whiteboard pseudo-code and diagramming. As a vim user I make allowances for the fact that their editing environment may not be what they're used to, but even if I was forced to use Notepad I could still bang out a function to test the even/oddness of a number (my own fizzbuzz) pretty quickly. So I at least make sure to test coding, even if poorly.
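That "personal fizzbuzz" is tiny by design; a minimal sketch (the function name is mine, not the commenter's):

```python
def is_even(n: int) -> bool:
    """Return True if n is even: the kind of one-liner any working
    programmer should be able to type out in any editor."""
    return n % 2 == 0

assert is_even(4)
assert not is_even(7)
```

The screen isn't testing cleverness; it's testing that the candidate can produce working code at all under mild pressure.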
I agree work-sample tests are the best, but as another commenter noted, if they take a lot of time for the applicant you're going to get people who refuse them, just as some refuse to play the algorithms game. Especially if people have a GitHub repo, especially if some of the projects they've worked on have had more than themselves as committers, especially if they're currently employed as a developer at some other company that does general software.

Unless you're trying to build a top team, which most projects don't need, you're wasting a lot of time trying to rank beyond "would work out ok" and "would not work out at all". I have a section in my phone screen that tests for regex knowledge; I'm primarily just testing to see if they know the concept, or if, when faced with a problem that regexes can solve (which actually does happen from time to time), they reach for writing some custom parser instead. If they vaguely remember there's a way to specify a pattern and find matches, that's a pass. If they know grep/their language of choice's regex syntax and can give a full solution, great, I'll rank them slightly higher than someone who just knows the concept, but all I really care about is the concept. If they don't know the concept, that's a strong sign (to me) they won't work out.
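As an illustration of the concept-vs-custom-parser distinction, here is the kind of task a regex dispatches in one line (the log-line scenario is my own invented example, not the commenter's actual screen question):

```python
import re

# Hypothetical screen task: pull ISO-style dates out of a log line.
# Knowing "there's a way to specify a pattern and find matches" is the
# concept being tested; the alternative is a hand-rolled character parser.
line = "job finished 2017-06-12, retry scheduled 2017-06-19"
dates = re.findall(r"\d{4}-\d{2}-\d{2}", line)
assert dates == ["2017-06-12", "2017-06-19"]
```

A candidate who reaches for something like `re.findall` (even with fuzzy syntax) passes the concept check; one who starts writing a loop over characters is the signal the commenter is screening for.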
I tried to do a semi work sample test with an intern candidate a few months ago instead of a different test, based on experience with a prior intern who struggled on something I thought was basic and left me wondering why I didn't catch that in the phone screen. Basically I gave them some stripped down code from several files that looks a lot like what we have in production (JS, using Backbone) explained the overall mapping from bits of code to what could be shown on the screen, and essentially asked them to add a new component (already written) to one part of the screen by modifying/filling-in-the-functions in a few places. It required them to read and understand some alien code, see what they can ignore, understand what was asked, and then do it (initialize something, pass it around, up to I think 3 indirect function calls of nesting, call a couple things on it). The candidate got through it, I'm not sure the old intern would have...
Why g Matters: The Complexity of Everyday Life
Personnel selection research provides much evidence that intelligence (g) is an important predictor of performance in training and on the job, especially in higher level work. This article provides evidence that g has pervasive utility in work settings because it is essentially the ability to deal with cognitive complexity, in particular, with complex information processing. The more complex a work task, the greater the advantages that higher g confers in performing it well. Everyday tasks, like job duties, also differ in their level of complexity. The importance of intelligence therefore differs systematically across different arenas of social life as well as economic endeavor. Data from the National Adult Literacy Survey are used to show how higher levels of cognitive ability systematically improve individuals’ odds of dealing successfully with the ordinary demands of modern life (such as banking, using maps and transportation schedules, reading and understanding forms, interpreting news articles). These and other data are summarized to illustrate how the advantages of higher g, even when they are small, cumulate to affect the overall life chances of individuals at different ranges of the IQ bell curve. The article concludes by suggesting ways to reduce the risks for low-IQ individuals of being left behind by an increasingly complex postindustrial economy.
Algorithmic tests have a pretty low bar to clear in IT jobs. Using them for secretaries or lab technicians is another story.
The problem with comparing IQ results at hire to employment success is that employment outcome is difficult to define over time. You're also unlikely to get statistically relevant data without focusing on large organizations with standardized HR processes. Most of the research is based on supervisory evaluations, which are not the most reliable indicators of anything for a variety of reasons.
The other thing I find amusing is that business folk who talk about this miss the fact that there are large workforces in the US that either have or do use standardized testing like this to hire and promote. Those are government bureaucracies, which function relatively well, but are hardly a model that most folks advocating this would aspire towards.
https://mobile.nytimes.com/2015/08/28/science/many-social-sc... among other places
I'm all for upending the status quo with interviews, but let's not throw out science and reporting just to get there.
In particular, I was disappointed to find a (short) paragraph in the article that I find bogus. That does not mean the article shouldn't have been posted in the first place, but just that this paragraph should have been edited or removed.
I think there is something wrong when I feel like I have to look up the actual research paper and check whether the claims made in an article are supported by data and methodology. I should not be a skeptic when reading New York times articles.
To be fair, it is posted in the opinion section, but should we really just take this article as an opinion? That doesn't feel right to me either.
Then onto your last question and Daniel Kahneman: we can talk about that for a long time, but let me keep it short. The best place I know (though technical) is the blog by Andrew Gelman (turned up in a 5-second Google search, but there is way more on his blog), and Daniel Kahneman himself has "admitted" flaws in his studies.
Even the "hard" sciences have trouble, because journals prefer publishing new and positive results, rather than replications or negatives.
People ask questions in interviews that they want to know the answer to and that could go either way. All such questions have comparable expected surprise: either both answers are unsurprising, or one answer would be surprising but you think you already know the answer is the other one.
If interviewers were asking questions like "is 2+2=4" they would have detected random interviewers way easier, but they wouldn't be trying very hard to get to know the person.
As for getting to know the person, the more surprising someone's answers are, if you believe they are telling the truth, the more distinguishes them from "average person who gave answers I expected", so you say you "got to know" them. This is unsurprising.
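The "expected surprise" framing above can be made concrete with Shannon information; this gloss is mine, not the commenter's:

```python
import math

def expected_surprise(p_yes: float) -> float:
    """Expected surprise (entropy, in bits) of a yes/no question whose
    answer is 'yes' with probability p_yes."""
    h = 0.0
    for p in (p_yes, 1 - p_yes):
        if p > 0:
            h -= p * math.log2(p)
    return h

# "Is 2+2=4?" has a certain answer, so it carries no information...
assert expected_surprise(1.0) == 0.0
# ...while a genuine getting-to-know-you question is near a coin flip:
assert abs(expected_surprise(0.5) - 1.0) < 1e-9
```

This is why random answers to good getting-to-know-you questions are hard to distinguish from honest ones: the questions are chosen precisely so that either answer is plausible.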
This isn't to defend unstructured interviews, other studies for a long time have shown them to be worse than structured interviews and test scores. If I had to guess the only reason the research in the article got published as novel was the random interview part.
Edit: Here's a table from a meta-analysis of lots of studies on correlation between different factors and job performance: http://imgur.com/a/YRFTh. Basically unstructured interviews aren't as good as structured ones, but they are better than nothing. Work samples are the best, structured interviews and IQ tests tie for second.
Note that this meta-analysis is combining a bunch of different fields to yield general observations, a specific field may have different results. But in expectation for a randomly selected field these are fairly solid results, and I don't expect fields vary from them too much.
Politicians boast about successes constantly. I don't understand what you mean by "get to".
Why is that not the case here, where a professor is allowed to sell his own work? It is as if Obama were the NYT reporter covering Obamacare.
That people believe politicians blindly is a topic for another day :)
But what I find far more important in the interview process is involving current team members in the process of selecting new coworkers. It's one way of getting teams to be bought into a feeling of shared purpose and is the first step in establishing a working relationship. If you don't give at least some of the current team a role in the hiring process, teams will feel imposed upon by those hiring and won't be as understanding about flaws in those added to the teams.
Focusing on selecting the "right candidate" is really myopic in a situation where there are likely many right candidates. We should, instead, be focusing on not selecting a wrong candidate and fostering the right team dynamic. We already know that strong teams significantly outperform strong individual performers who don't cooperate. Yet hiring still seems focused on optimizing for strong individual contributions. And I've yet to see a study that looks at the flaws of the interview process in building ineffective teams.
We know that interviews are selecting a right person some percentage of the time. That percentage will never be 0 or 100. We will always have to accept some bad hires.
I would rather that successful hire percentage be somewhat lower and keep the team engaged in defining culture and hiring standards than to give up team involvement for a higher right-person rate.
My main point is that the people studying this issue and arguing against the interview process tend to only look at less than half of the benefit of the interview process. If we're going to ditch the interview process, whatever replaces it needs to have the same property of involving the existing team or it is, to my mind, automatically worse than what we have now. Because while we're all aware of how flawed the interview process can be, it does work to some extent. When I was hiring, only 2 out of the 50 or so that I hired didn't work out. Would I like to have avoided hiring one or both of them? Sure. Am I willing to sacrifice the team involvement benefits to do so? Absolutely not.
In fact there is an awful lot of wrong stuff society keeps doing despite evidence over many years suggesting it is stupid. The stupidity of high CEO salaries, e.g., has been demonstrated quite well, yet business keeps a blind faith in it. Getting employees in complicated jobs to perform by rewarding them on an extremely narrow metric has also proven counterproductive, yet the MBA crowd refuses to believe it doesn't work.
Business practices often seem to be more like religion than founded on reality. It is a GOOD thing that some researchers are trying to do their bit to correct this flawed picture.
Are you suggesting anything out of the ordinary is going on here? Doesn't one have to present their work as legitimate and wait for feedback? So long as they are open to that feedback, I don't really get this argument?
> Over time I’ve become more skeptical about this kind of psychology research (as more studies fail to replicate) and, as is often the case, here the sample size is quite small (76 students, split across 3 groups), predicting something relatively noisy as GPA. It is unclear to me that one would be able to detect reasonable effects.
So present that back to them as a refutation? Are you waiting for someone else to do it? Why are you debating the reliability here?
They're predictive, but not very.
Structured interviews are better, but not really by much.
The problem with all of these things is they work, but not very well, and people make too much of any one thing. An interview is a very small slice of behavior, even if it's structured very well.
People make too much of them.
For a quick reference, the two defining criteria for a structured interview are:
1.) They use one or several consistent set(s) of questions, and
2.) There are clear criteria for assessing responses
That second point is really important. You can't only ask candidates the same sets of questions and have a structured process: you need to understand what a "one-star" response vs. a "five-star" response actually looks or sounds like. Training and calibrating all of the interviewers in a large company around a similar rating system is nightmarish, so most companies don't bother.
The book also outlines that pairing a work sample with a structured interview is one of the most accurate methods of hiring.
If anyone is interested in some in-depth structured interview questions or work sample ideas, feel free to email me. I've spent the last few years working on a company in the interviewing space and would love to chat.
Which is how it is typically presented because it sounds much better than "reject a lot of candidates who would probably have worked out just fine". It is useful to perceive both the potential value in an approach like this and the shortcomings. Google can absorb the massive expense in man hours, lost opportunity, etc. that comes with trying to craft genuinely predictive interview processes, but a lot of the companies trying to emulate them can't. Too often, interviewees don't realize a process of this sort is stacked against them, and interviewers don't appreciate the negatives of adopting a still-nascent approach that sounds more reliable simply because it is quantitative - and assuming since Google does it it must work.
It's not like Google pays the best or still has the best workplace. It's a large company with large-company politics and red tape.
I've met a few such people in this forum. Not many.
I'm not sure how one would even begin getting a rigorous estimate of that number. What is a credible sample of "great candidates" in this industry?
That seems to be an unproven assumption, and quite likely to be a wrong assumption. For example if good people turn out to be less interested in "honing their interview skills", adding parasitic noise to the signal.
This gets tossed around as a truism. I'm curious, does anyone have any evidence for it? Call me a skeptic, but these kinds of "everyone knows" truths are often wrong.
Google and other such companies have a vested interest in getting hiring right. They also have the wherewithal to conduct studies, collect data, and let the evidence guide their hiring practices. Google in particular has shown a willingness to completely overhaul their practices by eliminating ineffective practices (remember their reputation for "thought puzzle" type questions?).
So I'm curious if you have anything to back up the idea that they're doing it all wrong.
I know it from my own experience and that of many others who have been through the gauntlet. Take it for what it's worth, I'm not selling you anything. I don't look impressive on the whiteboard, but I do have a rather impressive track record. Something doesn't line up. :-)
FWIW, as far as I recall there was another experiment at Google where they tried to establish correlation between interview performance and job performance, and as far as I recall, there was no meaningful correlation. This, of course, is not fully representative, because it does not include poor whiteboard performers.
This is not data.
These were their conclusions:
1. The ability to hire well is random. This is referring to individuals, not the system as a whole.
2. Forget brain-teasers. Focus on behavioral questions in interviews, rather than hypotheticals
3. Consistency matters for leaders
4. Grades don’t predict anything about who is going to be a successful employee. School grades, that is.
So, stop making stuff up from behind your throwaway account.
Google is collecting and analysing data to improve its hiring process... not to improve the hiring process of the industry at large.
There is an effectively limitless supply of great engineers who will jump through hoops to work for Google.
That's just not true for the vast majority of the industry.
If this were true, Google would have crashed and burned a long time ago.
Obviously, their interview process selects for much more versatile engineers than that. Engineers who not only produce reliable and maintainable code, but who can actually come up with products that generate billions of dollars over the years.
Actually, Google pays under the average of top companies because, as a top-tier company, they can afford to. Most people I know who went to work for Google took a pay cut but don't have a single regret about it.
> A simpler explanation is that they pay well and are prestigious and so get a lot more good candidates.
Their pay and prestige will attract even more bad candidates.
How do you separate good from bad candidates?
That's right: a kick-ass interview process.
Compared to who? They pay more than AMZN/MS/FB/AAPL/etc. for equal level, but are stingier with levels. You might be correct that certain other companies pay more than Google (Netflix maybe?), but they're certainly above average.
Google is more generous to good performers via bonuses once you are working there, but if you have two offers in hand, you are going to find Google highly resistant to negotiating. The notion of people taking a pay cut to work at Google sounds plausible to me.
If one's goal is to maximize compensation (particularly in the short term), a Google offer is better used to get a higher paying offer at one of their competitors.
I don't think that's true. While google is by all accounts (including in my personal experience) unwilling to move significantly on base salary, they'll happily match pretty much any offer with stock from my experience (and the experience of others I've talked to).
Perhaps I dealt with a particularly nasty Google recruiter. I felt like the recruiter had misrepresented the health benefits and relocation package once I got the actual offer letter and related paperwork.
That said, from everything I've seen, compensation growth at Google is faster than at the other companies, which means that for someone coming in at L>3, they will likely be given greater compensation at Google than elsewhere.
I'm curious as to how they misrepresented things. I was actually pleasantly surprised once I got here by how extensive the benefits were, but I'm always interested in learning more, since while I actually think that 4 google interviews is a decent way to judge someone for google, I really hate their interview/negotiation process.
How many of their projects succeed simply because they are google? Some major ones (like android) come to mind.
If there are consistently used questions and specific criteria for assessing responses, can a candidate just learn the likely questions and what constitutes the "right" answer?
The reality is that generating good questions for a structured interview is difficult. You can't just pose a programming problem. As you've noted, most programming problems have multiple good answers. Differentiating between multiple good answers from different candidates re-introduces subjectivity. Your brain would rather convince you that one valid answer is less good than another, even when it's not, than admit to you that it can't differentiate or generate a narrative for you. The part of your brain that generates narratives is incredibly powerful and does not care about how accurate your hiring process ends up being.
What we tried to do was create questions that generated lists of facts. "Spot all the errors in this code" would be an example of this approach (but none of the three we used). We went into the process wanting to embrace epistemological uncertainty, generating a historical trail of data that we could retrofit to candidate performance.
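A toy version of a "lists of facts" question might look like this (the buggy snippet is invented for illustration; the commenter explicitly says their three questions were different):

```python
# Hypothetical "spot all the errors in this code" prompt. The appeal of
# the format is that the answer is an enumerable list of facts, which
# keeps scoring objective rather than narrative-driven.
def mean(xs):
    total = 0
    for i in range(1, len(xs)):  # error 1: loop starts at 1, skipping xs[0]
        total += xs[i]
    return total / len(xs - 1)   # error 2: len(xs - 1) raises TypeError
                                 # (and len(xs) - 1 would still be the
                                 # wrong denominator for a mean)

# The candidate's deliverable is the list itself:
expected_errors = [
    "loop skips the first element",
    "len(xs - 1) is a TypeError; denominator should be len(xs)",
]
```

Each claimed error is either in the code or it isn't, so two interviewers can score the same answer the same way.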
In the end, work sample testing was so much more powerful a predictor for ourselves that we never fully got around to analyzing the data. Sometimes we'd get candidates that clearly generated inferior "lists of facts"; I think there may have been 1-2 instances where that outcome actually overruled work-sample testing delivered prior (out of a few tens of engineering hires and probably ~100 interviews).
I wouldn't breach such a contract nor should you hire someone that would be willing to do so. The sole exception being in the cases where it would be the best for the public interest, i.e. whistle-blowing.
Is anyone though?
For most questions, there's no "right" answer, but there are a set of points that the interviewer wants to see you touch on. For example, they might first want to see that you can code up a naive brute-force variant of the algorithm, checking whether you know the programming language claimed and can think through the problem, and ask you the algorithmic complexity. Then they'll want to see if you can get a divide-and-conquer or dynamic programming variant with lower time complexity. Then they might ask "What if it has to be an online algorithm, where new input arrives before the computation finishes?" Then they'll ask "How would you distribute this over 1000 machines, and what are the failure modes?"
At each stage, they're watching how you answer, and where you get stuck. If you ask clarifying questions or spend time to think before diving into coding, that's a plus. If you have never heard of the problem before (this is frequent - many questions are not in textbooks), they want to see how you would reason through it, and break it down into subproblems that are similar to textbook problems. If you miss language trivia, most people don't care; when I did interviews I'd usually volunteer the answer if they missed some API call, and when I interviewed my interviewers did the same. If you don't know how to solve the problem and can't make any effort to move forward through a solution, that's a big negative. Similarly if you don't know what the concept of big-O is or why it's important.
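The naive-then-optimized escalation described above can be sketched with a classic question; maximum subarray sum is my choice for illustration, not necessarily one the interviewers in question use:

```python
def max_subarray_brute(xs):
    """O(n^2) brute-force variant: try every contiguous range.
    This is the first stage an interviewer typically wants to see."""
    best = xs[0]
    for i in range(len(xs)):
        total = 0
        for j in range(i, len(xs)):
            total += xs[j]
            best = max(best, total)
    return best

def max_subarray_dp(xs):
    """O(n) dynamic-programming variant (Kadane's algorithm). It also
    answers the follow-up: it works online, one element at a time."""
    best = cur = xs[0]
    for x in xs[1:]:
        cur = max(x, cur + x)       # extend the running sum or restart
        best = max(best, cur)
    return best

data = [-2, 1, -3, 4, -1, 2, 1, -5, 4]
assert max_subarray_brute(data) == max_subarray_dp(data) == 6
```

The distributed-systems follow-up ("How would you run this over 1000 machines?") then probes whether the candidate can reason about partitioning and boundary cases rather than code.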
But the real magic comes when they run out. What do they do then? Good people say that they don't know. Often, they will take guesses at a few more based on general principles. E.g., "I'm sure there are arguments for sorting, but I only use top for that." Or, "There must be more ways to filter, but I only do it by user or command." They'll usually look a bit uncomfortable, because they are the sort of person who likes knowing things. But what they won't do is bluff or make things up.
And honestly, that's my biggest tip for hiring: find people who like understanding things and know a good amount for their level, but are willing to say when they don't know. People who can't do that are dangerous.
I don't score it by how many option-letters people know. I score it by how well they demonstrate familiarity with a basic ops activity, which is finding out what's running.
Even if somebody said they never used ps, or like below, only use one incantation, that isn't a problem, because my follow-up there would be to ask how you'd find out what's using all the RAM or what's using a lot of CPU.
Maybe they're just used to using different tools, in which case I can ask for details for those tools. (And then check manually later to make sure I'm not getting snowed.) But if they've done a bunch of ops stuff, they'll demonstrate the sort of know-from-doing level of familiarity with something.
In theory I could just ask the broader question to start. But some people are good talkers, and I'm looking for evidence that people are good doers. All that said, I've shifted mainly to more experiential interview processes, as actually doing is the best way to see if somebody's a doer.
To me any question where the obvious answer is "I can trivially look all that up in the man pages" is a gigantic red flag that I don't want to work at that place.
If I'm feeling curious that day, I might begin a conversation about what you think you are testing for with that question, as I've spent a fair bit of my career studying hiring pipelines. But if I hadn't had my coffee yet or were irritated or something I'd probably just ask to talk to someone else.
That said, as I've explained elsewhere in the thread, the point isn't to see how many they know. It's to see that a) they know some portion of it that people doing the work would know, and b) have reactions to the rest that indicate useful work habits and attitudes.
A perfectly fine first answer is "I only use ps -aux. I'd just look in the man page if I needed anything else." Because then we could have a good discussion of how they use that output to figure out things, what sort of things they'd expect to see in the man page, and what other tools they use to see what's going on.
I know you're not asking people recite "ps" flags from memory. The problem is, you're looking for a subjective "X-factor" in how someone answers a trivia question.
I used to really like this approach too. I'd think of things that you'd only know if you'd actually done the kind of work I'd done. I had some favorite interview questions: "what are some of the functions you carry around with you from project to project in your 'libyou.a' library", and "where would you start looking if you had to debug a program that segfaulted in malloc"? I think the logic I was using is the same as the logic you're using here.
Ultimately, I think it's a bad approach. Equivalently strong candidates look at their work through different lenses and find different things memorable. Far more important to me, though, is that some people really suck at interviewing --- and, what motivates me more, some people just interview way better than they should. I've succumbed to that repeatedly as a hiring manager, and I think I'm an example of the phenomenon myself.
The other problem with this question is that, by its phrasing, it implies that there is a right answer: if I can name more ps flags than another person, I'm a better candidate. Which, when put that way, I think points out its faults. Maybe you don't intend for that to be true, but you'd have to fight your own cognitive biases pretty hard not to at least bias towards the person who can name 17 flags off the top of their head, or the one who teaches you a new flag combo you didn't know. Even though those things are likely not very good predictors of a good candidate.
You've said that what you really want to get at is whether they can admit when they don't know something. If that is important to your hiring process (and I'd encourage you to validate that with data), ask them a question no one could know the answer to. Ask the same question of every candidate, and grade it purely pass/fail: did they say "I don't know" or not?
>> admit when they don't know something... encourage you to validate that with data
My experience is that people that won't admit they don't know something are not necessarily bad employees. Often they are capable employees, although this is a trait I rarely see in the best employees. However, I find they are toxic to a good work environment, since they won't listen to people who do know something.
They do seem to get promoted to management at a much higher rate :)
So the process is, in fact, designed to make people feel uncomfortable, for a little bit (or at least you're definitely aware that this is a frequently occurring side effect).
Does that not suggest to you that there may be a negative tradeoff at play here? (To wit: yes, you do manage to efficiently extract a few bits of information from them... but at the cost of having them feel like they're being, well... interrogated. And that they're already "losing points" by that point in the conversation with you.)
I had a PM friend who would try to push the bounds of a candidate's knowledge until they gave up and said "I don't know". The actual knowledge wasn't the point of the question: it was that they could say "I don't know", because PMs who can't tend to make life miserable for the engineers & other teammates who work with them.
And I agree with your PM friend; I think willingness to admit ignorance is even more valuable there.
But no, I don't think they walk away feeling interrogated. My style is pretty conversational, and when people don't have all the answers, I definitely work to make them feel comfortable with that. E.g., for this question I'd close with something like, "Of course nobody knows all the arguments to ls. I sure don't. So you did fine."
Try it next time you're mingling with ops people. Say in a bar at a conference. Don't look at what they answer. Look at how they answer. Some people are comfortable not knowing. Some people are excited to discover the ones they forgot they knew. Some people get curious and eager to fill the hole they've just noticed in their knowledge. And some people get defensive and peevish.
Again, this is one of those, "Here's what I've got cached, and here's where I'd look up what I don't know offhand" questions.
From the ps manpage:
Note that "ps -aux" is distinct from "ps aux". The POSIX and UNIX standards require that "ps -aux" print all processes owned by a user named "x", as well as printing all processes that would be selected by the -a option. If the user named "x" does not exist, this ps may interpret the command as "ps aux" instead and print a warning. This behavior is intended to aid in transitioning old scripts and habits. It is fragile, subject to change, and thus should not be relied upon.
/actually I'd probably hit up the man page first...
Since the 'ps -o' flags are only useful in that one command, I just look them up in the (rare) situation where they seem the best alternative. Having been at this for decades, I try to reserve my limited memory capacity for factoids that will pay off, and being able to process columnar data generically at an instant seems more useful than knowing a special way to do it with that one command.
Not sure how well this works in practice though.
They were. I was asked a ton of this kind of Linux/Unix trivia in one session. As well as just random stuff. For example:
As a result of having read this story, I can now always instantly remember what 2^24 is too :-).
So 1000 * 1000 * 16 = 16 million would be an easy estimate to make.
If you are designing a system and want to have an idea of how much space or memory an approach will take, being good at this kind of math can be really helpful.
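The trick behind the estimate above is just that 2^10 is roughly 1000, so any power of two decomposes into factors of a thousand times a small remainder:

```python
# 2**24 = 2**10 * 2**10 * 2**4, and 2**10 is roughly 1000,
# so a fast mental estimate is 1000 * 1000 * 16 = 16 million.
estimate = 1000 * 1000 * 16
exact = 2 ** 24
print(estimate, exact)                         # 16000000 16777216
assert abs(exact - estimate) / exact < 0.05    # within about 5%
```

The same decomposition covers the memory sizes that actually come up: 2^32 is roughly 4 billion, 2^20 is roughly a million, and so on.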
(I mean, I get it, this kind of general estimation and/or ball-parking can be incredibly valuable, but this doesn't sound relevant in any way.)
To be fair to Google (in regard to this one filter question only), they actually are up front about expecting a significantly higher baseline of general mathematical awareness (even for non-mathy roles) than most other shops. At any rate, this kind of estimation comes up in capacity planning (both in architecture planning, and for algorithms on a single box) all the time. Which after all is what Google does, at nearly every level of their stack, (nearly) all the time.
So, no, still not useful questions.
My opinion is that big-O often ends up being used as a premature optimization effort hindering "just get the right answer first". Maybe that brute force method will work in a reasonable clock-time cost, despite having egregious algorithmic cost. You won't know if you're busy overengineering and optimizing before you even have a working solution.
The reason is because when your dataset is in the petabytes, any algorithm bigger than O(N log N) is not going to terminate. You're not going to get any answer at all; your MapReduce is going to sit there burning CPU time for a day, and then you'll kill it, and you won't have any idea what went wrong. (This is learned from experience, if you couldn't tell.)
In a startup context, I'd certainly agree with you - that's exactly the approach I'm taking with my startup, where I'm doing a lot of the debugging and code-wrangling on my local laptop after verifying that the data I want exists and calculating its size with a couple cloud-computing passes. It can sometimes be useful to prototype & iterate with a small subset of the data until you've got the algorithms down.
But many of the algorithms Google cares about don't give any useful results on small data sets. I remember running a few collaborative-filtering algorithms and finding that they gave zero answers because within my small sample of 0.01% of the total data, nobody liked anything else that anybody else did.
 Well, it's really the amazing amount of throughput that modern architectures can achieve. Latency hasn't improved quite as much given the inescapable limitation of the speed of light.
Perhaps it has changed as well; when I left (2014) they had just started encouraging engineers to focus on one particular task, with the architecture already defined, while when I started (2009) there were still a number of problems of the form "here's a feature we want to add; here's the data we have available; how can we build it?"
What if you're having a conversation with someone about local politics and you're convinced the local zoning rules don't encourage growth in the way they think they do. Do you force the subject of conversation so you can make sure they know just how wrong they are? Not if you want them to walk away having a good impression of you.
Instead if the talk turns to zoning you put out feelers to see if they'll want to talk about the larger issue, and only engage in the conversation if it seems like the time for it.
The analogy isn't perfect, but if the interviewer wants to talk about the larger issue, you're probably well matched and will have a great interview. If not, and you know enough about runtime complexity to discuss the tradeoffs, then you know enough to just answer the textbook big-O question and move on.
Then there are situations where the code you write must play nicely within a massive, already-complex system, wherein it will work with potentially huge inputs. Like at Google.
So, saying in a Google interview that you don't think Big Oh is super important, and that you prefer shipping whatever correct solution you come up with first and then worrying about efficiency if it ends up being slow, would likely not get you very far.
Let's say you created your own company and you're interviewing engineers.
Would you be comfortable hiring someone who doesn't know the difference between O(n^2) and O(2^n)?
Or someone who doesn't even know what these concepts represent?
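For reference, the gap between those two is easy to make concrete with plain arithmetic:

```python
# Steps required by an O(n^2) vs an O(2^n) algorithm as n grows.
for n in (10, 30, 60):
    print(f"n={n:2d}  n^2={n**2:>6,}  2^n={2**n:>26,}")

# At n=60: 3,600 steps vs about 1.15 * 10**18 steps --
# roughly 36 years at a billion steps per second.
```

Someone who can't tell you which of their two candidate solutions falls on which side of that line is going to have trouble the first time the input isn't tiny.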
There's also the thing I picked up from playing around with graphics demo coding; get the algorithm working first, even if it takes 10 seconds per frame. Then look for the slow parts, concentrating first on the inner loops. Whatever you do, don't try to prematurely optimize the code as you write it, as tempting as it may be. Because almost certainly you'll make things worse than if you waited until after you have a working first pass.
This applies to way more than just fast graphics code, of course.
Any sort of technical solution given at a Google interview would generally have the following questions:
1) What big-O time does this algorithm run in? Why? What are its big-O space requirements?
2) If space were more/less expensive, or time more/less important, how would you change the solution and why?
Understanding those tradeoffs and being able to analyze code at that level is a big part of most software engineering jobs.
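As a concrete (hypothetical, my own illustration) instance of the tradeoff in question 2, consider checking a list for duplicates: you can spend memory to buy time, or vice versa.

```python
def has_duplicate_low_space(xs):
    """O(n^2) time, O(1) extra space: compare every pair."""
    for i in range(len(xs)):
        for j in range(i + 1, len(xs)):
            if xs[i] == xs[j]:
                return True
    return False

def has_duplicate_fast(xs):
    """O(n) time, O(n) extra space: remember what we've seen."""
    seen = set()
    for x in xs:
        if x in seen:
            return True
        seen.add(x)
    return False

print(has_duplicate_low_space([3, 1, 4, 1, 5]))  # True
print(has_duplicate_fast([2, 7, 1, 8]))          # False
```

A candidate who can produce both and explain when each is preferable (tiny embedded device vs. big-memory server) is demonstrating exactly the analysis being asked for.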
The interviewee is more than welcome to study data structures and interview questions about data structures. A poor student will just memorize questions and answers by rote without really understanding the data structure in question. The good student will actually learn and understand data structures in the context of the solution.
One would hope that interviewers are able to ask follow-up questions about the solution which distinguish between the two.
EDIT: During the best tech interview I have done, I had no idea how to solve the question asked (that is, I had not studied its solution despite this being a relatively common question). I was able to ask articulate questions and invoked understanding of data structures as I went along. Memorizing solutions is a game of luck, and not recommendable.
There's a limit to how far you can get just Googling stuff ("how do I reverse a string") for recipe-book solutions to things. I think practically everything in Computer Science could be Googled ("how does merge sort work", "what is an inclusive vs exclusive cache") at some level but this doesn't mean one shouldn't know a great deal of it. At least, if you want one of these jobs...
I think one of the most helpful methods of determining a good interviewer vs. a good future employee is taking a structured, situational interview question, i.e. "How would you go about selling a new product to a customer?" and turning it into an actionable work sample that is tailored to your company. "We make this piece of software that does this. Spend the next 10 minutes writing a cold email to a prospect that outlines our offering." As the interviewer, grade the work sample on structured criteria that is important.
You do want to try out the question and criteria before depending on it too much, but it's easy enough to try it out on a colleague.
I take it you're working from a different theory, where you try to put the expertise in a machinery of questions and answers, relying less (or not at all) on the interviewers themselves. I think that's also a valid approach, one with different limitations.
Personally, as I mentioned elsewhere, I'm not a big fan of interview questions anymore period. If I want to know if somebody can do the work, now I try to create situations where they just do the work. But if one is trapped in the dominant tech interview paradigm, I personally favor investing in interviewers more than the questions themselves.
I'd probably start here to lighten the mood, given how cliché the question has become:
Nothing about Google's hiring process can be deemed reputable when they'd go so far as to illegally conspire with other tech giants to prevent potential employees from achieving the best possible outcomes for themselves. Maybe things have changed over there, but given the slap on the wrist they got, I'm skeptical.
Most importantly most companies interviewing don't get to pick from the pool of applicants that google gets to. And even more importantly most companies don't even get to pick from the pool of applicants that a typical funded (or soon to be funded) startup will get.
While I am sure there are things that can be learned from 'Work Rules' the question is how much of that applies to the vast majority of companies in America.
It may be good at screening University graduates but it's pretty awful at anything else. It's also really obvious that the interviewers don't actually know how to interview people and are pretty much cargo-culting the same process that hired them.
Awful. It totally destroyed my perception of the company.
Now granted, this was... 7 years ago now (or more), but from colleagues who have interviewed there (or been hired!) I gather it hasn't changed fundamentally that much.
Today they ask fairly easy questions (given a list of integers, find the subsequence that has the following property etc. - basically leetcode medium level) that you can solve fairly easily using basic CS 101 knowledge. However, it ends up (again, IME) being a race against time where you have to whiteboard code a simple solution in 20 minutes approximately.
I'd much rather have them as questions that take real insight to solve, but have the interviews last 90 minutes or so.
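A representative example of that "leetcode medium" style (my own illustration, not an actual interview question): find the maximum-sum contiguous subsequence of a list of integers, which yields to CS-101 reasoning in O(n) time (Kadane's algorithm).

```python
def max_subarray_sum(xs):
    best = cur = xs[0]
    for x in xs[1:]:
        cur = max(x, cur + x)    # extend the current run, or start fresh at x
        best = max(best, cur)
    return best

print(max_subarray_sum([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6 (the run 4, -1, 2, 1)
```

It's simple once you've seen the trick, which is exactly why the 20-minute whiteboard version becomes a race against the clock rather than a test of insight.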
To me their approach lately is like forcing a top tennis player to play with kids (though you'd be surprised how many pros have trouble slowing their game down), or a double-black-diamond skier to ski only blue slopes (increasing the risk of injury, since their timing and movements are optimized for peak performance rather than the basics). I've heard of people who solved problems in interviews that nobody had solved before them, yet were rejected because they messed up some basic stuff, and the recruiter's feedback was that it looked bad to the hiring committee.
Similarly, as a senior engineer at a large company, you need to be adaptable enough to work with people with less/different experience than you. Adjusting your message to the audience is an important skill.
A lot of people (anecdotally, the majority) at Google have the "impostor syndrome", and the news of the experiment did nothing whatsoever to quell the symptoms. Now they don't know if they are, in fact, not impostors, but they do know that on average they perform about as well. :-)
In other words, maybe as long as you let in a small number (but only a small number) of non-performers, you're fine (which is bound to happen anyway - I'm sure there is some noise in the interviews).
Then there's the issue that by the time you even get an on-site, you're already very much not a random candidate. Recruiters actually do look at your track record, etc. You can bullshit there, but I don't recommend it, since references will be spot checked, and they better line up.
Google interviews are largely a roll of the dice above certain level of basic engineering competence. I.e. if you don't know the basics, you will almost certainly not pass them. But if you're a more senior candidate, Google doesn't really know how to interview you, and their interview process turns into a random number generator biased heavily towards "no hire".
if you have to start using structured interview questions to expand a team - you've already lost
The real conclusion should be that "unstructured interviews provide a variable that decreases the average accuracy of predicting GPAs, when combined with (one) other predictive variable(s) (only previous GPA)."
This conclusion seems logical. When combined with an objective predictive measure of a person's ability to maintain a certain GPA (that person's historical ability to maintain a certain GPA), a subjective interview decreases predictive accuracy when predicting specifically a person's ability to maintain a certain GPA.
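The mechanism is easy to see in a toy simulation (all numbers invented, purely to illustrate): if future GPA mostly tracks prior GPA, naively averaging in a pure-noise "interview score" makes predictions worse.

```python
import random

# Toy model: future GPA = prior GPA + noise; interview score = pure noise.
random.seed(0)
n = 10_000
prior = [random.gauss(3.0, 0.5) for _ in range(n)]
future = [g + random.gauss(0.0, 0.3) for g in prior]
interview = [random.gauss(3.0, 0.5) for _ in range(n)]

def mse(pred):
    return sum((p - f) ** 2 for p, f in zip(pred, future)) / n

mse_prior = mse(prior)                                             # prior GPA alone
mse_blend = mse([(p + i) / 2 for p, i in zip(prior, interview)])   # average in the noise

print(f"prior only: {mse_prior:.3f}  blended: {mse_blend:.3f}")
assert mse_blend > mse_prior  # the uninformative signal hurts prediction
```

Which is exactly what you'd expect when the target variable is chosen to be the one thing the objective predictor already measures.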
To then go on to conclude that interviews then provide little, to negative, value in predicting something enormously more subjective (and more complicated), like job performance, is absurd - and borderline bad science.
There are numerous (more subjective) attributes that an unstructured interview does help gauge, from culture fit to general social skills to one's ability to handle stressful social situations. I'd hypothesize all of these are probably better measured in an unstructured (or structured) interview than in most (any) other way. To recommend the complete abandonment of unstructured interviews (which is done in the final sentence of the actual paper) is ridiculous.
Well, based on what? Is there any evidence for any of these hunches?
But, if you really believe that it's a huge leap to hypothesize that interviews would be a better measure of subjective social skills than metrics like GPA, I don't know what to tell you.
Believe it or not, there are non-racist and non-classist character traits that are also not represented in one's GPA.
But, I guess hiring for one's willingness to work with specific clients, or for one's happiness with the organization's structure (in, for example, a holacracy) wouldn't be valid things to hire around to you?
Also, it is quite often not the technical questions that end up making us decide not to hire someone. That's just one area. I know it's the part that sticks out, and candidates give a lot of weight to it in their memory of the interview, but you shouldn't just assume it was that your whiteboard code wasn't quite good enough. I actually don't think that's the most common thing we give a "no hire" for.
If it's the tech giant, it's very possible they attract better candidates because they offer more.
The best software guys I've ever seen work at those big corps. The big corps are the ones that have the resources to work on the REALLY hard problems, not writing the same CRUD app over and over again. Those same companies also provide tons of educational benefits so that you become an expert in your field.
Things like fighter jet software are fantastically complex. You hear about the dumb mistakes that are made (International Date Line Issues), but you never hear about the thousands of hours of testing every change to code goes through and the insane calculations that are made every second in even standard level flight.
I have no idea why there is such a backlash against interviews. What's so hard about studying for a job you want? Even if you don't USE the knowledge, you should at least be able to rederive things from your base knowledge. The interviews I've seen at aero companies are damn hard: tons of grilling on whether you actually understand mechanics and fluid dynamics.
That's a really good question and I had to think about why I don't like interviews.
1. It's an interrogation and the stakes are extremely high. There are so many aspects of it you don't control. Some guy had a shitty morning, or just doesn't like you, dropped your resume on the way back to his office, etc.
2. It's completely phony and it starts with the first question, "So why do you want to work at generic company X?" "I need money" is not an acceptable answer. Now I have to be phony and tell them why their company is awesome (it's not), which makes me feel like a kiss-ass. I have to pretend to be excited about working at a boring-ass company. Just shoot me.
3. We have to do it, unless we have rich parents that died young like Batman.
4. The person or people that are interviewing you have no notion of what you've accomplished aside from skimming your resume. All the hard work I've done to produce miracles in the last 20 or so years means, really nothing.
5. Companies only reward tenure at that company. It signals the start of something new but that's typically a bad thing when it comes to work. Less vacation, less credibility, less influence, etc.
6. You really don't know if the person or people you are interviewing are bozos or not. It doesn't matter, they cut the checks, they have the power.
7. As the article insinuates, they're fairly pointless.
By far the most common reasons interviewers offer for low scores are: didn't listen/didn't acknowledge when they reached the end of their knowledge/didn't test when they got off the rails/just didn't seem like a cooperative coworker.
These are all filters that may not matter at ye olde startup; but in a stable, long-lived team, these are red flags that in my experience correlate with lowering the morale of a dozen other people. So, even if you're a rock star, you might get rejected.
> didn't acknowledge when they reached the end of their knowledge
Oh man, this one. How hard is "I don't know?" You're not supposed to have the entirety of computer science and software engineering knowledge jammed into your head. Let's talk about what you know and what you don't and figure out whether you're a good fit.
We hire lots of people without direct experience in what we do because they seem great to work with and we think they'll learn quickly. Just please don't try to bullshit; it doesn't usually work.
Can you cite any of the more common things you give no hires for?
- Lack of self awareness and introspection. Not being able to give specific examples of times you've made a mistake and how you learned from it.
- Shit talking old coworkers, general attitude that you're great and nothing is your fault.
- Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.
- Not being able to give context about the "why" of what you worked on, what other options you considered, etc.
- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
Obviously most weaknesses can be overlooked if there are serious strengths.
I've done a lot of interviewing, happy to answer questions.
>- Shit talking old coworkers, general attitude that you're great and nothing is your fault.
These are fair enough, although you did say above that the other coworkers in your old scrappy startups were much worse.
> - Being unable to talk about YOUR specific contributions; "we did this, we did that, the project made money..." Okay, what did YOU do? Surprising amount of people fail on that one.
Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do? Especially when the interviewer cannot possibly check up on that. I'm absolutely not convinced about this one, although I do make a point to talk about my contributions when I interview (which frankly doesn't happen much at all, since I tend to stay in the same job for long periods if possible and try to make things better there).
>- Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
This sounds like bullshit too, to be honest. Most people don't get to decide the general point of what they do unless they are the CTO. However I do realise that this is absolutely the kind of stuff interviewers look at and you need to prepare for it.
>- Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
>Obviously most weaknesses can be overlooked if there are serious strengths.
Which you really won't know until months later.
>I've done a lot of interviewing, happy to answer questions.
I know people who'd do badly on several of these accounts and are completely brilliant at getting the job done.
The problem is that you cannot possibly know how well the ones you rejected would have done.
If you can have a probation period of 3 months or so, that should be the main yardstick. Of course someone's attitude can become shittier over time but no interviewing process can reliably catch that.
Nope, I said the engineers I work with now are consistently better. "My current team is very good" is MUCH different from "my last team was very bad." (And I'm not in a job interview.)
> Some people will struggle to take credit because of politeness. Isn't taking credit trivial to do?
No, it's not so much about credit, it's about specificity and the ability to talk about the different parts of the project. A lot of people get stuck on, "We made a project that sold widgets." I want something more along the lines of, "In our widget selling application, I worked on backend services for payment processing and fraud detection." Some people get really stuck on generalities and won't dive into specifics. It just makes it impossible to evaluate your contribution. Maybe you didn't do anything. I have no idea, you're just not generating useful information for me.
> Most people don't get to decide the general point of what they do unless they are the CTO.
I totally disagree. So many tiny choices that engineers make from day to day have a tangible impact on customers. Even if you're a junior engineer, you have an impact on the latency of service calls you're responsible for, as an example. Do you pick the lightweight framework that loads quickly, or do you need the features of the bigger one? It's that kind of tradeoff that people should consider, and "I saw something shiny" is not a good answer.
> The problem is that you cannot possibly know how well the ones you rejected would have done.
Yep, for sure. I'm positive that I've rejected good candidates. That's the side you want to err on though, especially if you have lots of good candidates.
If that line is as detailed as you're looking for, then that's reasonable. Although I've seen people get annoyed that I don't remember the specifics of exactly what I implemented and why and a list of the various decisions made on projects that are 2+ years in the past.
At my current company I've worked on 300 tasks over the course of 2 years, across 30+ projects, ranging from simple bug fixes to implementing large swaths of new software for Fortune 100 clients. There are some similar items in there, but most of them were different.
I don't have an amazing memory, so a lot of the old stuff I worked on becomes very vague. Hell, it can take me time to remember what I need to do to work on something I haven't done in the past six months at my current company.
I know those old projects in broad strokes but if you want me to talk about a specific project in detail that happened years ago, then I will struggle to dig up those memories, even after reading a document I made that refreshes things a bit before interviews.
I'd venture that not being able to answer this question and others like it off the top of their head is less an indication that they're unaware of, or don't learn from, their mistakes than an indication of how well they've practiced for bullshit interview questions: whether they can give a bullshit answer to your bullshit question that gets past your personal bullshit-o-meter well enough for you not to be offended.
But since you're also probably trying to filter out cynical assholes like me, ¯\_(ツ)_/¯
Well it certainly shows a lack of preparation and experience.
Here's how I approach those questions. I think about each project I've launched, and whether they're relevant. It's an easy thought exercise, and then you have an answer to that question forever. And seriously, time you've made a mistake should be an easy one, and it's one everyone should have ready. Over your whole career you can't think of a time you've done something that you'd do differently if you had a chance? I bet you can, and if not, it shows a serious lack of introspection.
We want people trying to make themselves better. Thinking about your career and studying for interviews is part of that.
> filter out cynical assholes
Dude, at least 50% of the tech industry is people who would charitably be described as cynical assholes. Part of being professional is being able to turn it off. If you can't give a professional answer in an interview how can the interviewer expect you to give a professional answer in a contentious meeting?
Especially the "tell us about a time you had a conflict with someone" question. Ugh. I guess I just need to seek out some assholes to work with because I've got a big fat nothing to talk about on that one. I ought to start writing down the crap I make up for it so I don't have to do it again every time I interview.
I know it's phony, but job interviews are sales pitches. Preparation helps.
There's a gulf between "nothing is my fault" and "some of my old coworkers were shit and they deserve the shit-talk they get from me".
I don't think it's fair to judge people universally based on whether they shit-talk or not. Being 100% kind and never saying anything bad about people in your past is vastly overrated.
> - Not thinking about the end user; "I used this technology because it sounded cool" instead of "I wanted this result for my customers." Tell us about the situation and why you made the right or wrong choices.
You're polarizing this way too much. If I go after a "cool technology", there are several things MANY interviewers don't take into consideration:
(1) It's only their own interpretation that I chose a "cool" technology over the customer's needs. Cognitive bias and all. Not to mention most people really have no qualifications to even make this claim.
(2) In the senior programmer area (where I believe I belong), you often have to make calls nobody can inspect for a while; you have to trust your experience and intelligence and make a decision quickly. If Ruby on Rails consistently fails you on a single website project of yours, it's very okay to start switching out its slowest parts with Elixir's Phoenix <-- that's a recent example from my workplace. I chose both "cool tech" and "customer needs" together.
(3) Many times there's no immediate benefit to your work. It's easy for a manager to blatantly reject a hard mid-term decision implemented by a tech lead as "he's after the cool tech only because he's bored", when only I know in my head that the results from that "cool tech" will start showing a month from now (it also doesn't help at all when I tell them that and they don't believe me).
> - Interrupting the interviewer and not asking questions. This is a weird one. We want you to do well, so sometimes if you're going down the wrong path we'll try to help. I've had candidates talk over me and just barge ahead down a totally incorrect path. ¯\_(ツ)_/¯
I admit I've been in the wrong on this one, but I want to give you another perspective. I wanted to make a certain point -- 99% of the time it's "why did I do this" or "why did I fail at doing this" or "how did I succeed by doing this" -- and sometimes I digress because there are sub-points and I overdo my attempts at being crystal clear. Granted, that's a communication skill that's up to me to perfect, but I've been extremely annoyed by interviewers who can't seem to trace my initial line of thought and try to coax me back to it. Instead they give you a smug expression along the lines of "this guy talks too much" and form a negative impression right there and then. I can see it in their eyes, and quite frankly I lose respect for them immediately as well -- they could handle such a situation much better. Interviews are a two-way process, and both sides screw up in every single interview.
So overall, I believe you're over-generalizing on a few points.
This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.
> It's only their own interpretation that I chose a "cool" technology over the customer's needs.
Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool." I want to hear, "because it performed faster" or "because it had X feature I wanted" or whatever. Tell me WHY it's cool. I just want to hear how you'd explain a choice to me as a coworker, I'm not looking to actually second guess old choices. I'd even take, "Well, we had a really short deadline, and I was familiar with the technology. I decided hitting the date was more important than evaluating all the choices." I want to hear about your thought process; I don't care what choice you made. "Because I thought the industry was moving that direction and wanted to be future-proof and improve our ability to hire" is a good answer too. There are a ton of good answers, but "because it looked cool" is not one of them.
> interviewers who can't seem to trace my initial line of thought and try to coax me back in it
For sure. I try really hard not to talk when a candidate is talking because I know how hard it is to get thrown off in such a stressful situation. Sometimes the candidate has just obviously misunderstood my question though. This is much more important on technical questions -- I believe a good interviewer will treat the candidate like they are a coworker, and basically solve the whiteboard problem in a fairly collaborative fashion. Obviously the person doing the interview will take the lead, but I'm SUPER happy to answer clarifying questions, and if you start to struggle the right thing to do is to ask some small questions and see if the answers help. Just sitting there banging your head against the whiteboard isn't useful.
> So overall, I believe you're over-generalizing on a few points.
For sure, I'm just trying to give some really brief summaries of bad behaviors I've seen. I've also given a "hire" to people who did one of each of the above; like I said, none of it is disqualifying.
> This is just such a red-flag in an interview. I have worked with some terrible people, for sure. But I would never talk about them in an interview.
Of course! Don't get me wrong, I'm not making it a goal to shit-talk former coworkers in an interview. I try to avoid it, but what am I to do when I have to share that I maintained a Rails 4.0.x app for 16 months and was unable to upgrade it even to 4.1.x because the previous team did a horrific job? It wasn't my fault at all; I actually sank several weekends trying to upgrade it but drowned in mysterious and horrifying errors (an ancient state machine gem with very rigid dependencies and 80+ monkey patches practically made any upgrade impossible) before giving up. Should I lie and say that I suck? No, I don't suck; they sucked, and I ain't taking the blame for them. That being said, I have better uses for my leisure time, and my work time as well -- keeping in mind I'm not expected to refactor at all. So I sank 35-40 hours of leisure time into that problem and moved on.
> Heh, I swear to god when I asked the question, "Why did you pick X technology?" they said, "Because it seemed cool."
You'll hear a very hearty (and non-sarcastic) "I am sorry for that awful experience, man" from me. I never do that; I always explain myself. I went out of my way to rehearse in my spare time, just asking myself the question "why did you pick tech X?" -- I am a firm believer in Einstein's saying "If you can't explain it to a five-year-old then you don't understand it"!
> Sometimes the candidate has just obviously misunderstood my question though.
Perfectly fair. I never objected to that, and I always apologize for the misunderstanding, because in an interview every minute counts.
Thank you for being constructive.
The reason I personally care about the skill of talking about bad coworkers tactfully is that I think it correlates with being able to navigate tricky workplace fuckups without making a big political mess that I have to clean up. The way I would phrase the above situation is:
"I inherited a Rails 4.0 app, and I wanted to upgrade it to Rails 4.1 so that we could take advantage of feature X. This wasn't on our roadmap, so I spent a lot of my own time looking into the feasibility and working on it in my personal time. However, it eventually became clear that too much of the legacy code would have to be totally redone, and there wasn't enough of a business upside."
> I am a firm believer in Einstein's saying "If you can't explain it to a five-year-old then you don't understand it"
Me too! Love this.
You're a better diplomat than me. I could've phrased it almost the same, but with me the tone and the wording depends a LOT on my current mood -- I have to work on this, that's for damn sure.
It does, however, still leave the question "but whose fault is that legacy code?" open, don't you think? With my wording in the comment above I would've aimed at being crystal clear -- and thus somewhat rude in the process, admittedly.
> without making a big political mess that I have to clean up
Could you please clarify that point? I am curious.
Heh, I can do this all day. "We accumulated a lot of tech debt on a previous project because of very short deadlines." Or, "the previous owner learned the technology while building the system -- I'm sure he would have done it differently today." What I'm looking for is that you understand what leads to this sort of situation, and it isn't usually, "the last guy was an idiot." I mean sometimes, sure, but even then it's usually closer to, "the last guy was hired into a role he wasn't ready for and needed a better mentor, or better training."
> Could you please clarify that point? I am curious.
It's really helpful to be able to send an engineer to collaborate with another team without having to worry about whether they'll end up butting heads with someone and making a mess. Tact is important. Sometimes other people are under constraints that you're not aware of, and it's useful to have empathy. Maybe they have super tight deadlines, or maybe they're having to use a technology they've never used before. It's easy to say, "this person is a moron and their code is bad," but if they get the impression that you think that, it can really harm working relationships. At that point, I end up having to step in and smooth the relationship over, and it's not a great use of time.
Talking about bad ex-coworkers with tact shows me that you 1) have empathy and 2) will understand how to navigate similar situations if hired.
I can see that! :D Thanks, you've been very helpful. Believe it or not, I am learning from this interaction.
> Tact is important.
I don't disagree, and I'm with you here. But herein lies the dilemma -- I've been tactful and diplomatic way too many times for my taste. I've had my fair share of politics. I'm not horrible at it; I've simply lost all patience for it, and thus my mood has leaned heavily toward being blunt and somewhat inconsiderate. I'm not outright offensive, but I'm no longer tactful and diplomatic (sigh).
I did start standing up and protesting heavily in the face of bad politics, however. I have less patience now because I expect everybody else to do the same -- and they don't.
That's why I get easily irritated. I absolutely agree with ALL of your points -- the people might have had horrible customers (or team leaders), they might have drained the budget close to the end of the project so they probably had to cut tens of corners, they might have been gaining experience along the way... and a plethora of other possibilities. I agree.
I do have empathy and tact. But I have lost almost all patience in the last few years.
I appreciate that you might not believe such a polarized message in an interview. But I got slightly carried away sharing. :)
At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)
I also totally understand getting frustrated. I've been a total dick to my coworkers in the past, for lots of different reasons: I was burnt out, I was young, etc. I've mellowed out a lot over the years. A good chunk of that is also just being on a good team; it's really hard to be the best version of yourself if you're not well supported.
And here you have myself at 37 being much more impatient compared to 27. :D Truth be told, I am also much more mellow in general, just not in work lately.
> At the very least, just prove that you can keep your mouth shut if you think someone is the worst, heh. That's really all I'm looking for. ;)
Double thanks, this is extremely valuable advice!
I appreciate you taking the time.
The study mixed dummy candidates into the interview flow who gave randomized answers to questions (interviews were structured to somewhat blind this process), and interviewers lost no confidence from those interviews.
Interviewers, when told about the ruse, were then asked to rank no interview, randomized interview, and honest interview. They chose the ranking (1) honest, (2) randomized, (3) no interview. Think about that: they'd prefer to make a prediction from a randomized interview over no interview at all.
Of course, the correct ranking is probably (1) no interview, (2) a tie between randomized and honest. At least the randomized interview is honest about its nature.
Similarly, the people doing the prediction were other students rather than teachers who know better what to look for. The research would better fit HR people hiring for roles they know nothing about than experienced team members hiring for roles similar to what they do.
The research results are massively over-applied and are in no way sufficient to justify the phrase "utter uselessness of X". Unsurprising, given it is research by a 'professor of marketing'.
I assume that interview could be pretty short.
The study was done on students, who almost universally have zero experience selecting people to work under them in an industry setting (or at all). Drawing conclusions from this particularly inexperienced subject pool and then extrapolating out is bogus, particularly given the extremely certain language of the article. The subject pool is at an age (18-22) where people are still figuring out what to make of themselves and others; they have extremely little adult experience of the workplace and judgment of character - indeed, at this age, people are notorious for making bad decisions in their personal lives.
When you look at the actual paper they link, out the window goes the declarative language; instead, the paper is unusually full of weasel words ('can', 'may'). There's a major difference between "utterly useless" and the actual conclusion of the paper: "interviewers probably over-value unstructured interviews".
People think they can get valuable information from this, so they want to meet the candidates, even if they are told the answers are random.
You need to prove (and then convince people) that all this extra information (the impression a person makes when you meet them) doesn't improve your prediction of their future GPA over a prediction based only on their past GPA.
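For intuition, here's a toy simulation (my own sketch; all the numbers are made-up assumptions, not the paper's data) showing that mixing a purely random "interview impression" into a prediction based on past GPA can only make the prediction worse:

```python
import random

random.seed(0)
n = 10_000
past = [random.gauss(3.0, 0.4) for _ in range(n)]      # past GPA
future = [p + random.gauss(0, 0.3) for p in past]      # future GPA: past plus noise
interview = [random.gauss(0, 1) for _ in range(n)]     # "interview impression": pure noise

def mse(preds):
    return sum((f - p) ** 2 for f, p in zip(future, preds)) / n

baseline = mse(past)                                   # predict from past GPA alone
with_noise = mse([p + 0.1 * i for p, i in zip(past, interview)])

print(baseline < with_noise)  # → True: the random signal only hurts
```

Obviously the real argument is statistical, not a simulation, but this is the shape of the claim: an uninformative signal can't reduce prediction error, it can only add variance.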
Remember a few years ago when this forum was drowning in discussions about how to find and hire "10x" people? 98% of employees were useless in the face of the 10xer.
The reality is, most of the time screening for general aptitude, self-motivation and appropriate education is good enough.
I've probably built a dozen teams where 75% of the people were random people who were there before or freed up from some other project. They all work out. IMO, you're better off hiring for "smart and gets things done" and purging the people who don't work out.
I feel like, as a company, you could capture a lot of value by just hiring people and training them. Training, by the way, doesn't have to mean at work. Think how much you learned in college lectures vs. reading the actual textbook. It means going home and studying, and the incentive should be that you're gaining skills that make you more employable, so this shouldn't count toward work hours (if it were, say, a very proprietary old programming language, though, I could see this argument falling apart).
Not saying this is the best way, but I'm shocked that everyone is "trying to find the best". What kind of world will that leave us in if all companies want to hire the same 3% of people? Businesses move slower, talent is lost, and inefficiencies accrue.
If you provide meaningful training, you need to be fair in the application of said training. If you admit that you can train people with common existing skills to do most technology jobs, it's going to be hard to justify your cozy recruiting funnel with a small number of universities, picked on the basis of where a few bigshots in the company went to school.
Then again, theoretically this gives a huge edge to the proactive learners.
However, there is a challenge with having experienced people: after a while they simply can't answer the screening questions and can't get hired. And who's gonna teach the juniors if you don't have seniors???
Personally, I moved to finance. I find that experience, domain knowledge, and maturity are valued more there :D
I'd rather hire a programmer who knew less and could get along with others than a master dev who's a total a-hole.
In my personal experience, structured interviews can be very helpful in determining a candidate's abilities.
It's possible that what you consider to be a structured interview is in fact what this author (and I) would call unstructured. Specifically: if the interviewer has any discretion about questions at all, the interview is probably fundamentally unstructured.
In a structured interview, the interviewer is less an interrogator than a proctor or a referee. Every candidate gets identical questions. The questions themselves are structured to generate answers that facilitate apples-to-apples comparisons between candidates: they generate lists of facts. The most common interview questions, those of the form "how would you solve this problem", are themselves not well suited to these kinds of interviews. It takes a lot of work to distill a structured interview out of those kinds of problems.
This has me mulling whether that might be a better approach to administering law-school exams than the traditional analyze-this-hypothetical-fact-situation approach. (I'm a part-time law professor.)
More generally: I wonder to what extent school examinations can draw useful lessons from job interviews.
-The more hoops a candidate has to jump through, the more likely they are to bail out of your recruiting funnel. This is especially bad for college/postgrad recruiting when you aren't the #1 employer in your field. Everyone wants to work for the Googles and Facebooks of the world. It's hard getting someone to spend a couple hours for your startup job.
-People cheat. We usually issue a short coding project, grade for correctness, then do a code review over Skype or face-to-face. Many candidates turn in the exact same responses. I've even seen people cheat and have a friend do the Skype session with a totally different guy flying out. Do you proctor your test in a secure center? Use an online service to lock down their machine and record? Both are pretty invasive. Switching up the questions constantly is tough and makes your signal noisier.
-Industriousness and raw intellect trump skills/knowledge most of the time. Sure there's a baseline level of skill required to train someone quickly enough, like I wouldn't hire someone who didn't know basic data structures, but work-sample tests are often biased to those with a very specific background. I don't want employees who are great at doing what I need today. I want ones who will be great at figuring out what to do years down the line.
Second, incorporate the work sample tests into your in-person (or even telephone) interviews. Now you have a (hopefully interesting) technical problem they ostensibly just solved to talk about. Your evaluation should be by formal rubric, not interview, but it's easy to see if someone actually did it. We had no problems at all with people cheating (of people we hired with this process, over about 4 years, we fired not a single one).
Finally, I could not be less interested in the kind of amateur psychoanalysis tech interviewers hope they're accomplishing as a side effect of quizzing people about code in their conference rooms.
Here's how I think it works. Skilled interviewers are biased toward rejecting candidates based on any negative impression. Structured interviewing has the same effect. It's the precision versus recall tradeoff. For this use case only precision matters. Extremely low recall is fine.
Also, in the GPA prediction example, the interviewer is penalized for predicting a low GPA for a person who performed well. But in hiring, there is no penalty for failing to hire someone who would have performed adequately.
(Yes, I understand there is an implicit assumption in my argument that candidates are not in short supply, but that's usually true, certainly at Google)
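To make the precision/recall framing concrete, here's a toy calculation (the candidate counts are my own made-up assumptions, not data from any study):

```python
# Precision: of the people you hired, what fraction are good?
# Recall: of the good people in the pool, what fraction did you hire?
def precision(true_pos, false_pos):
    return true_pos / (true_pos + false_pos)

def recall(true_pos, false_neg):
    return true_pos / (true_pos + false_neg)

# Suppose the pool contains 30 genuinely good candidates (made-up numbers).
# Lenient policy: hires 20 candidates, 12 of them good.
lenient_p, lenient_r = precision(12, 8), recall(12, 18)   # 0.6, 0.4
# Strict "reject on any negative impression": hires 5, 4 of them good.
strict_p, strict_r = precision(4, 1), recall(4, 26)       # 0.8, ~0.13

print(lenient_p, lenient_r, strict_p, strict_r)
```

The strict policy hires a better batch (higher precision) while passing over most of the good candidates (much lower recall), which is acceptable exactly when candidates aren't in short supply.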
Only if you have arbitrarily large amounts of time you can spend interviewing. Most of us don't.
People see patterns where there are none. I think this is fundamentally why humans fail at statistics. If every fiber of your being wants to see patterns then you will see patterns. Probably why people hallucinate when in sensory deprivation tanks as well. The brain will make up patterns just so it can continue to see them.
The paragraph right after follows up with the statistical failure that this pattern-seeking leads to:
> They most often ranked no interview last. In other words, a majority felt they would rather base their predictions on an interview they knew to be random than to have to base their predictions on background information alone.
So people would rather do busy work in order to continue to satisfy established pattern seeking habits than figure out a better way.
I'd take that over every other bullshit interview process I've gone through in the past, while also feeling satisfied that it's a more accurate assessment of how I would actually perform at the job.
It's far more extensive than three nights.
Personally I feel like if you can't come up with an interview process that gives you enough confidence to risk a 3-6 month onboarding period on a new hire, you're doing something wrong.
And it's not some mere formality, according to their stats they hire about a third of the people they try out.
This is the part that's crazy to me. If you think you can accurately judge someone based on the work they perform after a full normal workday/week, I have some beachfront property on the moon to sell you...
I have a feeling their conversion rate is so low because a lot of people get another offer during their trial and immediately jump ship.
Or because they're just selecting against good candidates in the first place; if you've got a good job and are performing well elsewhere, you're less likely to jump through that kind of hoop. It's easy to persuade yourself you're hiring the best candidates by being strict about selecting from a pool that is already heavily biased (I'm pretty sure Joel Spolsky wrote something along those lines years ago, so this is hardly a new insight).
Sounds like a recipe for a bunch of bad candidates.
As far as I can tell, there's always a pre-defined pool of questions. Some of them are pretty 'open ended', but still targeted towards getting a good impression of a certain area of knowledge.
That being said, I had an interview with Google at some point in the past where one of the interviewers almost seemed appalled that I didn't know the exact list of items that can be found in a filesystem superblock. But at all other companies it seemed a bit more sane. I guess it's partially a function of the type of personalities a company is willing to hire :)
But even then, for such a process to be rigorously structured, the questions themselves need to be determined a priori and without reference to the candidate's background or preferences. Otherwise:
* You're subject to the interviewer's and judges' (probably subconscious) biases about the merits of different questions.
* You're more exposed to the candidate's own innate ability to interview well by steering themselves toward more easily answered questions.
Do you mean characters to indicate a null child?
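For anyone unfamiliar with the convention being asked about: a common trick (this is my own sketch of what I assume is meant, not the parent's exact question) is to serialize a binary tree in preorder, writing a sentinel character like '#' wherever a child is null, so the tree's shape is unambiguous and the string can be parsed back:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def serialize(node):
    # Preorder traversal; '#' marks a null child so the string
    # encodes the exact tree shape, not just the values.
    if node is None:
        return "#"
    return f"{node.val},{serialize(node.left)},{serialize(node.right)}"

tree = Node(1, Node(2), Node(3, Node(4)))
print(serialize(tree))  # → 1,2,#,#,3,4,#,#,#
```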
I think the Google hiring process is fucked up in a lot of ways, but I don't think reliance on whiteboarding or the selection of interview questions are two of them.
Well sure, but you only have so much time to study for these interviews.
So, you make sure you review your basic data structures, algorithms, and complexity analysis. The things that will most likely show up.
Then you have to consider the more advanced topics. Realistically, you're not going to have an expert-level understanding of all of them. None of them is necessarily an extremely difficult concept, but you need the kind of mastery that lets you understand such a problem quickly and under pressure (since it's a short, timed interview).
The most effective way to study, then, is to simply run through a battery of dynamic programming questions and hope you get lucky and the interviewer asks you a variation on one you studied.
I've had interviewers at more than a few AmaGooFaceSoft interviews ask me slight variations on the change-counting question. They're really testing my ability to recall the solution to that problem, not any sort of deep understanding of dynamic programming.
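For reference, this is the kind of answer that gets memorized (a standard sketch of the change-counting problem; the exact wording at those companies varies):

```python
def count_change(amount, coins):
    # ways[t] = number of ways to make total t from the given coins.
    ways = [0] * (amount + 1)
    ways[0] = 1  # one way to make zero: use no coins
    for coin in coins:                          # coins outermost so each
        for total in range(coin, amount + 1):   # combination counts once
            ways[total] += ways[total - coin]
    return ways[amount]

print(count_change(4, [1, 2, 3]))  # → 4: 1+1+1+1, 1+1+2, 2+2, 1+3
```

Once you've rehearsed this table-filling pattern, most of the "variations" reduce to tweaking the recurrence, which is exactly why it tests recall more than understanding.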
On the flip side, the interviews I've failed almost always involved some pet topic that I wouldn't dream of studying. I once (no joke, and not THAT long ago) had an interviewer at AmaGooFaceSoft ask me to do a bunch of calculus questions involving power series (turn a function into a power series, determine the radius of convergence, etc.). Why would I study that for an interview?
Considering the payoff (potentially a $40,000 to $100,000 raise on a 40-hour work week), the amount of time it takes to study for these types of interviews is trivial. Interviewing skills transfer across many well-paying companies because this type of interview has been standardized, so it's not like you're studying knowledge specific to one company.
MapReduce, Pregel, Bigtable, Flume, etc. are building blocks: they solve some of the distribution problems, but you still need to understand how the algorithms that run on top of them work, on a step-by-step level, to implement on top of them.
Let's say you work on some part of Android. Obviously you need to interact with things like Google's build system which are distributed, but are you really implementing some distributed computation in the course of your every week, or even every month?
I get that Google wants to test during the interview for suitability over a large space of possible specific roles, but I seriously doubt that "distributed systems stuff" would be in the list of top 10 programmer domain knowledges that are useful in those roles. Is it more useful than knowing how to work with version control well? Everyone at Google has to do that, but they don't test it during the interview. Is it more useful than being able to read and write idiomatic and readable Java? They don't really substantially test that during the interview either.
(On the other hand, the things that spawned this conversation were "dynamic programming, parallelism, and networking" and the latter two are much more obviously generally important things.)
There are a lot of other ways to screw up an interview.
* Skills tests (hacker rank)
* Whiteboard interviews
* Unstructured interviews
* Employee referrals
No wonder headhunters have such a good business. Not that they're more discriminating, but they can pretend to be the solution to an intractable problem.
Do companies view them this way though? I've always thought that hiring external recruiters to bring in a bunch of candidates is done to outsource the scummy task of bothering people via phone, linkedin etc.
I mention closed-source for a reason. For technical hiring, there is nothing better than open source. Open-source projects allow engineers and their potential employers to collaborate in depth over time. The company can experience whether the engineer is competent, reliable and friendly. The engineer can judge the team's merits in the same way. And they can both decide whether the fit is right.
Closed-source and/or non-engineering jobs are the opposite. You get a resume, a Github repo if you're lucky, and a half-day's worth of interviews and tests. Then you roll the dice on that imperfect information.
This is one reason why a lot of recruiting and hiring happens through the networks of people that a company can tap into. It may seem corrupt or nepotistic, but the advantage of those referrals is that someone with more information than you is willing to stake their reputation on a candidate's performance.
Large companies with lots of historical data have the opportunity to train algorithms to learn how job applications and long-term performance/flight risk/etc. actually correlate. From what I can tell, most haven't.
I agree with vonnik that running the project as open source definitely gives an edge in the interviewing process. The potential employee has already seen your code (or you can make sure that he has) and knows what he/she is getting into. I would look for links to their contributions to other projects, which would help me evaluate their code.
Of course, less than 5% of job applicants today have had the opportunity to develop for open source. (Either the folks contributing to open source love their jobs so much that they hardly look around, or they are already being offered jobs without looking...)
Kahneman's work on the illusion of validity comes to mind. It kind of boils down to "our hiring process must be right, because we're doing it."
As opposed to a situation where your interviewers are asking off-the-cuff questions, and you're uncertain whether the questions have a systematic bias toward a bad direction -- racism, sexism, or just selecting for something idiosyncratic that adds unnecessary constraint to your candidacy pool. You also now don't know whether it's your policy that's systematically bad (or good!), or that your interviewers are systematically bad.
Or, put another way: if 'random' were better than 'unstructured', you'd never have a round of hiring where there were no unsuitable candidates -- one would always have been chosen.
The best interview I've ever been on was one for a young startup. They gave essentially a homework problem, a day to solve it, and then in the interview we talked about the problem and my solution. The worst interview I've been on was sitting in front of multiple engineers as each one threw out a random CS question (from seemingly the entire space of CS) and asked me to talk intelligently about it. When I seemed unsure of myself, they glanced around nervously and disapprovingly.
Interviews are the worst. I've spent my time trying to bolster my OSS projects, so that I can point to them as evidence of my competence, but I can't help but prepare for the worst anyways.
Competency is only one component of a hiring decision. After some base threshold of competency, the question then becomes whether or not you, the candidate, is going to be someone that the team WANTS to work with.
Bolstering your OSS projects is fine but you'll get a better return on investment for your time to practice "behavioral interview" questions. This is absolutely the hardest type of interview but in the hands of experienced hiring managers it works better than any other technique, IMHO.
If you can find an experienced mentor who will do mock behavioral interviews with you and give you honest constructive feedback, that will boost your interview skills more than anything else.
At the end of the day, it's pretty well known that at any big tech company a second pass through the interview process would easily cut out a lot of the people there. Could be a bad day, a question you don't remember, etc.
I personally now only ask the homework type of question, making sure to give the person plenty of time to do it. It works really well, though obviously some people won't like it (I've yet to run into the mythical person turned off by having to do a simple coding project that HN keeps talking about...)
The best part is that most people at companies like this aren't aware that it's not normal, so they'll be open about it.
In my experience (having hired > 100 engineers), one of the basic problems that tech hiring, as a whole, has is that it misunderstands the point of a technical interview. Organizations and hiring managers see the interview process as a way of improving the brand of the engineering organization: "We have super high standards, and to prove this our interview process is really hard -- therefore, if you think you meet these standards, you should apply." This leads to the current interviewing trends of super-academic/puzzle/esoteric-technology-based interviews. Applicants leave those interviews saying that it was super hard, reinforcing the brand messaging (classic marketing).
Rather, in my experience, the best results come from viewing the hiring/interviewing process for what it is - an attempt to predict future performance (and specifically performance at your organization) using a variety of techniques, of which interviewing is one. In this context of attempting to predict future performance, interviews are not a great tool - better to look at specific past performance.
Past performance is always the best predictor of future performance and the point of a technical interview, in my mind, is to critically inspect that past performance to understand how closely it relates to the future performance that your organization needs.
It seems like it would be much better to instead put the job prospect in situations that model the sorts that would come up at work.
My evidence is anecdotal, but I've spoken with multiple Indian women who are in arranged marriages, have zero real love in the relationship, and aren't particularly happy, but the idea of divorce is entirely unthinkable to them. The lack of divorce probably makes this a 'success' by most metrics, but doesn't seem particularly successful to me.
Not sure how using inexperienced interviewers proves anything. Would have been more interesting to have lecturers interview the students.
> The additional 50 students that the school interviewed but initially rejected did just as well as their other classmates in terms of attrition, academic performance, clinical performance, and honors earned.
She said that, essentially, the interviews were ad-hoc, with the interviewer just coming up with whatever questions they thought relevant based on the resume - often asking the candidate to go through their career history.
I explained that the only effective approach I have found with recruiting is to have a set of pre-defined questions, and each question is specifically designed to give insight into how the candidate meets the pre-defined job requirements. Very much like software development, where test cases are related to software requirements.
I explained also that it is not critical to stick precisely to these questions, but that should mostly be the case - interviews are human interactions and some flexibility is required depending on circumstance.
The HR person then explained this to the hiring managers at the company, and worked with the hiring managers to define interview questions that give insight into the job requirements.
The next two people interviewed got the jobs, after months of no one getting through the interviews.
In the early days of software development, the business was often dissatisfied with software delivered because it simply did not meet the requirements of the business. So the software development process matured and came up with the idea of tests that can be mapped back to the requirements via a requirements traceability matrix. Thus the business has a requirement, the developers write code to meet the requirement, and a test is designed to verify that the software meets the defined requirement.
Recruiting currently has no such general understanding in place: no accepted mapping between job position requirements and quantifiable questions that identify to what extent a given candidate meets each requirement.
Once you get your head around the idea that recruiting should be very similar to software development in this regard, then it is easy to see that ad-hoc interviews do nothing to verify in any organised way to what extent a candidate meets the requirements of a given job opening.
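The analogy to a requirements traceability matrix can be made concrete. A minimal sketch, with entirely hypothetical requirements and questions, of how interview questions might be mapped back to job requirements and checked for coverage:

```python
# Hypothetical traceability mapping: each interview question declares
# which job requirement it probes, mirroring how test cases map back
# to software requirements.
requirements = {
    "R1": "Can design and maintain a REST API",
    "R2": "Can debug production incidents under time pressure",
}

questions = [
    {"id": "Q1", "probes": "R1",
     "text": "Walk me through an API you designed: versioning, auth, error handling."},
    {"id": "Q2", "probes": "R2",
     "text": "Describe a production outage you debugged. What did you check first?"},
]

def coverage_gaps(requirements, questions):
    """Return requirement IDs no question probes (gaps in the interview plan)."""
    probed = {q["probes"] for q in questions}
    return sorted(set(requirements) - probed)

print(coverage_gaps(requirements, questions))  # [] -> every requirement is probed
```

The point of the sketch is only the discipline it forces: an ad-hoc interview, by contrast, never produces a list like this, so nobody can say which requirements were actually checked.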
This creates a system that can be gamed though. The interviewees can pass this information back to the recruiter and the recruiter can coach future interviewees. The recruiter has a vested interest in placing people.
It would be better to have a thousand questions that are randomly selected.
Interviews can be good and bad; I'd venture to say that many a horrible hire has been avoided by having any interview at all. Thus, don't make perfect the enemy of good, and try to improve on good.
The set of potential bad hires is vast compared to the good hires, and that ratio is only remedied by good filtering before and during the interview.
Actually, stories about horrible co-workers that weren't weeded out by interview or any other hiring process are quite common workplace stories in every field, including tech.
In order to ascertain this, I propose job-hiring hackathons. Have the company hold a mini-hackathon, once every 2 weeks or once a month, where all job applicants must show up and work on projects (corporate employees' presence can be optional). Just watch them complete the projects and hire the best candidates.
However, "don't apply for jobs you don't really care about" is very 50/50 advice. Right now I have zero money reserves. If I somehow got fired tomorrow, I'd be in the red after even 2 days of unemployment. So sometimes you have to make a hard choice.
That being said, it's good to be open about this after you get your act together months later and decide what to do with your current employer, during a lunch for example.
CLARIFICATION on the leverage remark: it's my opinion that 99% of the time leveraging an offer from another company that wants to give you more money, to make your old company give you more money, is a huge mistake. Most businessmen HATE being strong-armed, or, to use milder language, hate being shown that their employees have power over them, and this makes them hate you even if they very much need you in a business sense. They end up actively looking for a way to get rid of you, even if it costs them more money and/or stress in the long-term. I've witnessed it.
SOURCE: 4 of my stupider younger acquaintances from 7-12 years ago. And an observation from my first job. After I "strong-armed" my first employer to double my then pretty measly salary, he went on a hunt to replace me (even though it took him around a year to really do it), but I was smart enough to detect the signs and resigned long before he had the chance. No regrets.
Is there any company that tracks who rejects a particular candidate during an interview process, and how often that negative feedback turned out to be true? I guess with the turnover rate at today's tech places, such tracking of an interviewer's record is not really possible?
I always wonder about this.
Once you start collecting data on your hiring pipeline, work-sample hiring becomes so obviously better that it makes little sense to spend the time doing the hard work of building a good structured interview process.
If people utterly refuse to learn from proven mistakes, then all hope is lost. Einstein was right, human stupidity is infinite.
The other jumped around for a few years, got laid off by some company, and recently joined FB.
I guess he was good at whiteboard exercises, though?
Calling them "utterly useless" is utter click-bait.
Alternatively? How is this not the focus of an interview?
Sure, confidence and social skills are important, but obviously they cannot predict a person's actual ability.
This shift MUST be accepted by everybody or ostracism is risked. (Like Trump supporters)
But paradigm shifts take time and the majority of managers still want cogs. Instead of filtering for cogs, though, they have to dress the filters up as filtering for an "I give smart people freedom" team, and the convoluted mental gymnastics needed for this creates shitty interview processes.
All "well, this technique worked for us" stories are mostly not useful because they are just N=1 stories about managers using their preferred filters.
The issue isn't with the filters themselves (all sorts exist) but with a culture that obligates everyone to put on false facades.
People who just want to be paid and can work well need to pretend to be passionate. Managers who want well-paid cogs need to pretend to promote individualistic thinking etc...
It was painfully awkward.
It was also a fantastic way to accidentally discriminate against women and older candidates.
I'm not saying anyone should conduct an interview completely by the seat of their pants, but please don't encourage this foolish consistency.
I've worked in different fields, and I talk to people who work in other fields. Most of those fields work in a way that is described in this article - interviews are question and answer sessions, where people are evaluated by a number of highly subjective criteria. "Tell me about your fundraising experience?" "How do you deal with difficult clients or coworkers". That kind of thing.
Software interviews are exams. They're not "more like" exams, they are flat out exams. There is very little banter. The closest I've come in google and Netflix interviews has been the more open-ended system design style question they often put in there, but even that has an academic test quality to it.
It's pretty much 5 hours of technical exam. "How do you find all matching subtrees in a binary tree" might be a question - and you really are expected to get it written at the whiteboard. "Find all permutations of a set". "Find all square sub-matrices in an N×M matrix." The "top" companies are good at modifying the question so that you must know how to do this but can't just regurgitate it.
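For readers curious what's actually expected at the whiteboard, a typical passing answer to one of the questions above ("find all permutations of a set") is a short recursive sketch like this:

```python
def permutations(items):
    """Return all orderings of items, built recursively:
    pick each element as the head, then permute the remainder."""
    if len(items) <= 1:
        return [list(items)]
    result = []
    for i, head in enumerate(items):
        rest = items[:i] + items[i + 1:]   # everything except the chosen head
        for tail in permutations(rest):
            result.append([head] + tail)
    return result

print(permutations([1, 2, 3]))
# [[1, 2, 3], [1, 3, 2], [2, 1, 3], [2, 3, 1], [3, 1, 2], [3, 2, 1]]
```

The exam-like part is not writing this once; it's reproducing it cold, on a whiteboard, under time pressure, with follow-up twists so you can't just regurgitate a memorized version.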
Alternatively, you may do a "take home" exam. Most recently, I did a mini rails project. I actually liked my result, I kind of enjoyed writing it. However, it was a no-hire, one reason given was that my routing was non-standard. True. I hadn't really thought about it, it was a take-home, so I mainly focused on the UI and code, and just chucked in a couple of named routes for demo and testing purposes. The other reason was that there was some duplicate code (I disagreed and had a reason for this, but there is no chance to defend your code, you write it and send it in, and they say "no hire").
I have no idea if it was a real piece of crap and they were just being nice. It had 100% test coverage and git for version control, and implemented a few features. Unfortunately, like I said, I never got a chance to defend the code.
Our processes in high tech are badly broken. I'm probably done interviewing, my next job will have to be one that doesn't involve a software interview. The routing and duplicate code, along with a google interview, pretty much sealed the deal for me.
My advice to people is (this isn't my idea) be an X that programs, not an X programmer. Coding is an amazing tool for a job, but avoid making it your job. For instance, I actually know a fundraiser who does a lot of data science, and he's a rock star in his field, but I guarantee you nobody asks him to reverse a binary tree in an interview!
Best of luck out there. Our interviewing processes are their own special version of horridness, just not the uselessness described here.
Yet why do some marriages last forever (till death do us part) while others fail miserably or crumble even after 20 years?
The search for the global optimum cannot be performed by asking a set of questions. I argue that it cannot be done consciously. It's a gut/instinct thing. If you have a mechanical approach, anyone can game the system and get a job, because humans can be like chameleons, presenting themselves as the right candidate, and they can study for the interview. The only way, IMO, is to have that 3rd eye or whatever you call it... instinct, gut feeling, etc.
The problem with this conclusion is that instinct and sexism/racism are often conflated.
No good answer.
In your example, how would you structure a series of questions and procedures to limit the risk of marrying someone who will abandon you or cheat on you? I think you can apply good interview process techniques to this quite well!
1. Have they been divorced or cheated on someone before? People who have been divorced before are much more likely to divorce again. The 50% divorce rate in the US is slightly misleading, as many of the divorces are concentrated in repeat divorcees. In fact, among younger (early to mid twenties) first-time marriages, the divorce rate plummets to something like 15-20%.
2. Have their parents been divorced or cheated? Children of divorced parents are much more likely to get divorced themselves.
3. Have they been physically, emotionally, or sexually abused as a child? People with traumatic early childhood experiences are much more likely to develop trust issues with long-term partners, especially if they never had extensive counseling.
4. Do they have a good relationship with their family? People who have a difficult or unstable relationship with multiple family members are more likely to see tumultuous relationships as a norm.
This is all equivalent to reference checks in a job.
Then a long dating / engagement period is necessary. How do they treat you during this period? Do they cheat? Are they abusive? Do they leave you during a period of difficulty? Do you have the same religious views? Do you split housework evenly? Do you both want kids? How do you view money? The three most common reasons for a fight among couples are 1) money, 2) housework, 3) free time (and how to spend it).
People who are otherwise happy and well-adjusted adults who get married and then divorce bitterly after 10 years are not the norm. Most divorces can be predicted, and most divorces happen before 2 years of marriage. If you are aware of the warning signs and are not blinded by a "gut instinct", I think you can definitely minimize the potential for marrying a snake-in-the-grass.
To examine this "unfairness", let's imagine it at its most extreme: a society in which divorcees are so stigmatized that it's practically impossible to ever re-marry; in which children of divorcees are likewise stigmatized; in which victims of child-abuse are further victimized by a society that considers them potential "snakes-in-the-grass". Do we really want to live in such a society?
Obviously I don't think you were advocating this. But it's a thought experiment which demonstrates a classic class of problem: what's good for the individual isn't always good for society, especially taken to extremes.
If you want to make the most accurate predictions possible, you absolutely should not use gut instinct. If you want to be fair, then have a lottery or select randomly. You can't have both. There's nothing remotely fair about gut instinct. See, e.g. judges giving unattractive people twice the sentences of attractive ones. I can provide tons more examples of stuff like that. Gut instinct should be illegal.
Anyone from any background can either rise above their circumstances or fail regardless of the help they get. This should never prevent us from thinking clearly and logically about our future with these people based on their backgrounds or past actions.
I know a couple that divorced because one of them didn't want to be monogamous anymore. They tried to make it work, first one way, then the other, but in the end they couldn't. Do you think the other one is more likely to cheat or file for divorce than someone who's never been married?
Children of divorced parents are more likely to get divorced, but I've never seen that statistic controlled for personal and cultural attitudes about divorce.
What I've read suggests that whether someone has experienced a healthy relationship is more predictive of relationship stability than whether they've experienced trauma.
A flawed heuristic may be better than no heuristic, but too much confidence in a flawed heuristic can backfire.
(Incidentally, most divorces happen after 8 years for both first and second marriages. The 50% figure applies to first marriages as well. That doesn't tell the whole story, because young adults now are divorcing less than young adults a generation ago. On the other hand, we don't know if that will remain true; divorce later in life has increased.)
Well, you don't exactly "date" for years before getting a job. You're acting like people get married on a hunch, or that people don't work hard to present themselves as "marriageable." (Incidentally, it's also not clear to me that a 20-year marriage is a failure, and certainly not in this analogy)
In most of the places I've worked where there are short-term contracts before full-time hiring, the employer has a far better idea of the skills and quality of the candidate.
At the place I work we offer paid internships to college students who haven't graduated yet. By the end of a summers' worth of work, we have a really good idea as to which interns we would like to hire full time, and we give them an offer on the spot (contingent on graduation), no interview needed.
instinct and sexism/racism are often conflated.
Not sure why you think people can't game your gut instinct as well. A whole lot of bullshit artists out there, which may or may not be a requisite skill for the job they're applying for.
This is ridiculous. Mechanical approaches can be gamed, but so can gut instinct. You just have to look at the White House to see that. A lot of gut-instinct voters got hoodwinked by a skilled self-promoter. And he's a gut-instinct guy himself, so he's getting led around by whoever's got his ear.
Gut instinct, though, is worse in several ways. One is that it's not transmissible. As a company grows, how do you scale? Another is that it's not necessarily repeatable. Was that person bad, or is your gut off because you're tired, depressed, or upset? A third is that it can't be consciously improved. If your mechanical process has a flaw, your team can discuss it and come up with solutions. But if your gut judgment isn't good enough, what can you do?
I am a big fan of gut instinct as a component of a hiring process. Often, thanks to experience, we perceive things we have a hard time articulating, and it's wise to pay attention to that. But I think it's a giant mistake to treat our subconscious as some sort of mystic oracle that we must worship and never question.
If it were only a gut/instinct thing, you would have no basis to reject a 'wrong action' other than it feeling bad to the gut.
At some point, the 'set' of circumstances will or will not intersect the 'set' of actions such that it is clear the person acted or didn't act 'correctly' according to one's hierarchy of values.
Ethics, philosophy, and theology exist for a reason - the fact that our society tends to ignore this, presuming rather than investigating the subtleties within a relative-individualist framework, doesn't make those frameworks correct, or even a very coherent set of doctrines with which to gauge things.
But yes, a 30-minute interview, or even a 1-week trial period, is not a very good means by which to judge character, since the potential reward is great enough, and the period to observe inconsistencies short enough, to allow a deviant personality to 'fool people' for the trial period.
It's not the unstructuredness or the judging humans part that's the problem, it's the fact that this is all done at scale by people with more important things to be doing.
If companies all did like Apple in the early days and treated every hire as a bet-the-company decision, bad hires would go away.
And if the organization for some reason needs to work at scale, add an initial lightweight sanity-checking and routing to the correct group. But keep the main hiring decision with the group the engineer would be working in.
I think it would be cool if a company like Google published anonymized data about the correlation between interview "scores" and some post-hire metrics. My guess is the correlation would be poor to non-existent. Picking those metrics is difficult, perhaps an obvious one that is difficult to game is how long the employee stayed with the company. Another interesting one would be anonymous peer evaluations where it's guaranteed no one would see the data points so it wouldn't suffer from the problems 360 reviews have.
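As a rough illustration of the kind of analysis being suggested, here is a toy Pearson correlation between interview scores and one post-hire metric (tenure in months). All numbers are invented for illustration; a real study would need far more care with confounders and the survivorship bias of only observing the people who were actually hired:

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

interview_scores = [3.2, 4.5, 2.8, 4.9, 3.7, 4.1]   # hypothetical panel scores
tenure_months    = [18,  11,  30,  14,  25,  9]     # hypothetical tenure

print(round(pearson(interview_scores, tenure_months), 2))
```

My guess, as stated above, is that on real data a number like this would hover near zero; the interesting part would be publishing it either way.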
I think the best you can do is some sort of mix between a more structured portion trying to gauge where the candidate is in terms of knowledge and intelligence, and a less structured portion where you try to get more of a "feel" for the candidate's personality. You have to realize that even at the best of times it's not perfect. Rejecting someone who can't code at all for a software position should be relatively easy. Ability to tackle bigger things can be gauged by looking into previous projects and through references. I think there's a big region though where the outcome is difficult to determine.
I think back to the very first time I was involved in a hiring decision. The guy was very smart, technically capable, engineering degree; PhD material. Seemed pleasant enough. Got hired and IMO definitely the kind of person a company like Google would hire with their process. He lost interest fairly quickly on the job. Working with him I found he had some very odd, not to say crazy, political opinions. Everything was too boring for him so he didn't really get that much done. Couldn't really work independently at all. He left the job, left the country, and I think he ended up being a cab driver. Not sure. Yet another example is someone fresh out of school with a CS masters degree who despite all the help of the team could simply not wrap his head around the project and become productive. On paper all the right credentials but first real world job and he couldn't cope for some reason. Ended up leaving. I've seen a few CS background people just not find their place. I'm sure we've all seen situations where we wonder how some person got hired and how come they're still there. Over time I think I learned to do a better job hiring/interviewing and have had a stream of pretty successful hiring decisions.
It turns out that it takes many months on the job to really know the fit. Even if the person is capable the specific job or team may not work out. Some people are very good at making it look like they're accomplishing things while they're not really. There's no way any company has a secret sauce of only hiring "great" people and even great people will do poorly under certain circumstances and conversely not so great people can be very successful under the right circumstances. Some people can grow really quickly while others can't.
Google's interviews may well boil down to "prove you're smart enough to work here at this smart company with smart people like me. "
"Oversell yourself so you pressure yourself into high standards and thus end up working late often out of guilt"
"The job is easy but with a lot of specific untransferable domain knowledge. Prove that you're a company man."
"Most of us aren't sure what we are doing, but there's value in our companies general direction. Can you seek out help and thrive in such unknown conditions?"
These simplifications are for the most part "good" and actually reflective of how the manager sees the world. Google is unique in that the direct manager doesn't have as much control, but the manager spirit is a belief in giving smart people freedom.
That gets you in, but your immediate manager may end up not believing in this. Michael O. Church's story comes to mind. In some cases, the manager having no say in the process is bad, as there isn't a good fit.
A manager who believes in treating people like slaves and people who want to be treated like slaves for a specified amount of money is actually a good fit. Someone who wants to be a cog but is given freedom is paralyzed by indecision and vice versa, the person is stifled.
It's hard to see this in the US because there's a strong bias for the free/smart paradigm and all companies have to outwardly present this shared value. In China though, "I'm just a code monkey" is said a lot, because despite having little to no say in hours worked or projects assigned, software pays much higher than other jobs. It's a deal they accept because there aren't any better ones. Or more specifically, because there are too many other people who will take the deal.
Only when a majority of people demand a base standard of life can you prevent a race to the bottom inflicted by employees themselves.
The key is to see through the game that companies arr required to play (at least in the us) and track down the exact team you want to be a part of, figure out the actual culture (having a taxonomy beforehand is useful) and then deciding if it's for you. (Given your BATNAs)
Because of this game, all marketing about being a great place to work is mostly BS, because that's the ONLY thing they can say. I say mostly because the marketing is a result of a real cultural shift in realizing how to effectively manage knowledge workers.
So Silicon Valley does in fact have a more progressive attitude to management styles, unlike the East Coast which is more about playbooks, but it's far less than what you would think. The majority of managers still subconsciously reject new team management styles, and shitty interviews are the result of having preferred filters but NEEDING to dress them up in all sorts of convolution.
It's also limited. I can be much more productive when I do one thing full time than a bunch of things part time. Depth takes time, as does keeping up with changes. You just can't get as far with fractional attention as with serious focus.
And uncertainty is also uncomfortable. Many people just want to settle into a solid, reliable situation and do the work. They want to be able to plan their future with some confidence. Even if they can make more money juggling a variety of things, they'd rather make less and have less chaos.
So I don't think we'll ever get to the point you suggest.
I too quit my freelancing for a stable remote job, but looking at it 18 months later I found out that I sucked a lot at managing my freelancing gigs.
So IMO with some good contract and budget management -- and confidence, and actually having a choice -- you can reap most of the benefits of freelancing and almost none of the drawbacks, if you can take the thought of switching customers every 3-6 months.
I understand not everybody can pull this off -- I am not sure I am yet ready to do this that well. But I've seen people doing it very successfully and almost stress-free.
Yeah well, when you're asking questions of someone who looks thoughtful only very briefly and then answers almost immediately, it makes sense that you'd feel you "got to know" them better than you would someone actually considering their answer.
That might introduce a confound or two that the study then proceeds to completely ignore, and even draws conclusions past, lest someone accidentally draw other conclusions.