The two questions I ask every interviewer (wesleyac.com)
307 points by deafcalculus on Nov 1, 2017 | 303 comments


Each time I read a post like this I say to myself “they’re not wrong, but something is also missing.”

If there were a way to evaluate a candidate based on how well they would do at their actual job, in an hour or less, that couldn't be gamed or cheated, we'd all be using it.

That isn’t to say we shouldn’t work on improving the hiring process, but how much time, both from the perspective of the candidate and the company, is too much time to invest in a potential hire?

Would we get better signal if you came to work for us for a week and paired with everyone on the team? Sure! But how many candidates can take a week off from their current job? The author talked to 11 companies. Could they invest 11+ weeks in their job search full time in order to find their next role? What about the loss in productivity? Can you have more than one candidate in the office in a given week? What if your ramp up time for full time employees is actually longer than a week? Are you biasing for people who ramp up fast instead of people who will be ultimately more productive and impactful?

I often feel that engineers write these posts because they know that they themselves are good at their job, but feel it is silly to have to develop an alternative set of skills in order to signal that they are a good hire. Or they get rejected and blame the process or those alternative skills. I’ve felt the same way and wanted the system to be better tailored for my skill set, but balanced against the time investment for some alternatives, I see why we have the process we do.


I use a work sample (short paid contract), but it's not perfect and boy is it labor intensive.

As a hirer, you really can't win. There's not one hiring process nor guiding principle that doesn't seem to bring out the pitchforks of those who were frustrated, disrespected, or rejected by said process:

- Show us your open source work. (You're excluding all but a lucky few who have the privilege of writing open source code!)

- Okay then, show us a personal project or some work you've done in your free time. (What, so I'm expected to live eat and breathe code 24/7 to get hired?!)

- Well, how about a short contract/work sample? (How am I supposed to find the time to do that? I have a day job and a life!)

- Shall we try whiteboarding/coding tests then? (This is so insulting! Solving CS puzzles isn't what the job is about!)

One cannot, sadly, rely on the résumé. I have interviewed multiple self-deemed "experts" in such-and-such language, only to find that they could not even write a basic for-loop on the whiteboard.
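
(For concreteness, the "basic for-loop" bar being described is roughly FizzBuzz level, the screening test mentioned elsewhere in this thread; a minimal TypeScript sketch of that level:)

    // Classic FizzBuzz: roughly the level of "basic for-loop" at issue here.
    for (let i = 1; i <= 100; i++) {
      if (i % 15 === 0) {
        console.log("FizzBuzz");
      } else if (i % 3 === 0) {
        console.log("Fizz");
      } else if (i % 5 === 0) {
        console.log("Buzz");
      } else {
        console.log(String(i));
      }
    }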


I mostly agree, but he still makes some good points. For example, "Give your interview to current employees".

One of my coworkers and I often joke about how we'd never pass our current hiring test, and yet we're among the highest paid and most productive people at the company. I've expressed this to the hiring manager and he still thinks it's a good approach. They have literally said they want more people like me... so they brought me into the hiring process, but I disagreed with them on almost every candidate, so they brought me back out of the hiring process. Some people don't know what they want out of interviewing a candidate, they just go with the status quo interview and assume they're going to feel good or bad about someone based on their results.


Most of these interview tests are game-able: there are a few resources you can study hard, and you'll be able to pass the interview questions without many problems. This has the perverse effect of raising the bar, since everyone does so well on the problems that more filtering is needed! But in the end, you only select for who is better at studying for the interview, not who is a better developer. Fun stuff.

If you onboarded in a time before this, or you aren't currently practicing for interviews, it is very likely you wouldn't pass the interview now, just as you likely wouldn't pass a college physics exam right now even though you did when you were in school.


I've found that it's usually obvious when someone is gaming my interview. Their answers are too correct, and too smoothly given. Those candidates are very quickly rejected, and the recruiting agency is basically told to scram.


Perhaps, but here's a personal anecdote from when I was interviewed years ago. The interviewer asked me one of the trick problems popular at the time, which I knew the answer to. Ironically, it was not a problem I'd studied for; I just happened to have seen it elsewhere years prior, found it interesting, and so remembered it and its solution. To be clear, I didn't solve the problem the first time I'd seen it, either.

Nevertheless, I gave the interviewer the detailed, step-by-step, golden answer she was looking for, and she was obliviously impressed. After a few minutes, however, youthful stupidity^W^Wmy integrity got the better of me and I told her that I'd seen the problem before. So I got a second trick problem, one that I didn't know and hadn't studied for, and naturally didn't arrive at the golden solution.

I didn't get called back.

If you were to ask me to solve a problem that's on my study list today, believe me that I've got the selling part down pat now.


If I study a bunch of interviewing problems really hard, I'm bound to do well on your test. I can study hard; I proved that in university and grad school. Your interview process probably isn't that unique, and for bigger companies like Google there is plenty of study material out there.

And everyone is doing this now; it isn't just a certain few bad recruiting agencies and applicants. Heck, even the good applicants (the ones that you want to hire anyway) have to do it to get through the loop.


It's almost as if the people who want the job the most would study for it the hardest.


Best company I've been hired at did a sample project.

They had the project all set up and ready to go. All I had to do was fill in some of the more complicated bits (hooking up to the database using LINQ to SQL, and implementing some queries to search by name/album and two or three other fields).

Then display them in a certain order.

I was given an hour to do it. After I was hired, I was told I was apparently the only one they interviewed who could do it. The ironic part was I'd never used LINQ to SQL before and had studied it the night prior. (It was in the job posting.)


At my last job, I set up something like this. We were specifically interviewing people who claimed to have frontend and Angular experience - I confirmed this on every resume before the interview and again with the candidate before explaining the test. To test it, I gave the test to 2 coworkers who each completed it in under 15 minutes. Candidates were given an hour for it.

The test was a JSFiddle pre-filled with Angular boilerplate. It had comments in the places where things needed to be added and filled out. Just to be clear, there were no gotchas or tricks involved. It was very simple: "get JSON from this public HTTP endpoint, display it on the page in a table and then add a filter field. If there's time, add some CSS to make it look pretty."
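
(A sketch of roughly what a passing solution had to do, in plain TypeScript/DOM rather than the actual Angular boilerplate candidates were given; the endpoint URL, field names, and element IDs here are made up:)

    // Fetch JSON, render it as a table, and filter rows by a text field.
    interface Item { name: string; value: string; }

    async function render(filter: string): Promise<void> {
      const resp = await fetch("https://example.com/api/items"); // hypothetical endpoint
      const items: Item[] = await resp.json();
      const table = document.getElementById("results") as HTMLTableElement;
      table.innerHTML = "";
      for (const item of items) {
        if (!item.name.toLowerCase().includes(filter.toLowerCase())) continue;
        const row = table.insertRow();
        row.insertCell().textContent = item.name;
        row.insertCell().textContent = item.value;
      }
    }

    // Re-render on every keystroke in the filter field.
    const field = document.getElementById("filter") as HTMLInputElement;
    field.addEventListener("input", () => render(field.value));
    render(""); // initial, unfiltered render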

Candidates were encouraged to use docs and Google if they couldn't remember off the top of their head how to do those. More than one candidate copy/pasted from Stack Overflow during the interview and while I wasn't impressed, I didn't count it against them.

Several candidates completely failed at it, and I suspect it was a combination of nerves and lack of experience (with one candidate it became clear that he had never used Angular before, and he never got past the "get JSON from an HTTP endpoint" step). During the test, I was more than willing to talk through the code with them and help them figure out the best way to do it. The one candidate who passed it was hired, and they were very successful at the company.


Was it marked as a requirement by the job posting? Because if not, that's just a test of knowledge, not ability.


The test is the ability to apply knowledge, which is probably the only thing they need.

The commenter did say it was mentioned in the job listing.


Yes, I understand that it was mentioned, my question was whether it was mentioned as a requirement.

If the listing just said "LINQ-to-SQL knowledge a plus" but then gave you a test that actually required foreknowledge of it, then that's kinda shitty.


Shall we try whiteboarding?

I've used this in the past. But I don't ask for actual code. I'm using it to see the candidate's thought process.

More important than any pseudo-code is how well they communicate what they are thinking (whether that be through writing on the wall or conversation). Can they break the problem down into workable pieces? Do they ask questions about the requirements? Do they try a simple solution and iterate on it? Do they get hung up on small details and miss the bigger picture?


Sysadmin/infosec guy here. We do the same thing. I was shown this a while ago and I've used it on other candidates I've interviewed. Propose a problem, then as they start working through it, throw other wrenches into it or state that their solution didn't work. See what tangents they go off on.

Another neat one I was shown, as a whiteboard test, was to write down all the letters, A-Z, then come up with the name of a command for each letter. It's a double-sided test -- it shows your ability to respond quickly, how you respond, and the ability to back up your responses.

Of course, joking questions like "vi or emacs?" can be good as well to see a candidate's character as to how they respond to technical inside jokes like that.


Not sure about the alphabet one. I got to K before I got stuck and thought: so in that moment, if I forget that there's kill, which starts with a K, why does it matter? How does it relate to the fact that I know to use kill for sending signals? The recall for a situation where you'd use something and the recall for a letter-matching game seem to be very separate. (On the other hand, I remembered join, which I've used maybe once in my life...)


It's multi-faceted. It shows your ability to think quickly and how you solve multiple problems. If you stop in the middle of the list and try over and over to come up with an answer, how will you react to a real problem? Or do you just give up when it gets hard?

It's testing more than just rote memorization. You also have to explain what any commands you write down actually do.


You mean... "Hmm, my assumption, from looking at your resume and portfolio, is that you aren't quite ready to hold your own in a spontaneous, 1-1 discussion about an actual, real business or technical problem. And not only that -- I actually have severe doubts not only about your relevant experience, but about your basic intellectual capabilities. So let me give you this little made-up toy problem, with this little checklist in my mind about where I think you're going to screw up. And watch you perform. Being as I can tell you don't have a lot of options right now, and will put up with pretty much whatever I dish at you."

That might be OK for someone with, say, <2 years experience out of college. But for anyone mid-career or higher, it comes off as silly and condescending, and as not exactly a good use of their precious, irreplaceable time. And more to the point, conveys the exact opposite of the message you should be sending: that you know they're smart and intellectually self-sufficient, already. And that it's up to you to win them over, and convince them that you're worthy and interesting, and that they should jump over, and join your mission.


WTF? Maybe I didn't communicate it well, but the whole point is the discussion, not the answer...

More important than any pseudo-code is how well they communicate what they are thinking

It's not a test of technical skills. And you'd be amazed at the number of resumes that signal somebody can have a quality discussion about technical problems, but the candidate can't communicate for shit.


But the candidate can't communicate for shit.

You know, it just might be... the interviewer who's not so hot at communicating.

Or is "communicating" okay enough, but following a template that's broken to begin with - and pretty much designed to make candidates feel alienated - or at best, highly unenthused at the prospect of playing along with.


I generally agree, but I was involved in an internal hiring interview where a candidate was extremely qualified. In that situation, I was able to ask a hard and open-ended problem about graph traversal on the whiteboard. And it was fun to see the interviewee's approach. "Wow, I haven't done anything like this in years," was the reaction.

Obviously that is an example of a low-stress interview. When the candidate was stuck I gave hints, and the whiteboard portion was only intended to last 10 minutes. It was also only 5% of the feedback I gave. The majority was comments on the candidate's knowledge and experience.

---

That's definitely how I'd like my whiteboard interviews to go. I've experienced the back-to-back grind of interviews full of softball problems to be written in syntactically correct code. Getting dinged for errors, or for a close-but-wrong answer, is a pain in the butt. When you write a program and realize at the end that it's wrong, the question "What do you think the right answer is?" is frustrating. I can't exactly say: well, I wish I had 30 minutes to start over. If you tell me the answer, we can talk about why I was wrong.

Those interviewers have been engineers peeled off of their daily projects though, so I can't say I fault them individually.


Interviews are naturally awkward.

If you had that attitude during one of my interviews I'd show you the door.

I can't begin to count the number of impressive resumes that lead to unqualified candidates.


If you had that attitude during one of my interviews I'd show you the door.

And with that attitude on your side -- "if this isn't going well, then by definition, it's the candidate's fault" -- let's just say we're clearly not a match.

I can't begin to count the number of impressive resumes that lead to unqualified candidates.

Ditto for job ads leading to unimpressive companies with obtuse interview processes.

Interviews are naturally awkward.

As currently conceived. So maybe it's time to start thinking about them differently.


You're awfully full of criticism today. In your perfect world, what would the interview process be? How do I, the hiring manager, ensure you are what you claim to be, without wasting either of our time or hurting your feelings?


Yikes. So which of the other 3 listed methods do you prefer? Or something else?

Also, I think you’re somewhat confusing whiteboarding and being interviewed by an egomaniac. If someone is a total jerk, I imagine any interviewing methodology will suck.


Disclaimer: I've written code outside of work for my own satisfaction.

> - Okay then, show us a personal project or some work you've done in your free time. (What, so I'm expected to live eat and breathe code 24/7 to get hired?!)

I have software I can show off to potential employers, but I didn't write it for them. I'll admit that I got really lucky to both enjoy coding and to have the privilege to do so, but I didn't have to give up my life in order to get it done. I play video games many more hours per week than I code outside of work.

If I had to choose between otherwise equally qualified mechanics, I would tend to prefer the one with a hobby car at home.


Whereas I would choose the one with the cleanest shop.


I think I'd prefer the one who tells me the potential consequences of not doing a repair, without me having to specifically ask about it (or pry it out of them) and then verify the answer from another source.

Dealer service departments always fail this test. They will never tell you that it is not necessary to do a repair. Some will even say "I don't feel comfortable letting you drive it off the lot like this." Then you take it to another mechanic that says, "Yeah, this might affect your fuel economy by a tiny amount. I wouldn't worry about it in a car this old, unless you plan on moving to California."


Great story, but can you make it relevant somehow to the discussion about interviews?


Rather than hire the person who continues to do his day job at night, or who maintains an organized workstation, I'd hire the person that actually provides value to the company.


What if you accept applications for 30 days, then write the name of each applicant on a card, put the cards into a deck, shuffle the deck, then draw cards off the top of the deck and make offers until you run out of open positions?

It isn't merit based, but it's hard to argue that it favors one candidate over another! And maybe it's mimicking the effectiveness of your current system!
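
(Tongue in cheek, but the mechanism is concrete; a uniform shuffle is a few lines of TypeScript, e.g. Fisher-Yates, with the applicant names below standing in for the cards:)

    // Fisher-Yates shuffle: every ordering of the deck is equally likely.
    function shuffle<T>(deck: T[]): T[] {
      const cards = [...deck]; // don't mutate the input
      for (let i = cards.length - 1; i > 0; i--) {
        const j = Math.floor(Math.random() * (i + 1)); // pick from the unshuffled prefix
        [cards[i], cards[j]] = [cards[j], cards[i]];
      }
      return cards;
    }

    // Draw cards off the top until the open positions run out.
    const applicants = ["Ada", "Grace", "Linus"]; // placeholder names
    const openPositions = 2;
    const offers = shuffle(applicants).slice(0, openPositions);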


This has the added benefit that you only hire lucky people. No one wants to hire an unlucky person.


This has a certain genius to it. The same genius as sleep sort.
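
(For anyone who hasn't seen the joke: sleep sort "sorts" by scheduling each value to print after a delay proportional to itself. A TypeScript rendition:)

    // Sleep sort: smaller numbers wake up and print first. Correctness
    // depends entirely on timer granularity and scheduler luck -- that's
    // the joke.
    function sleepSort(xs: number[]): void {
      for (const x of xs) {
        setTimeout(() => console.log(x), x * 10);
      }
    }

    sleepSort([3, 1, 4, 1, 5, 9, 2, 6]); // prints 1 1 2 3 4 5 6 9 (usually)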


This! We have tried all of the approaches listed there, and at least one person has found fault with each of them. That said, we follow pretty much the order you listed. The people invested in their job search do respond.


I agree that there is no "perfect" interview process, but I have seen from experience that there are ways to go about it that reduce the crappiness for both sides, beyond what a lot of "top" companies are currently doing.

How about sticking with the standard phone-screen/on-site process, but emphasizing writing real code, on a computer, solving realistic software engineering problems? I've interviewed at several companies (including one of the "big four") that took this approach and found it was the best overall balance, IMO. You get evaluated in a more realistic setting than a whiteboarding interview, but it takes the same amount of time and also doesn't depend on "side-projects" or "take-home" assignments.

Some general examples from my own experience:

- Here's a poorly-written / buggy program: Find the bugs and do some refactoring

- Here's a loose spec for a small program: Spend an hour designing & coding it up (on your own, no one looking over your shoulder the whole time), and then walk through it with the interviewer AFTERWARD and explain your design / answer their questions about it.


I've learned that a decent proxy for this mythical silver bullet of hiring is basic questions about tooling and networking.

Gauging someone's vast, claimed experience in some language or technology can be tricky, but if they can't explain what a rebase is, or how maven dependency management works, or how SSL works in general terms, it tells me a lot about what kind of technologist they are.


An alternate way to do this (which avoids false negatives because they don't know the same trivia as you) is to ask them to tell you about a tool/library they are excited about. Or ask them to explain some part of their last project in detail. And then keep asking probing questions to see how deep they can go. You get the same sense of "how deeply does this person actually understand the tools they use" but you dramatically reduce the risk of hitting one of their (rare?) blank spots.

Side benefit #1 is you get a sense of what they think is important

Side benefit #2 is you find out if they can accurately gauge how well they know what they think they know

Side benefit #3 is you might learn something new!


This is the kind of thing people in electrical engineering do.


As a preface, I’m not saying that those are inherently bad choices per se, because the underlying ideas are obscure enough that anyone who knows them at this moment probably knows other things you care about.

That said, never forget that ultimately your filtering criteria leak, and candidates will begin coming in with knowledge JUST about that, because you care.

Furthermore, you’re currently optimizing for trivia obtained in CS education that skilled programmers from other STEM fields and non-traditional backgrounds will lack. Besides people directly working on SSL, very few people need to know about symmetric encryption schemes at any practical level, and at best remember it at the same level I just stated it at, as a factoid. So you’re going to mostly find people who’d know that factoid and who’d remember that factoid; put another way, canned-answer spouters.


If you set up a web server for anything in the last few years you would have picked up something about SSL. Even if you just skim the rationale before copying the configs from Mozilla.

https://wiki.mozilla.org/Security/Server_Side_TLS#Modern_com...

If you used git, you probably even did a rebase. Or at least tried to. Or at least saw it as an option.

These aren't obscure university topics.


Right. They were chosen because experienced Java devs should be extremely comfortable with them. Not just familiar, but basically experts. And if they're not, it's a big red flag that casts a shadow over really anything else they have to say.

I'm sure each language/technology has a small list of universals that experienced practitioners should be quite comfortable with. Of course there are exceptions, it's not perfect, but my point is that these things are a good proxy for estimating overall competence level.


By what measure are you prioritizing Java+git over e.g. Java+Perforce in terms of universals?


Thank you for saying this! I have 30 years of paid CS experience, a degree, have owned my own software company, have worked for Fortune 50 companies, and don't use any of those technologies. (I have used git, and used rebase, but like everything in git, it's so poorly named for what it does that I don't remember exactly. I wouldn't even want to try to explain it at this point.)


And I bet those are exactly the kinds of things you're interested in and could answer. What about people who just aren't interested in maven dependency management (seems awfully uninteresting to me)? If your process is set up to find people who think like you and are interested in the exact same things you are, your process is broken.

And just to be constructive: you have the nugget of a good idea, that a technologist should have fairly detailed knowledge in some domain of interest. Finding what those areas are and getting them to speak about them beyond a superficial level of detail might be a good hiring criterion.


And that's one of the things done by companies in other businesses, who don't seem to have the same sorts of hiring problems that software companies do.

I'm an electrical engineer, and I've never been asked to design something from scratch in an interview. I have been shown existing design drawings or products and asked questions about them, which leads to various discussions. What we do in interviews is talk, both about what the company works on and about what I've worked on.


I agree with your idea of a proxy. Others seem to be interpreting it as asking trivia questions about your favorite pet projects and technologies. I don't think that's what you're doing.

In any given specialty, I think good developers will pick up the stuff around that specialty. Java devs will learn about Maven and git and maybe some JVM customizations to tweak memory usage. Rails devs will learn about which gems are widely-used to solve specific problems, and git, and maybe some devops/deployment stuff with SSH keys and Capistrano.

The particulars matter less than that there's some amount of holistic knowledge of their specialty. Of course one can argue the specifics (what if they've used svn but never touched git, etc)... but if there are no specifics -- if the candidate only knows their specific language of choice but nothing outside of it -- I would argue that they're probably not that great.


As a candidate, I'm most excited when there are short contract/work-sample coding challenges with problems directly related to the job I'm applying for, and I wish more companies did them instead of whiteboard stuff. It tells both parties the most information out of all of those options. To the employer: "Can this person do the job, or not?" Pretty simple. To the candidate, I think a coding challenge can tell you a lot about the company and what type of work you're going to be expected to do. I think it weeds people out much more effectively than whiteboarding. Making time for this is part of the job search, the same as scheduling an onsite.


On the third point, I think that of the four counterarguments, that one is the least convincing. If a candidate is applying for a new job and is asked to do a work sample or an exercise but can't find the time (exercises I've seen are generally 3-5 hours long, and you usually have a week to finish them), then why are they applying for a new job? You could easily find 3-5 hours in a week to complete some exercise, even if it's broken up into multiple coding sessions, if your goal is to work at a new company.


You are missing tests. Either short-timed coding or written tests.

The best way is probably a mix of some of those. Yet, I have no idea what mix :)


For what it's worth, we also think work sample assessments are the most accurate representation of whether a candidate will succeed (and there's a good deal of industrial psychology research that backs this up).

If you or anyone else is interested in a platform for technical work sample assessments, come check us out! www.headlightlabs.com


> I often feel that engineers write these posts because they know that they themselves are good at their job, but feel it is silly to have to develop an alternative set of skills in order to signal that they are a good hire.

It does feel silly. But sadly it's often the way the world works. Standardized tests carry many of the same problems and benefits. Getting a perfect SAT score doesn't really speak to your level of intelligence as much as it speaks to your ability to dedicate lots of time and focused energy toward preparing for an arbitrary system of evaluation... which turns out to correlate somewhat well with being successful in school. There are few false positives but many false negatives, just like with whiteboard-style coding interviews.

Is this fair? Maybe, maybe not.

What bothers me more than anything is that people either don't realize this distinction, or aren't honest about it. I was once rejected from a "big four" company after a whiteboarding on-site, and I was advised by the recruiter to "spend a year working somewhere with a very large codebase" and try again... As if that would contribute anything toward my ability to pass a whiteboarding interview. We all know the real way to pass whiteboarding interviews is to practice whiteboarding interview problems. A lot. I've been working as a software engineer on "very large codebases" for a while now and can confirm that I am no better prepared to pass that interview.


I remember reading another story on HN this morning from someone who applied to the big four (well, five) and got accepted by all of them. How did they do it? By optimising solely for the hiring process and little else.

And, to be honest, I wouldn't hire this person - they are so focused on optimising towards the METRICS that they ignore what abstract goals the metrics are trying to measure. And that can easily come back to bite you as an employer; you can even see this behaviour if you look at what weirdness ML systems can optimise towards without the right constraints. It's a natural and easy way to do things, but it probably doesn't make good developers.


I quite disagree. To me this is a rather capable individual I'd hire, simply because they are a good adaptive problem solver. Problem: stupid interviews. Solution: game the system to get what they want, a job.


Every large dysfunctional project I've been on has been run by "good adaptive problem solvers" who prioritized metrics over the success of the project.

Honestly, the best managers I knew who ran the most successful projects were all bad "adaptive problem solvers", who never really played the classic game and instead focused on project success over politics.


In the back of my head I wonder though - Once you hire them, will their goal be to optimize the same metrics you also care about?


As with all things in life, what you reward is generally what will be optimized for. So make sure your corporate culture doesn't reward dumb metrics.


> And, to be honest, I wouldn't hire this person

So... you wouldn't hire a person who came in prepared and performed well according to your metrics and standards? Or you have no metrics or standards, and just throw the chicken bones to make a decision for each candidate? Or, I guess a third option might be that you do not tell the candidate anything about your interview process before they come in, all but guaranteeing that they cannot do any meaningful preparation?

I'm having trouble seeing why a candidate working hard to prepare for an interview could be considered a negative signal, which seems to be what you're saying. Correct me if I'm wrong, by all means.


You wouldn’t want to hire them, but you probably would as they optimized themselves to your interviewing process. You’d have to add something to your process to weed them out, but then how could you tell? As hiring processes are all necessarily heuristics, they can all be gamed in some way.

This is why referrals or at least reference checks are so important.


I agree with rafiki16: the candidate "hacked" the interviewing problem - great! The stuff you bone up on for a technical interview isn't going to hurt your work performance; it's just an adaptive response to a broken interviewing process.


Don't hate the player. Hate the game.


There is. Take a set of realistic, representative tasks from your day-to-day work, and decouple them from the context as much as possible to limit the need for company-specific services, software, or knowledge while performing the task.

There's a few reasons this is rare:

* Setting up a test like this requires prep work, whereas a chat, recycling old tests and downloading questions from the internet does not.

* It's rare for hirers to evaluate or be evaluated on their own hiring process.

* Lots of developers want to cargo cult Google's hiring process so they ask you to whiteboard algorithms.


This is what we do. We created a small programming test that's very representative of 80% of our work. Candidates are asked to only spend 2 hours on it. If they can follow simple instructions and write a reasonable implementation we'll have a face-to-face.

This saves us a huge amount of time and I don't feel it's overly burdensome on candidates.


>I don't feel it's overly burdensome on candidates.

I do. After doing a couple of these "2 hour" projects for companies I suspect were not even really serious about hiring, and not even receiving a courtesy rejection email, I swore them off forever.

These days I just point the company to my github and am explicit that if that isn't enough to grant me a face to face I probably don't want to work there.


Someone sounds pretty entitled. Do you know how burdensome it'd be to verify that your repos were fully yours, and how much energy an employer would have to exert to make sure everything was connected/architected well, versus using a template I'm already familiar with (having created it), one that can be slightly altered to keep it from circulating the internet?

2 hours is not unreasonable for a take home project with the entire internet at your disposal. People need to be motivated to join the company they're applying to! (Anything more is definitely unfair for the candidate and doesn't scale well when reviewing code)

As for this notion that 2 hours is a lot of time: plenty of engineers would rather have that than waste a few months memorizing algorithms they'll rarely use. It's pretty common for algorithm tests to be an hour long anyway.

So many software engineers have it backwards. Companies don't work for you; you're not even in the door yet. Unless you're a VP or principal engineer with a stellar background, your tasks are replicable, and most employers aren't going to be drooling to get you on board, as an entitled, piss-poor attitude is going to cause more friction than it's worth.


Eh. There is this thing called a market. Sometimes producers have the whip hand, sometimes it is the consumers. We have markets in labor too, and as anyone who (for instance) weathered the 2000 dot-com bust knows, employers in tech don't hesitate to use their leverage when the wheel turns.

I refused a homework project for one firm[1]. It was a highly specific problem that only made sense for one specific version of an app server, and was really a lot of work with some tricky edge cases. Call me entitled, but after I made sure I understood what he was asking, I told the interviewer where he could put his homework.

I think it is important to keep a sense of the power dynamic. It is really easy to start moralizing to the powerless (here, employees) when they do find themselves with a bit of leverage for a change. Not only is it a bad look (punching down is for insecure assholes); systems break down without feedback and, sometimes, pushback.

[1] Well known, I'm not naming names because it probably was a fluke.


>Someone sounds pretty entitled.

Maybe the guy who wants me to volunteer 2 hours of my time before deigning to have a conversation with me?

>You know how burdensome it'd be to verify that your repos were fully yours

I've never interviewed anybody who lied about the provenance of their repos. I'm pretty sure if I did, it would come out in a 5 minute conversation during the interview.

If one or two gifted liars slipped through the hiring funnel into the interview stage before being found out I wouldn't view it as a tragedy.

>This notion that 2 hours is a lot of time, well plenty of engineers would rather have that than waste a few months memorizing algorithms they'll rarely use.

2 hours is fine for an interview, but it's way too long to spend on a speculative application for one company.


> 2 hours is fine for an interview, but it's way too long to spend on a speculative application for one company.

So... you limit yourself to no more than 4 hours' effort applying to any company? Doesn't that limit your choices greatly?


I limit myself to about 15 minutes' effort before talking to somebody who works at the company.

I don't find it limiting, no. I am only very rarely asked to jump through a bunch of bullshit hoops.


That's not what I asked.


I think "2 hours" was in quotes because maybe many of these assignments are nominally 2 hours but realistically require 4-6. Just speculating though.


Consider that you're interviewing at 20 companies: What if they all did this? That's 40 hours, or an entire work week of unpaid work, for the chance at getting to the next round. Seems unreasonable to me.


I never hand out assignments like this (or have in-person interviews w/ a similar task while having the internet at their disposal too) unless the person's already sent in their CV/Resume, had a phone-screen and they seem like a good prospect.

Regardless, you're going to have to put in time unless you're already trusted by one of the seniors/leads/managers... if that's the case, then there's no coding assignment.

If you're interviewing at 20 companies concurrently you should probably not interview at so many companies at one time and narrow down who you would most likely want to work for.


>I never hand out assignments like this (or have in-person interviews w/ a similar task while having the internet at their disposal too) unless the person's already sent in their CV/Resume, had a phone-screen and they seem like a good prospect.

If they seem like a good prospect then you can signal your seriousness by granting a face to face interview / test.

By throwing out homework assignments (which cost you 0) you are signaling either that they do not seem like a good enough prospect to be worth your time or that you simply view their time to be worth vastly less than yours.


> If they seem like a good prospect then you can signal your seriousness by granting a face to face interview / test.

Yup, that's what I will do occasionally too. If the phone-call goes well, I'll invite them onsite for that coding task... no algorithms. I like being flexible with them, if they are more comfortable doing it at home due to scheduling, they can hack away at home.

>By throwing out homework assignments (which cost you 0) you are signaling either that they do not seem like a good enough prospect to be worth your time or that you simply view their time to be worth vastly less than yours.

I generally stick with working at start-ups. I need to know that the engineer I'm bringing on can handle high-pressure situations when we have to deliver milestones. I can totally understand why some potential employees I've handed this assignment to become upset or flustered. But if they don't complain at all and do an exceptional job, it's a good indicator that they'll generally do well, and at the very least be open to critique so they become a better engineer.

These requirements are absolutely not appropriate for more established/larger corps just as start-ups aren't for everyone.


> These days I just point the company to my github and am explicit that if that isn't enough to grant me a face to face I probably don't want to work there.

I'd be happy with that if the projects looked like they were yours. I can count on one hand the number of candidates who have had GH profiles, and even fewer had anything worth looking at. All I really want to see is a few examples of coding style and basic thinking.

To me, being an engaged participant in the open source community is a massively positive signal, even if it's just reporting issues and the very occasional PR.


Agreed. If it is a "2 hour" project, have that be part of the on-site interview (with the expectation that people might not complete it).


I applied to one position here on an HN job thread a bit back. It looked right up my alley: doing IoT, 3D modeling, circuit design, and glue code. So I did the initial phone screen. Then they gave me the zinger: we want you to make a webapp for us. We expect this will take 8-12 hours (?!).

I responded back that that's billable time right there. It would require me to take PTO at my current job and do their work unpaid. And that, I will not do.

I kindly told them where they could put their job. (They still advertise jobs monthly, but they seem legit aside from that onerous 1.5 days of work. And no, I won't mention whom. Hopefully they'll rethink their policies, but alas...)


On behalf of many, many people: Thank you! An 8-12 hour unpaid work assignment is an entirely unreasonable request.


Sure thing!

Yeah, I did ask around their area with fellow engineer-y types. They are a legit company, and the webapp wasn't a way to get free work. Think of it as a standardized approach, rather than relying on bullet-point resumes that verge on untruth.

To be honest, I'd prefer they have a sit-down with me. They can do their fizzbuzz or linked list test, but I want to get a feel for the problems they're fighting with. I've always liked deep technical discussions, comparing how I would do something vs. how they would or did do it. It's a bit more subjective, but I'd argue that discussing technical aspects, and asking how I (the interviewee) would do things, tells you more.
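
(For reference, the "linked list test" mentioned above is typically something like reversing a singly linked list; a TypeScript sketch:)

    // Reverse a singly linked list in place by re-pointing each node.
    interface ListNode { value: number; next: ListNode | null; }

    function reverse(head: ListNode | null): ListNode | null {
      let prev: ListNode | null = null;
      let curr = head;
      while (curr !== null) {
        const next = curr.next; // remember the rest of the list
        curr.next = prev;       // flip this node's pointer
        prev = curr;
        curr = next;
      }
      return prev; // new head
    }

    // 1 -> 2 -> 3 becomes 3 -> 2 -> 1
    const list: ListNode = { value: 1, next: { value: 2, next: { value: 3, next: null } } };
    const reversed = reverse(list);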

I offered an IoT project I've developed myself in lieu of it, so they could comment on my code quality. They declined that and wanted their webapp.

They're missing out :) Although I've got an interview with GitLab coming up. Everything I've seen about how they operate and treat their people is just awesome. Regardless of whether I'm a good fit for the position they're offering, I still think highly of them. :)


The thing is, as gilleain says, the majority of promising-sounding candidates aren't worth progressing with. This is a combination of:

* Failing outright to do anything even close to the requested task

* Not even basic error handling

* No tests

* Using generated code where it's not appropriate, or not bothering to implement e.g. catch blocks they've generated

The instructions explicitly tell candidates what we're looking for, so it's not worth chatting to someone without doing this filter.


A take-home assignment is basically an inquisition-style "accuse yourself" situation. Implemented the requirements? Sorry, there are no tests, and we took those as a given. Skipped a non-trivial part of the task? Sorry, the implementation is incomplete. Used XHR? Sorry, the Fetch API is the thing. Used the Fetch API? Sorry, async/await is the thing. You sent a URL to a repository? Sorry, we wanted an attachment. You sent the code as an attachment? Sorry, our spam filter threw it away. ENOUGH.

I'm applying simultaneously to tens of companies; if you insist, I can send you a link to my repositories with all the take-home assignments I've done so far.


Just a month ago I had a company task me with implementing an xSV parser with four cases, the fourth being "arbitrary delimiter." I implemented all cases except this last one (only because of time) and included the tests and benchmarks offered as "extra credit."

Ghosted.
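
(The skipped "arbitrary delimiter" case is usually just a one-parameter generalization of the fixed-delimiter ones; a naive TypeScript sketch, assuming the assignment had no quoting or escaping rules:)

    // Naive xSV parser, generalized over the delimiter. Real CSV quoting
    // and escaping are deliberately out of scope here.
    function parseXSV(text: string, delimiter: string): string[][] {
      return text
        .split("\n")
        .filter((line) => line.length > 0)
        .map((line) => line.split(delimiter));
    }

    console.log(parseXSV("a,b,c\n1,2,3", ","));  // [["a","b","c"],["1","2","3"]]
    console.log(parseXSV("a|b|c\n1|2|3", "|"));  // same parser, arbitrary delimiter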


The best jobs I have ever had spent less than 2 hours total in on-site interviews. The worst companies that I never heard from again burned a whole day running me through the gauntlet.


Last interview cycle I got 10 hours of homework total from three of my prospects.

It’s not that it’s 2 hours, or four. It’s that you’ve given two hours to twenty people and so have five other companies.


We do the same. It's amazing to me how many people with superficially impressive CVs do one or more of:

* Cheat

* Fail to follow simple instructions

* Write no tests for their code (even the given examples!)

* Just horribly fail to implement an algorithm

All in all, it's just a filter to be used before proper face-to-face interviews.


> Write no tests for their code (even the given examples!)

Well, I've been on the fence on this one. With imperative languages, testing is pretty much a requirement because of side effects and scope-creep (nahh, throw it into global).

Tests can be a good sanity check for simple things like "I fixed a bug where X is supposed to give 22. Let's verify that"... But the problem is that the tests are then code that has to be updated, and can rot, as changes to the main code occur.

I've preferred a more functional method. I don't want to test - I want to prove a function handles its inputs and provides the correct outputs. Unless there's a good reason for state, I prefer keeping functions clean. And even then, with state (say, a tally function for bandwidth), I prefer to have the variable updated by its own function. It keeps clean/non-clean separate.
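
(A contrived TypeScript illustration of that clean/non-clean split, using a made-up bandwidth tally: the arithmetic is a pure function you can verify directly, and the mutation is isolated in one small wrapper:)

    // Pure: same inputs always give the same output, so it's easy to "prove".
    function addBytes(total: number, packetBytes: number): number {
      return total + packetBytes;
    }

    // Non-clean: the only mutable state, updated by its own function.
    let bandwidthTally = 0;
    function recordPacket(packetBytes: number): void {
      bandwidthTally = addBytes(bandwidthTally, packetBytes);
    }

    recordPacket(1500);
    recordPacket(512);
    console.log(bandwidthTally); // 2012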

Now... one thing that's absolutely not acceptable is not commenting code. "Magic sequence in perl doing 6 things unintuitively" is not code comments. :(


One person's "Magic sequence in perl doing 6 things unintuitively" is another person's "I actually know Perl, and the standard functions being used here, so this is nothing special".

Comments are good, but what's acceptable as a comment is entirely subjective to someone's competence in the language in question.

For example, if you can't look at a Schwartzian Transform[1] in Perl and understand what it's doing, if not immediately then after a moment or two of looking at the steps, then you do not actually know how to program in Perl; you're just muddling your way through. Nothing in the transform is special, and if you are confused by anything in it, that's the equivalent of being confused by arrays and simple array syntax in C. A comment of "optimized sort on computed X attribute" should be more than enough, but to anyone slightly unfamiliar with Perl it will seem woefully inadequate.

1: https://en.wikipedia.org/wiki/Schwartzian_transform
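
(For non-Perl readers: the Schwartzian transform is just decorate-sort-undecorate, so an expensive sort key is computed once per element instead of once per comparison. A TypeScript equivalent, with a stand-in key function:)

    // Decorate each item with its computed key, sort on the key, then strip it.
    function expensiveKey(s: string): number {
      return s.length; // stand-in for a costly computed attribute
    }

    const items = ["banana", "fig", "pear"];
    const sorted = items
      .map((item) => [expensiveKey(item), item] as [number, string]) // decorate
      .sort((a, b) => a[0] - b[0])                                   // sort on key
      .map(([, item]) => item);                                      // undecorate

    console.log(sorted); // ["fig", "pear", "banana"]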


It is Java, so yeah. Of course production code could have all sorts of levels of tests and QA and all that jazz.

Really, though, if I'm implementing an algorithm that I've never coded before, it would seem strange not to write at least some kind of verification of its correctness.

Comments can be language-dependent - Perl yes, Java maybe (on an API, definitely) - and situation-dependent. I wouldn't care about comments on small amounts of code with good variable names and sensible structure, for example.

Edit: also, apologies for the formatting, I hate typing on a phone!


> Write no tests for their code (even the given examples!)

Just curious, do the instructions ask for tests? If one of these "programming quizzes" asked me to implement algorithm XYZ, I would follow it to the letter and implement only the algorithm.


We don't ask them to actually provide their tests, but implementing a non-trivial algorithm without tests is optimistic at best.

We give them a couple of example (input, output) pairs - surely you would at least run your implementation past these, right?


I find that the quality of candidates is a function of the quality of the hiring funnel and the money offered.

If your funnel spits out 'surprisingly' shitty candidates then it's probably not the best funnel.


There are also surprisingly good candidates. This is just the top of the funnel, but agreed that it could be better.


Absolutely.

I'm not going to name or link, but this is a service we've started providing and have a growing library of assessment techniques (and analytics on the effectiveness of those techniques). This reduces the prep-work required.

The aim is to get to the point where you can say, e.g. "I need a Finance Director for an SME" and we can pre-populate a set of sensible defaults for you to tweak if desired. Baby steps though..

Once we build out those reports we'll also be able to report back to senior management on hiring manager performance. Metrics like typical time-to-hire & volumes etc. but also whether there's unusual demographic bias, the satisfaction of rejected candidates, and some others. I'd also like to make those metrics public (companies would have to opt-in).


I agree, and I'm not sure what these kinds of posts are trying to accomplish, because they're speaking from hindsight and ignore the many other counterfactual possibilities that don't happen to support their point. Every alternative I've seen proposed to FizzBuzz and whiteboard-style tactics is flawed or self-selecting in its own way, and often comes at a higher cost to the interviewee, the company, or both as the candidate pool grows. Cultural fit getting a lot of flak these days for reinforcing unconscious bias is hardly proven fact, but even if it were, it would not be too hard to identify and reach consensus beforehand on what the team lacks, e.g. someone who is a testing fanatic, a security wizard, a deep-dive experimentalist, etc., and to instruct all interviewers on how to spot those needed elements.


> Cultural fit getting a lot of flak these days for reinforcing unconscious bias is hardly proven fact

Talking about things as a "proven fact" (or not) is not a helpful mental model. It's an impossibly high burden of proof that would mean we never learn anything. All we will ever have is a bundle of findings from studies with finite funding and various shortcomings. Right now we have a fair number of studies that found statistically significant increases in productivity from more diverse teams, and (to my knowledge) none showing the reverse. Perhaps there are some but they weren't published.

Is this a conclusive law of nature? No.

Does it seem probable on balance that the types of situations studied experienced the effects reported? Yes.

It's also not useful to talk generally about "cultural fit" because it's broad enough to be essentially meaningless. Companies that hire for "fit" well do so by unpacking exactly what behaviours they're looking for (entrepreneurialism, directness, risk-taking, etc.) and finding ways to assess those qualities.

If folks want to hire people they want to drink beer with, that's their choice, but failing to unpack the components of culture fit means that their choice is uninformed and unmeasured.


It would be nice if you could link to some of the key studies in this area. Just because there were a few flawed studies is hardly a reason to take it as actionable information.

Has it even been determined that diversity leads to productivity, or is it just that diversity and productivity are both results of the same underlying thing?


> Just because there were a few flawed studies is hardly a reason to take it as actionable information.

It's interesting that you (someone apparently unaware of the work in this area) are reducing my "a fair number" to "a few" and tainting them as "flawed" when all I did was acknowledge "various shortcomings". Were you conscious of trying to minimise the weight of evidence without having actually seen it? I'm not judging you for it, it's normal; I'm just calling it out so that you're aware if you previously weren't.

As for whether or not this is actionable, I'm not aware of anyone acting on the basis of diversity carrying a productivity dividend. As far as I've seen (my company works in this space) it's used as a supporting argument to help justify measures to reduce the effect of unconscious bias (a separate area of study) which carries its own dividend, i.e. better employees.

> Has it even been determined that diversity leads to productivity or is it just that diversity is a result of the same thing the productivity is?

That's right, correlation and causation are not the same.

It's been correlated using real-world company data, theorised under social capital models, observed under lab conditions relating to jury decision accuracy, and causally shown at a GDP level using computer models. There may be others. There may be publishing bias. There may be flaws. I've already spent enough time writing this, so you can google them yourself. These studies each have various shortcomings, e.g. self-reporting, scale, and assumptions (in the case of the models), and they relate to specific contexts rather than generally... so this is a case of the balance of evidence rather than "proven facts".


Have you noticed how you dodged the request to link the related studies by bringing disproportionate attention to the rest of the grandparent's post in a condescending manner? I'm not judging you for it, it's normal. Just calling you out.

Also can you link what you consider to be representative studies on the topic?


You're right, it was a little condescending, I was irritated by the way hueving minimised the evidence without apparently knowing anything about it.

(I'm sorry, hueving, that was immature of me.)

wanderer2323, I used to cite studies but no longer consider it a useful way to discuss topics like this on HN. Too often it descends into methodology theatre and too rarely does it result in useful dialogue.

So, with my apologies, no. I won't link to studies. If you're curious I've shared enough background in other comments for you to get started with your own research.


> Too often it descends into methodology theatre and too rarely does it result in useful dialogue.

That is what happens when you present weak evidence.

I have yet to see any convincing studies despite it being straightforward to test in small repeatable experiments.

It is not hard to get 20 different 5 person teams to perform a complex task for a couple of weeks and see what impacts different types of diversity have.


[Just to address the other points you raise because I was in a rush last time]

> I have yet to see any convincing studies despite it being straightforward to test in small repeatable experiments.

If these experiments are so "straightforward", and your resistance to these ideas is well-founded, then where are all the experiments demonstrating that there is no such effect?

Alternatively if you believe publishing bias and/or feminist conspiracy are combining to quash all this budding anti-diversity research then why don't you run one yourself and show everyone how it's done?

If you're not up for doing the actual experiment I'm happy to pass your protocol around for some feedback, maybe I could find someone to run the experiment for you. Maybe I can even help with funding.


> Alternatively if you believe publishing bias and/or feminist conspiracy

Why does it have to be a conspiracy?

There are two major groups pushing this research:

1) Companies looking to make a buck by providing "solutions".

2) Social justice types.

Both these groups want there to be serious problems caused by a lack of diversity.

And then you look at the expressed ideologies of the researchers themselves and it is clear they also want the same outcome.

As such what makes the research that comes from these groups any different from research coming out of corporate and political think tanks?

I wouldn't blindly trust the Cato Institute so why would I blindly trust these researchers?


Sure, publishing bias exists, as does confirmation bias (on both sides of any debate). Those biases do have an effect, and 'social scientists' do tend to be left-leaning in my experience.

But, those people have done the work, usually had their protocol evaluated, written up their paper, had it reviewed, and had it published (under their real name).

Hacker News critics on the other hand have (typically) not done the work, and have not tried to get their experiment funded and written up, and make sweeping judgements like "methodologically bankrupt" based on unqualified criticisms like "poorly designed" (no specifics or evidence of understanding), usually under an alias with effectively no repercussions.

You yourself claimed that designing convincing studies without shortcomings is "straightforward", implying that the lack of convincing (in your eyes) evidence is therefore proof that the evidence base can be broadly ignored without actual study. That's rationalising, not logic.

Similarly, when challenged to provide a design yourself you change the subject. My offer is genuine by the way... our investors include some of the most influential organisations in this space. If you have a great experiment design I'm happy to put it to them and give you credit.


> You yourself claimed that designing convincing studies without shortcomings is "straightforward"

Yes because - like I said in my other comment - studies on effective teamwork have been done for decades.

Point me towards a well cited meta-analysis on the impacts of race and gender diversity where the studies are repeatable experiments involving complex problem solving in a small team.

> Similarly, when challenged to provide a design yourself you change the subject.

Because you are not acting in good faith.

We both know designing a proper study would take at least a couple of months of work.

What I have asked from you on the other hand is to provide a link to an existing meta-analyses that supports your claims.

For someone who works in this space and claims there is significant evidence this should be a 5-10 minute exercise.


> Because you are not acting in good faith.

I'm sorry, ad hominem attacks are the sign of a debate that's over. Bye.


> then where are all the experiments

That is exactly what I am asking you.

Where are all the small-scale experiments that show diversity of race/gender in a team leads to better outcomes?

There are a lot of small scale studies on effective teamwork. I went through them a long time ago when I did my business degree.

I don't recall any of them suggesting diversity of race or gender was a large contributor.

Business schools have been running small scale studies on effective teamwork for decades.


> That is what happens when you present weak evidence.

In a perfectly rational world you would be right. That's not where we live, though, is it?


>Too often it descends into methodology theatre and too rarely does it result in useful dialogue.

That must be because your studies have glaring methodology defects, like most of the social studies these days. (It's easy to predict the things you'll see: vanishingly small sample sizes, no preregistration, data massaging, etc.)

You, however, are trying to get other people to update on methodologically bankrupt information and, when asked for proof, are taking the high ground: "I won't link to studies, it leads to methodology theater and I'm above that". With my apologies, your opinion that "diversity leads to improved productivity" is unsubstantiated and likely false, and you delude yourself if you think you are right.


> That must be because your studies have glaring methodology defects...

How delightful it would be to live in such a simple world.

Here's a study for you:

Handley, Brown, Moss-Racusin, and Smith (2015) found that men are more likely to judge evidence of gender biases as low quality, and that the effect is particularly pronounced in STEM fields.


Must be good to live in such a simple world where you can expect to bring a study about gender biases into a discussion about diversity studies and have it prove or explain anything. Anyway, the problems with the study you linked are glaringly obvious -- poorly designed study, exceptionally poorly designed controls, no preregistration (that I can see mentioned, at least), sample sizes that make me really want to check their math -- and these are just from the abstract.

If this is what you have in mind when you speak of research, then you can believe whatever you want, really. false -> true === true after all


> Must be good to live in such a simple world where you can expect to bring a study about gender biases into a discussion about diversity studies and have it prove or explain anything

As I'm sure would be clear to you if you stopped to think about it... you strongly implied that studies are evaluated based on merit alone. This shows that they are not.

Does the study look at precisely this situation? No, that would be an unlikely coincidence.

Does the study demonstrate a gender bias in the evaluation of gender-related research, even amongst those knowledgeable in the subject? Yes.

Does the study suggest that STEM fields could be more susceptible to that? Yes.

> If this is what you have in mind when you speak of research...

As I've been saying from the beginning, this is about the balance of probabilities. By all means create your own study (and get funding, and get it published) that counters the findings. Until you do that, or someone else does that, or someone finds that the studies in question are fraudulent, unreproducible, or so flawed as to have literally zero value, then the balance of probability is against you.

You're welcome to subscribe to the view that any evidence less than a large-scale randomised controlled trial is worthless, but there aren't the funds for that, and may never be.

So your choice is between learning nothing, or learning 'maybe something'.

If you choose to learn nothing, or more accurately to deny 'maybe something' then I'm curious to know why.


There are two separate claims:

1) Ensuring that people are not excluded in hiring, or antagonized through workplace hostility, over factors not relevant to job performance like age and gender, under the vague cover of "cultural fit" - because skills, knowledge, and talent relevant to business success can be found across demographic subgroups. That's supported by studies.

2) Packing a company with people selected for those non-relevant characteristics, on the assumption that diversity itself is the causal rather than the correlative factor. That claim I believe is not supported, and I challenge you to cite evidence for it, or else to qualify the statement "statistically significant increases in productivity from more diverse teams".


I've addressed this in a bit more detail in other comments, if you need more you're welcome to look into it.


I don't see citations in your other comments, nor sentiments that clarify controlling for factors identified as occupationally non-relevant. If you're advancing the idea (which I deduce from your other comments) that people who are diverse insofar as they have separately relevant attributes which the individuals themselves perceive as incompatible and deleterious, then I have no argument about it, and can admit I don't know anything about the research specifics in that sub-problem of management. But you should explicitly spell out that this is your claim. Furthermore, that's an internal problem, not the problem of rejecting people at the hiring phase for being a bad "cultural fit". For example, if you don't write tests, "move fast and break things", and feel strongly about that, but we have mission critical code and clients who will drop us for competitors if our software is buggy, you're flat out not going to work here. The "diversity" you bring will be objectively bad for the company, barring some proven god-like irreplaceable ability to deliver value on another front.


If you're advancing the idea (which I deduce from your other comments) that people who are diverse insofar as they have separately relevant attributes which the individuals themselves perceive as incompatible and deleterious, then I have no argument about it...

This looks like an incomplete sentence to me; what is the "it?" "People who are diverse insofar as [condition] which they themselves [see negatively]" defines a certain (sub)set of people, but the idea, assertion, or conclusion that you're agreeing with appears to be missing.


I have to confess I don't follow a lot of your comment, but I wouldn't describe writing tests (or not) as diversity.


> Right now we have a fair number of studies that found statistically significant increases in productivity from more diverse teams

Are there?

Diversity defined how?

Repeatable experiments or statistical analysis of groups of companies?

It could be studied experimentally and so I am sure it has been.

The studies using statistical analysis to compare across companies are always weak and should be taken with a grain of salt.


> Are there?

Yes.

> Diversity defined how?

Depends on the study. Usually gender or race.

> Repeatable experiments or statistical analysis of groups of companies?

Depends on the study. See my other comments in this thread.

> It could be studied experimentally and so I am sure it has been.

It has been. See the study into racially diverse juries making more accurate judgements.

> The studies using statistical analysis to compare across companies are always weak and should be taken with a grain of salt.

Everything should be taken with salt, because salt is awesome.


> Right now we have a fair number of studies that found statistically significant increases in productivity from more diverse teams

People (managers/HR/VPs of Diversity/etc.) do not understand the implication.

Here's an intellectual exercise:

Would a team of two people - one a foaming-at-the-mouth antifa member and the other a card-carrying, garb-wearing KKK dude - be very diverse? Certainly. Would this team be productive? Unlikely.


I'm not sure your hypothetical is as clear cut as you think it is.

> ... be very diverse? Certainly.

Very diverse?

Okay, for the sake of your hypothetical let's assume that this antifa member has a different educational and socio-economic background to the KKK member. Otherwise they just differ on one axis and may be identical on every other.

Let's also assume that this anti-KKK person isn't Daryl Davis, who has talked around 200 KKK members into hanging up their hoods by engaging with and befriending them. He's an unusual case.

> Would this team be productive? Unlikely.

What's the timeframe? Can we assume that these people have rent to pay and need this job?

Social capital models indicate (and experimental evidence supports) that people in diverse teams feel like they've been less productive (despite actually being more productive). Reduced social capital makes people less comfortable collaborating and more willing to disagree, but the flip-side of that is reducing group-think.

Think how little group-think would be evident between these two individuals; they'd disagree on everything. Yes, it would be uncomfortable, but they both need the job, so there's a limit.

But let's go national with this pairing hypothetical. The Southern Poverty Law Center estimates the KKK at 5,000-8,000 members. Assuming 8,000, that antifa is similar in size, and that all are working-age, then in a US-sized population of 320,000,000, maybe halved to focus on working-age adults, you'd be lucky/unlucky to see many (if any) instances of this type of pairing. Even assuming that all 8,000 were paired directly against a mortal enemy, that still only represents 0.01% of the total pairs. Even if we assume that each organisation has 100x its official membership in sympathisers, that's still only 1% of the pairings (and remember we cheated to make sure they were always matched up).
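
To make the arithmetic easy to check, here it is as a quick Python sketch; the membership figures and the population halving are the rough assumptions stated above, not hard data:

    # Rough assumptions from the text above, not hard data.
    population = 320_000_000
    working_age = population // 2          # ~160M working-age adults
    members = 8_000                        # high-end KKK estimate; assume antifa matches it

    total_pairs = working_age // 2         # pair off the entire working-age population
    enemy_pairs = members                  # cheat: every member lands opposite a mortal enemy

    print(f"{enemy_pairs / total_pairs:.2%}")        # 0.01% of all pairs
    print(f"{enemy_pairs * 100 / total_pairs:.2%}")  # 1.00% with 100x sympathisers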

If we can't assume that these people need their job then that narrows these numbers (and hence the probabilities of occurrence) to include people in well-paid positions or with inherited wealth, or who would rather be homeless than work with someone they disagree with politically.


> but feel it is silly to have to develop an alternative set of skills in order to signal that they are a good hire

The problem isn't that they have to develop a set of skills that, combined with their true skills, gives the interviewer a glimpse into their abilities.

It's that it's an entirely different set of skills. We might as well be asking people to learn woodworking in order to showcase their software skills. The ability to sound knowledgeable and make someone like you in 50 minutes is pretty orthogonal to writing good code as a member of a team.


> it's an entirely different set of skills

I'm not sure whiteboarding-style interviews are entirely orthogonal. They are definitely not a measure of real software engineering ability, but I suspect there is decent correlation - enough to give companies a sufficiently low false-positive rate, which is what they care about. The flip side is that you get a high false-negative rate, and it sucks to be one of the people who end up in that category, because you know you're good enough but you just didn't pass the arbitrary filter that requires that "entirely different set of skills". I've been there, as I suspect most of us have. It's not fun, but it's hard to see what would motivate these companies to change their practices as long as they feel their false-positive rate is low enough.
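
To see why that trade-off can look acceptable from the company's side, here's a toy model in Python; the base rate and pass probabilities are invented for illustration, not taken from any study:

    # Toy numbers, invented for illustration only.
    base_rate = 0.30     # share of applicants who would actually do the job well
    p_pass_good = 0.60   # chance a strong candidate passes the whiteboard filter
    p_pass_bad = 0.10    # chance a weak candidate slips through anyway

    hired_good = base_rate * p_pass_good          # 0.18 of the applicant pool
    hired_bad = (1 - base_rate) * p_pass_bad      # 0.07 of the applicant pool

    weak_share_of_hires = hired_bad / (hired_good + hired_bad)
    strong_rejected = base_rate * (1 - p_pass_good)

    print(f"weak hires among those hired: {weak_share_of_hires:.0%}")    # 28%, down from 70%
    print(f"strong candidates rejected: {strong_rejected:.0%} of pool")  # 12%

Under these made-up numbers, even a crude correlated filter cuts the share of weak hires from 70% of the pool to ~28% of hires - the number the company feels - while the 12% of good-but-rejected candidates stay invisible to it.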


A standardized whiteboard interview is probably not completely orthogonal. But too many whiteboard interviews don't have a standardized set of questions, or a standardized set of criteria with which to measure them.

And the ad-hoc tests have very little validity, relying more on what questions the interviewer asks and how confident and charming the interviewee is. Not to mention lots of credentialism: "Candidate Joe didn't answer the question, but he's a Googler, so that's probably just because I did a bad job explaining it." "Jessica didn't answer the question because she didn't know anything about software."


> I often feel that engineers write these posts because they know that they themselves are good at their job...

The Dunning-Kruger effect suggests that, almost certainly, some non-zero proportion of the engineers who have written this type of post are not.

https://en.wikipedia.org/wiki/Dunning%E2%80%93Kruger_effect


While what you're saying is true, I think the Dunning-Kruger effect has an inverse bell curve based on level of confidence vs. knowledge.

High confidence, low knowledge. Moderate confidence, moderate knowledge. High confidence, high knowledge.

Of course, that only applies to the things the expert knows like the back of his hand. There are things I am 100% confident I know, because I've done them thousands of times. Then there are some advanced things I still question whether I truly understand, because I've only successfully done them a few times.

Of course, when I was younger, the things I'm 100% confident about now went through the phases of "oh, I can do this super easy" to "holy crap, this is harder than I thought" to "I've done this thousands of times, I got this".


> There are things I am 100% confident I know, because I've done them thousands of times.

If you've done thousands of job interviews (for example), does that make you an expert at job interviews? Or a terrible candidate?

Now: how many people who pass every job interview go on to immediately write an article about questions to ask at a job interview?


Dunning-Kruger is only valid for simple tasks. The study tested, for example, whether participants got a joke. The participants, btw, were all Cornell undergrads, NOT the general population.

The "Dunning Kruger effect" actually reverses and there's a negative correlation for highly skilled and highly knowledge-based tasks like "engineering".

http://www.talyarkoni.org/blog/2010/07/07/what-the-dunning-k....

"...some studies have shown that the asymmetry reported by Kruger and Dunning (i.e., the smaller discrepancy for high performers than for low performers) actually goes away, and even reverses, when the ability tests given to participants are very difficult."

So please don't treat Dunning-Kruger as a way to claim that anyone who's wrong, or who you disagree with, is suffering from inflated confidence and a lack of skill. Do note the potential for confirmation bias, as well as inflated confidence in the very claim that others might be prone to it.


>The Dunning-Kruger effect suggests that, almost certainly, some non-zero proportion of the engineers who have written this type of post are not.

The effect can't suggest anything; you would have to test your own hypothesis on your own sample set of people writing those posts. Extrapolating from other studies on other sample sets used for completely different inquiries is bad science.


>If there was a way to evaluate a candidate based on how well they would do at their actual job, in an hour or less, that couldn’t be gamed or cheated, we’d all be using it.

Right, because there's currently a ten million dollar bounty up at way-to-evaluate-a-candidate-based-on-how-well-they-would-do-at-their-actual-job-in-an-hour-or-less-that-cant-be-gamed-or-cheated-bounty .com - which everyone knows about, has known about for decades, and checks regularly. The site can't be gamed and it's run as a public service. Literally within a day, and I mean a day, of anyone ever using the winning method a single time, God himself will write up the method and submit it on the interviewer's behalf (invisible hand hypothesis), the site will judge it correctly and approve it (just world hypothesis), and the next day we'd all be using it. Oh, also there's a genie that makes people use the winning entry - I forgot to mention that - otherwise people might not all use it even once it's well-known.

I mean just consider the wildly implausible set of circumstances it would take for not everyone in the world to be using the best interview process in the world!

It's clear that it just doesn't exist. How could it?

/s

-

Edit: someone didn't like this. Just imagine what mechanism you suppose exists by which everyone would learn the perfect interview method from each other. Even if such a method existed, how would everyone come to know it?


The "good fit" sent shivers through my spine!

I'm a bit of an oddball in terms of background, and in my early 20s I interviewed for really interesting jobs at really interesting companies (Intel, Oracle, Digital, and some more). Every single time, the tech staff I would have worked with finished the interviews by saying "You just need one last interview with HR, it will be nothing, looking forward to working with you!".

And every single time the answer from HR was that " I would not fit in".

After 4 or 5 companies turning me down I thought, fuck this, I don't want to fit in, I want to produce good quality work and make customers happy, and started to look exclusively for contracts. Thirty years later, I'm still contracting, and every single one of my customers has been very happy with my work.

Funnily enough, nobody has tried to hire me as an employee in those 30 years! And they were right, I probably wouldn't fit in. I don't accept the status quo, I don't buy into cargo cults, and I'm not interested in coming in to wait for the hours to go by.

Remember, "fitting in" is not necessarily a good thing. People hire me because I don't.


The purpose of HR departments is to avoid liabilities. The purpose of rejecting “bad fit” candidates is to avoid losing their job if someone who looks or acts odd ends up being a liability. “You should have known he’d burn the building down. He was weird. He had a nose piercing!” So they avoid hiring anybody with a characteristic that could be seen as a red flag in retrospect.

It’s a product of the corporate mentality.


Makes for very bland, boring companies. It probably also explains some part of the "lack of diversity": this person is so different from anybody else we've ever hired, let's just say "they won't fit in". This is an easy loop to get stuck in, and it can really hurt people from minorities or with different backgrounds.


Unfortunately, the word “diversity” isn’t usually used in that context in the corp world. Disruption is only something to talk about; they don’t want to actually do it, nor hire anyone disruptive to the current culture.


I got my current job without interacting with HR at all. Phone screen with the CTO. Phone screen with the engineering manager. In-person interview with the CTO, then the engineering manager, then the team I would be joining. In a perfect world, prospective employees (of any discipline) would have absolutely no interaction with HR prior to being given an offer, and HR would have absolutely no say in hiring.


The perfect world you're talking about has Personnel departments instead of Human Resources. Personnel departments deal with the tactics of staffing a company without the strategic planning that HR departments do. Interaction is okay. They just shouldn't be part of the decision making process.


Do you think the amount of contracts available has changed over your career?


It obviously fluctuates with the economy. For example, in 2009, I spent 8 months with not a single chargeable hour.

What has changed the most is the nature of the work and being able to work remotely. 30 years ago, most people who "ended up" being sysadmins were people who couldn't write software. There was a lot of demand but very few people willing to commit to what looked like a "career for failed programmers". Nobody went to university hoping to become a sysadmin. Then sysadmin became a sought-after job. People did a mix of IT and CS in uni in the hope of becoming sysadmins. Then it became a commodity. Then it sort of disappeared. These days there is a lot of demand for "DevOps" (which means different things to different people). If you're interested in both the ops part and the delivering-software-in-the-most-automated-way-possible part, there is work out there.

Until 5 or 6 years ago, nobody, or at least very few people, was hiring remote workers. It still isn't the majority, but you can find remote contracts if you look really hard for them.

To give you an idea, 30 years ago I was the sole sysadmin (for UNIX machines) in a software lab, in charge of 5 AIX servers. These days I tend to be part of a small team, working on CI/CD pipelines, controlling resources and security on public and private clouds, helping with the architecture of some solutions (CDNs, types of stores (redis vs S3 vs...)), etc. You do need to adapt.

When I receive calls from recruiters asking if I know anybody who could do the same type of work, I contact my old sysadmin friends, who often haven't worked for a few months, some of them younger than me. They say they are not interested in that DevOps and those cloud things, they want to stay sysadmins, and some of them like to blame ageism for their lack of work.


> "DevOps" (which means different things to different people)

you're not kidding. at one company i worked at, it meant "let's lay off the support team and have the developers take tech support calls from the customers!".


> When I receive calls from recruiters asking if I know anybody who could do the same type of work, I contact my old sysadmin friends, who often haven't worked for a few months, some of them younger than me. They say they are not interested in that DevOps and those cloud things, they want to stay sysadmins, and some of them like to blame ageism for their lack of work.

It seems to me you're implying that, because they're unwilling to change careers from being a sysadmin to being "DevOps" (for any non-sysadmin meaning of that term), including programming against the APIs of those cloud things, their lack of work can't be due to ageism.

It seems to me you're also implying that, because they're younger than you, their lack of work can't be due to ageism.

Although there's some merit to the first implication, given your general thesis of the necessity of adaptation, the second holds no water. You may yourself be suffering from ageism and not even realize it, and ageism may not even apply in a linear, progressive fashion (nor on actual age versus perceived age).

Personally, I've also found traditional sysadmin jobs are an endangered species, but I'm unconvinced they'll go extinct. After all, those cloud things cost so many multiples more than running your own (even factoring in the cost of sysadmins, which may even be cheaper than the "DevOps" for the cloud services) that managers are bound to wake up and smell the savings.


> you're implying that .../... their lack of work can't be due to ageism.

I think ageism is a two-way street. Some people won't hire anybody over 30/40/x years old. Some people don't want to learn new ways or new technology, hoping to keep doing the same thing they've always done until retirement.

What is a traditional sysadmin? Once upon a time we used to recompile kernels after changing parameters... and monitor a handful of machines. There's value in owning your hardware for companies that are large enough, but methodologies have changed.


I disagree that "ageism is a two-way street". Ageism is prejudice and/or discrimination based on perceived or actual age. How can that street go the other way? A lack of desire to learn has nothing to do with age.

Traditional system administration is approximately as varied as "DevOps", so I won't attempt to answer that question. However, I can easily answer what it isn't, and that's programming, which is what the vast majority of "DevOps" is today.

I'm not sure what your point is regarding recompiling kernels. Kernels are just a piece of software, and software (including kernels) still needs to be custom-compiled on occasion for a variety of reasons, though, of course, changing hard-coded parameters for static allocation of resources isn't one of them.

My point is that there's tremendous value in operating (not necessarily owning) one's own hardware for almost all companies. To wit, "large enough" is actually very small, considering the prices AWS and their competitors charge.

As for methodologies changing, that is, of course, true, and not all have changed for the better, but, again, what's your point?


dorfsmay,

I'm trying to look at your résumé, following the link in your HN profile. It gives a 404.


Ah yes, I'll have to resurrect that one day...

Check the link to my linkedin page at the bottom of that page, it is fairly up to date. I also added a link to my CV there.


"In 1985, Freada Klein (then head of organizational development for Lotus) did an experiment... With Kapor’s permission, Klein pulled together the résumés of the first forty Lotus employees... Klein explained that most of these early employees had skills the growing company needed, but many had done “risky and wacko things” such as being community organizers, being clinical psychologists, living at an ashram, or like Kapor, teaching transcendental meditation.

Then Klein did something sneaky. She submitted all forty resumes to the Lotus human resources department.

Not one of the forty applicants, including Kapor, was invited for a job interview. The founders had built a world that rejected people like them."

http://vault.theleadershiphub.com/blogs/if-you-werent-boss-w...


It's unclear whether this was and is a mistake. You need different kind of people in the beginning and when the company is executing a tested business model.

We can usefully divide intellectual tasks into two sets: filling and framing. Fillers add more useful detail or content within some framework, while framers explore possible new frameworks. While both tasks are essential, framing has higher variance; most frame attempts fail, but a few produce great value.

http://www.overcomingbias.com/2006/12/fillers_neglect.html

This definitely causes issues later on when the organization or movement becomes incapable of replacing the original founders. The founders would need to both hire a lot of fillers and also mentor others like them to take over in the future. But this causes enormous resentment. Fillers will feel (absolutely correctly, this is exactly what happens) passed over for promotions that they by every other metric thoroughly deserve.


Why should a company hire people that are similar to the founders?

At an early stage, companies need very different people than later on. At first, they need generalists; once you reach a certain size, you need specialists.

The founders should be self-driven, they should have a vision, they should know about all parts of the business. Later employees should be excellent in their speciality, should have execution skills, should be able to follow someone else's vision.


And down the road they need zero generalists? I think the important takeaway is that they were 100% rejected. Maybe they need fewer people who are framers or founders as time goes on, but to have none sounds risky.

As you said the founders need vision. If those visionaries are filtered out in hiring, eventually the company will die (on the inside).


I always interview for "culture fit". In my experience, that's almost always the bottleneck, and the technical part is rarely a problem. Learning a new skill is not an earth-shattering problem if you have a good attitude towards work, and good interpersonal skills are a must in teams.


Isn't culture fit just another word for "people like me"? There have been numerous articles reporting the huge damage this sort of optimisation does.

Did you mean "inter-personal skills"? The rest of your answer seems to suggest so. This is not the same as culture fit.


I've been looking for another word for "someone who is interested in the same software engineering and organization practices as the rest of the team" without using culture fit.

I don't care what TV shows they're watching or the color of their skin, but I do care that they don't balk at testing a peer's ticket, thinking it's beneath them. I care that they are helpful as a basic human quality. I care that they are interested in working on a team that's highly collaborative.

There are places where I'm not a great cultural fit. If the team's culture is to take a problem, go off for months, and solve it, it's probably not for me, but I know people who would be overjoyed in that case. I know people who'd like nothing more than to get a spec and deliver, rather than gathering and implementing requirements based on conversations within the organization.

In some sense, that's people like me, but I was in turn hired because I fit those criteria. People who fit those criteria seem to be happy with the engineering portion of the job, and people who don't tend to be frustrated.

I use the word "cultural fit" but I'm not sure if there's a better term.


This might be 'process fit'?

Some people have a bigger need to find a good fit on this, but many of the choices are philosophical in nature, and having constant battles over why we do/don't do X is not productive. (Having occasional discussions can be productive, especially if there are clear deficiencies in the current flow.)

As an interviewer, I appreciate when candidates ask about our process; regardless of whether they love or detest what we do, I know they're asking important questions. As a candidate who can afford to be choosy, I would ask this of interviewers, as I know from experience I'm not going to be happy if I'm fighting with the process all the time.


Can you not just say "someone who is interested in the same software engineering and organization practices as the rest of the team"?

I get that it's more than two words, but it clearly expresses what you're trying to say and makes it clear you don't mean what so many others mean when they say "culture fit"; same schools, hobbies, affluence, political attitude, and so on.

That said, seems like you might benefit more from people interested in different practices; people who can bring something new to the team, rather than more of the same.


There's a difference between someone who is looking through new ideas and suggesting that we try experiments to improve our quality, time to delivery, or the other things our team finds important, and someone who is irritated at the thought of writing unit tests or that they have to test a peer's code.

In other words, http://programming-motherfucker.com/ advocates are a bad fit.


someone who is looking through new ideas and suggesting that we try experiments to improve our quality, time to delivery, or the other things our team finds important

So someone who is interested in different software engineering and organization practices? Not the same software engineering and organization practices as the rest of the team?

It sounds like you don't want a culture fit so much as you have a culture in mind that doesn't fit, and you don't want that one. A different culture is fine, except for a particular range of damaging cultures.

I would suggest that the very fact this thread (and others nearby) exist indicate that "culture fit" really isn't a good term, given how much extra definition is required.


That’s a philosophical fit


Maybe "methodology fit" or "workstyle fit"?


Neither of those is even close to better. They're basically the same, but with more letters, and they muddy the waters.

The only way to improve the concept, IMO, is to find the definition of the real thing you are looking for. I think the goal is exponential outcome? That's what business owners are looking for with any hire. Fit is silly; it all seems like a puzzle, but it's an ecology. You want a catalyst that challenges and unifies, or someone who strengthens the existing base.

EDIT: Clarity


As mentioned above, the best way to describe it is that people who want a culture like the one http://programming-motherfucker.com/ advocates for are the wrong fit for us.

And that's the problem with fit questions - it's much easier to say what is not us rather than what is us. I like the ecology metaphor, though. Organizations as ecologies...


> Neither of those is even close to better.

You just failed the "being civil" part of the interview. I think I'll let the GP judge whether those are better. At least I'm trying to help instead of just slinging stones at others' attempts.

> Exponential outcome maybe?

Seriously? You complain about "more letters" and muddying the waters then you vomit up that monstrosity?


I am really sorry you read it that way. I was not saying your contribution wasn't useful to the dialog. I was just saying that I think the category of "fit" doesn't need a different term at the front but more a rethink. I was not saying "exponential outcome" was a better set of words, just my first blush at what I think the desired thing in a hire is, which could be used to define new terms. I've changed my post to reflect that.

Again, sorry for making you feel like I was shitting on your contribution.


> I think the category of "fit" doesn't need a different term at the front but more a rethink.

That didn't seem to be what $ancestor was asking for. It sounded like s/he was wishing for a slightly different phrase for a slightly different concept, vs. what "culture fit" usually means. Your "rethink" was not that. It was a complete shift from strategy to outcome. Is that help, or appropriation? Not for me to decide, but perhaps something to think about. It's a mistake we engineers often make, and I'm hardly innocent myself.


I literally have no idea what you are talking about.


That will work until someone claims (truthfully or otherwise) that it's equivalent to what we typically call "discrimination".

"Culture Fit" is a good term. The problem is that certain firms use it disingenuously and that certain candidates are making a social issue out of it.


I was recently hired for a job at a high level position that I'm really excited about. While hiring me the CEO said "You are a bit outside of our normal culture, but we have talked and think that's an asset". I had to question that.

I'm a college dropout with 18 years of consulting and startup experience, and they are all highly skilled academics. He even recognized that it was a backhanded compliment. The "culture fit" term has always bugged me, but it really became clear to me in this meeting. It's a useful cypher for exclusion on class and world view. I'm pretty sure that's not immoral or wrong, but I do think the concept is damaging to organizations that want an exceptional outcome. Turns out business is risky and basically runs on what can be predicted.

Working with people you like is really important, and at high-level jobs there are few people who can transcend from outside orbits. The only reason I can involves a lot of hard-fought failures and an ability to explain myself in documentation and method. A large number of successful engineers have had a protected arc of experience that makes the unfamiliar grating and closes off the dynamism that could come from people who don't whip ideas from the same pocket they pull from. Being an engineer also seems to be magnetically opposed to the nuances of human resources. A lifetime of meditation on failure can sometimes nerf the poetry available in human experience.

I have no solution to it broadly. Stoked on the new job, it looks like I'll be in charge of hiring too. I have a good track record with it in my own org, but the stakes here are bigger. If I get proof of my method I'll report back, if I fuck it up I'll probably be fired for bad culture fit.


> Isn't culture fit just another word for "people like me"?

My ex came to me in a rage the other day because the large bank she works for had produced a promotional HR video about success, shown before a presentation on diversity, in which every single person interviewed from the bank was a white, blond male, while black people appeared only in cuts to news footage (e.g. of Obama etc.) - they apparently couldn't, or didn't want to, find a single black or asian person in a workforce of tens of thousands that they could use...

The same bank created a promotional ad to hire more women in an Asian country last year where of 10 women, 9 were white blondes and only one was actually Asian.

And then they wonder why they don't get more black or asian people applying for jobs.

To top it off, they strongly prefer people from a small pool of schools that are expensive and strongly favor children of alumni, to the extent that their preferred hiring pool is already very heavily skewed towards white upper-class men.

Their "diversity team" works within that constraint: To have to try to capture the very small black and asian contingent that fits hiring criteria and a culture that is so strongly predisposed to them that even if they hired everyone that met the criteria and were willing, they'd not get close to achieving diversity in any meaningful sense.

It's a real problem that many places want things like "culture fit" without actually thinking about how they use it.

Often even while putting together committees to try to solve specific hiring problems - committees that can't do their job because the "culture" is holy, and questioning whether or not it is being so narrowly construed that it causes extreme monoculture is often not acceptable...

(to be clear I'm not suggesting the person you replied to or others here are doing what the bank in question is doing - I just wanted to give an extreme example of what "culture fit" can lead to, even without it necessarily being the intent)


> Isn't culture fit just another word for "people like me"

More like: people who aren't going to grate on me, which is very different and encompasses a much larger set of people.

The "culture fit" strategy can no doubt be abused, but it's also wrong not to recognize that a technically brilliant hire can be terrible for the company if there are communication problems or personality conflicts.


This type of selection (“people who don’t grate on me”) often favors people who are similar to us. Project Implicit[1] at Harvard has done interesting work highlighting our difficulty in making “objective” judgements (esp. about people who are different from ourselves), and showing that we go to great lengths justifying our biased decisions, not just to others but to ourselves. Hence “implicit” biases.

1. https://implicit.harvard.edu/implicit/aboutus.html


Is that really so bad? Obviously when extended to race/sexual-orientation/etc it is, but isn't it legitimate to some extent to want to surround yourself with people who share your values?

This is precisely the issue that the social sciences often miss because it's an ethical, not scientific, question.


It is pleasant to live in an echo chamber, and that is exactly what you would build there. And the result is a place where people don't talk about issues and don't solve problems if doing so would go against groupthink. And just like in any echo chamber, consensus will move more and more toward the extreme, and people will simultaneously become less and less capable of handling different opinions.


The key phrase you seem to be (intentionally?) missing is "to some extent".

There's a continuum between "echo chamber" and "agreeing about nothing".


>This is precisely the issue that the social sciences often miss because it's an ethical, not scientific, question.

You're completely missing the point - if it doesn't get results, they're not going to do it regardless of whether it's morally acceptable or not. Businesses care about results.


With all due respect (truly!), you are missing the point.

The point is: scientific arguments fail to convince because the issue escapes the scientific domain. The question is not "do we naturally tend to do X, Y or Z", but rather "at what point and to what extent are we ethically obligated to seek out people different from ourselves?"


"Inter-personal skills" plus "works well in a team" seem to be those criteria, as I would describe them (from the UK).

(I don't actually mind what anyone calls it, but "culture fit" has got a bad press and got my hackles up! Apologies to all!)


I agree, I regret my use of the term "culture fit".


> Isn't culture fit just another word for "people like me"?

This is a hot-button topic; I look forward to additional contributions here.

[yesterday] We only hire the best means we only hire the trendiest (2016) | https://news.ycombinator.com/item?id=15591441 (Oct 2017; 596 comments)

[one week ago] If you care about diversity, don't just hire from the same five schools | https://news.ycombinator.com/item?id=15543371 (Oct 2017; 479 comments)


>Isn't culture fit just another word for "people like me"?

I've noticed that an emphasis on "culture fit" does tend to correlate with an ethnically & culturally homogenous workplace.

I think most of the candidates I've interviewed and recommended for hire who were then rejected for a vague reason (or none) have been foreign.

IMO hiring for culture fit is fine provided you're willing to be very specific and state directly to the candidate's face what aspect of your desired 'culture fit' that they failed.


Are you suggesting that "cultural fit" and "interpersonal skills" are so well defined and bounded that they are completely distinct, and definitely do not overlap?

I remember hearing about a study showing that success in engineering graduate school is better predicted by GRE writing scores than by the math scores. So could general communication ability be more important than engineering ability when it comes to job performance (on average, across all people & jobs)? Is it possible that "cultural fit" and "interpersonal skills" are both trying to get at whether someone is a good communicator who emphasizes getting along with the people around them?


Please see my other reply below


> Learning a new skill is not an earth-shattering problem...

Depends on the problem. If you're cranking out brochure websites from a set of templates, you have a pretty accessible role to fill.

But I have many colleagues who make mistakes (even when people object clearly), dig a hole for themselves, double down on the mistake, and then either back their way into job security with their mess or end up moving on once the chickens come home to roost. Some of the time, I can even see their eyes glaze over as I try to get them to think a few more moves ahead in their plan.

Now, organizations should have safe roles for journeyman engineers who crank out and maintain code as directed, but most organizations expect all senior engineers to be part time system design experts, which I don't think everyone has the knack for. Likewise, not everyone is a great communicator and strong enough technically to be accurate in their communication.


    > But I have many colleagues who make mistakes [...] dig a hole for themselves, double down on the mistake, and then either back their way into job security with their mess or end up moving on once the chickens come home to roost.

A skilled practice of behavioral interviewing can, I think, mostly screen out such candidates. People who have a habit of "doubling down on their mistakes" will find it hard to wiggle around pointed behavioral questions which require the candidate to describe real experiences and answer follow-up questions.

The problem is that most organizations are terrible, TERRIBLE at evaluating candidates. They forget that interviewing candidates is a skill in itself that needs to be developed and needs to be consistent across the organization.

Moreover, these same organizations fail to "close the feedback loop" when it comes to evaluating employee performance. The "annual review" data could be put to good use informing hiring decisions. Instead, annual reviews are just an irritating burden used for the purpose of carving up the raise/bonus pie. It is basic engineering... evaluate your outputs (employees), then use that data to adapt your inputs (candidates). But virtually no one does it, because it's all tied up in HR bullshit instead.


>But I have many colleagues who make mistakes (even when people object clearly), dig a hole for themselves, double down on the mistake, and then either back their way into job security with their mess or end up moving on once the chickens come home to roost.

You're making the parent's point for him. This is a behavioral problem. Hence, screening for culture fit.


> This is a behavioral problem

No. These people are generally open to suggestions that they grok. They just can't, at least sometimes, comprehend more indirect or abstract problems (like CAP theorem implications that influence the design of SQL tables and business keys). They write code that meets requirements in the short run but get bogged down in a local maximum that can't be improved upon without an expensive rewrite.


That prompts two questions for me if you don't mind sharing.

(a) How do you define and assess "culture fit", and why?

(b) How do you know the things you're optimising for are actually good proxies for productivity?

Appreciate these aren't questions with simple answers (or maybe even answers at all), but hiring/assessment process is my current field and I'm always curious where people's thinking is at.


That’s a fancy way of saying “we hire people we like”.

It’s an approach that isn’t perfect but is good enough for many roles. We pick spouses based on that approach and it works out pretty well for most people.


You are correct, "culture fit" is not a good descriptor of what I meant.

We hire people we believe we will be able to work with. It's not a popularity contest, we evaluate problem solving, analytical thinking and communication skills, such as the ability to explain complex topics in easy terms.

I'll give you an example, usually we are interviewing people straight out of the university, but this can be applied to any candidate.

I'll start by asking the candidate to talk about his master's project. As he talks about it, I'll guide the conversation towards these topics:

- What was the problem?

- How was it defined?

- How did you tackle it?

- Explain your solution.

- Support your decisions.

- Was there a cost/benefit analysis?

- How was the problem decomposed?

This will give me an initial picture of the candidate's ability to address a problem, and is far more important to me than whether he knows "the number after F in hexadecimal" or some other nonsense.


This isn't screening for 'culture fit' though and I'd sincerely recommend avoiding referring to it as that. You're screening for strong communication skills and that's absolutely fine (and highly recommended).

Interviewing for culture fit (as you've seen in the responses to your comment) has an extremely negative stigma attached to it for good reason.


I strongly agree with this. “Culture fit” is one of those concepts that often disadvantages “others”. This is something we may not be able to avoid without putting in place safeguards (concert musicians were predominantly male until blind auditions were used - and it turned out that female musicians play just as well or better).

As I mention above this is often not a result of bad intentions, but of implicit biases that have an effect on our judgements even if we are fully aware of them and explicitly try to avoid them.

See https://implicit.harvard.edu/implicit/aboutus.html


On reading this post, it may well be that culture fit is quite a good description.

At least from the POV that you're selecting from a small cadre of (almost exclusively) male master's grads.


About half of all marriages fail. All the intimate relationships people have prior to marriage also don't last. So "people we like" has a fairly high failure rate for intimate relationships.

Having said that, it doesn't necessarily mean it was all a complete waste.

Most people don't stay in their first job, so there's that.


I suggest that the average marriage lasts longer than most jobs.

Average Job: 4.6 Years - https://www.thebalance.com/how-long-should-an-employee-stay-...

Average Marriage: 8 Years - https://www.mckinleyirvin.com/Family-Law-Blog/2012/October/3...


Given how much harder it is to leave a marriage than it is to leave a job, I'd think the gap between those two should be much larger.


The two are pretty tightly linked. If you have an underemployed or unemployed couple without a support network, you pretty much have to get divorced to get social services.


What's your definition of success?

I don't walk into any job interview situation expecting 30 years of bliss -- that's generally not a reality. If you stick around for 3 years that's probably a decent employment relationship depending on industry.

Many marriages, including those that ultimately fail, work pretty well for a long time. I'm not that old, and I've been married for 13 years. Is that a win?

Also, the average marriage is around 8 years, but the median is close to 40, iirc. People who get divorced tend to have issues like being young, poor, or previously divorced, or having criminal issues or substance abuse problems. Divorce rates for couples at least 25 years old without those risk factors are dramatically lower -- implying that people with fewer signs of bad judgement display better judgement in picking a spouse and maintaining a relationship.


Yes, I thoroughly agree with all you've written.

As someone who indulges in more than his fair share of young/poor/previously separated/criminal issues/substance abuse problems... I attempted to capture that in my somewhat more terse "Having said that, it doesn't necessarily mean it was all a complete waste." Thanks for expanding on that where I was laconic.


"People we like" is not how 100% of all people pick their spouse.


Which is a polite way of saying “we hire people like us”


Have you looked at the divorce rate?


But you still need some technical skill... Otherwise you could just hire some soccer player (outgoing, team player, etc.) and you're good.


I think that’s a given here


>If you want to make a change to your interview process, give it to some of your current employees first. If anyone fails it, ask yourself if that person should be fired. The answer is probably no.

That's a fascinating idea that's never crossed my mind. E.g., how many current employees can pass whiteboard coding exams?


Interviewing is a random event at many companies. You as a candidate could fail miserably on one day and get great feedback on another. It depends on so many factors in larger companies. For example, one of my previous employers would send a random engineer (juniors included) to interview a candidate and give a mark from 1-10. There were engineers who would not pass their own interview and questions; on the other hand, there were others who would just see that you can code and are cool to work with and would let you in. And anything in between. There were times when juniors wanted to be smart and asked puzzles they got from the net, and when I later asked them to solve those puzzles themselves, they failed. So many interviews depend on so many random events and circumstances.

The biggest takeaway I got from all this experience is that as a candidate I should not be intimidated by anything and should just be cool and myself, because there is so much randomness that trying to find reason in it will just make things worse. And ALWAYS negotiate as much as you can because your biggest raise is when you get hired.


Many companies instruct candidates to study for interviews with specific books or courses (e.g. "Cracking the Coding Interview").

That indicates the expectation is not that a competent professional should pass without studying. The passively hostile whiteboard coding exams are more like a hazing ritual than an actual job performance test.


I prefer that over being ambushed by unexpected topics. Here, you measure ability to learn, at minimum. Unfortunately, you also measure how much free time the candidate has. Which is still better than having the candidate guess what is popular now, or which of competing preferences your company holds.


Apparently there are already companies that take it a step further. They test a sample of their employees with some fancy IQ test that measures them across several dimensions. From those results a profile of the desirable new hire is built, and all candidates are measured by the same test.

On reddit, in threads about interviews, I've seen a couple of people discussing those strange tests where many questions have no obviously correct/wrong answers - for example, all the presented shapes can, in their own way, fit the presented pattern.


If there is no obvious right/wrong answer, I wonder if they are limiting their pool by taking people who all think alike (kind of like in-breeding), or are they using it to diversify ideas and knowledge (like diversifying a gene pool)?


Different answers probably give points to different traits the test measures; the ambiguity is just irritating for test takers coming from more classical tests where you know (or at least suspect) what is expected of you.

This whole test setup is not cheap, so one can at least hope they considered the issue of diversity? The other question is whether they actually tested which teams perform better in the specific problem domain that interests them.


>That's a fascinating idea that's never crossed my mind. E.g., how many current employees can pass whiteboard coding exams?

I once confronted a colleague who had refused a candidate because he couldn't answer a technical question. When I heard the question, I asked him "How many people in the current team can answer this?" (Hint: Only one). His response: "Yeah, but we told him to read the first 6 chapters of textbook X!"

So we denied him a job because he didn't do his homework.


Testing the change is one thing, but if the original people were hired having been asked something very similar, you have an existing group selected on that characteristic, which will therefore likely do well.

e.g. if you have always done a whiteboard interview, then you've selected for people who do well in that type of scenario. Changing the question but keeping the format will likely just prove that those people are still suited to that type of task, but not tell you whether an otherwise good hire would be filtered out.

I'm not against whiteboard interviews or similar, but just pointing out that selection bias is an important factor here if you're judging effectiveness by comparing it to results for existing employees.
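
A quick simulation makes that selection bias visible; the ability distribution, noise level, and pass threshold here are all invented for illustration:

    import random

    random.seed(0)
    # Invented model: latent whiteboard ability plus per-interview noise.
    N = 100_000
    ability = [random.gauss(0, 1) for _ in range(N)]
    employees = [a for a in ability if a + random.gauss(0, 0.5) > 1.0]  # past hires

    def passes(a):
        # A "new" question in the same format: a fresh, noisy whiteboard measure.
        return a + random.gauss(0, 0.5) > 1.0

    pool_rate = sum(passes(a) for a in ability) / N
    emp_rate = sum(passes(a) for a in employees) / len(employees)
    print(f"general pool: {pool_rate:.0%}, current employees: {emp_rate:.0%}")

The employees pass far more often, not because the new question is well calibrated, but because they were selected on exactly this trait in the first place.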


> less than 10% of the companies that I've interviewed with have said that their interview process was designed to evaluate how effectively someone could do the job that they're being hired for.

I don't think an interview process can do this tbh.

A good interview process can evaluate if they can do the job that they're being hired for, in most cases, but I would say only an actual trial/probationary period can evaluate how effectively they can do it. And even then, how effective they are may be environmental - completely external to their own internal competence.

After concluding that a candidate can do the job, the secondary purpose of the interview process is filtering for people that will positively affect the work environment of their colleagues (and/or filter out those who may negatively affect others). As an interviewer, I would make some effort not to lean too heavily on my own biases and try and also consider how they'd get on with my colleagues that I know well, but there is unfortunately always going to be some implicit bias here.


As an interviewee, one of the questions (that I shamelessly stole from somewhere on HN) I like to ask recruiters is this: "Do you like working here?" The answers and the way the answers are given usually tell you quite a lot about the company and the specific place you are going to work.


It’s a fair question, but if you asked me that in front of my boss or coworkers I would likely immediately take a disliking to you because you put me in an awkward situation where there are few easy ways to answer honestly.


This is why this question is so good: either you like working there, in which case you have no problem answering honestly (and without any awkwardness), or you don't like it, in which case you have to come up with some elaborate yet unclear answer, which is quite telling at that point - and then I do not really care whether you take a disliking to me or not, since it does not really matter any more.


I really like my current job and still would think you are dumb and dislike you for putting me in that position.


Maybe in general that’s true. In my case, even if I feel it’s a nice place to work, I wouldn’t like being put on the spot in front of my boss/coworkers. In a one-to-one interview it would be fine. Which was perhaps more the context of your comment.


Not being able to speak freely in front of your peers would be a hard pass from me. I'm not going to work somewhere I can't feel comfortable speaking honestly.


Let me guess, you wouldn't work somewhere you are unhappy to work at too?


In fact, it is perhaps the best way to avoid you joining a place you will dislike, as they won't let you in to begin with.


If you're working in an environment where you can't answer that honestly, I don't want to be working with you.


The question can be softened to "what would you change in the company?"


I try to ask some variant of this, "What are your favorite/least favorite things about working here?" I'm not entirely sure how useful it is, even though it feels like a useful question to ask. I've gotten answers like "the culture is great" or "there are really no good food options around here."


Or a more open-ended variation: Tell me what you like about working here.

And then: Is there anything you wish were different here?


His point, that the hiring process is not a scientific process, is probably correct. But the same is true of the performance review process (in my opinion). At best, the performance review is unscientific and biased; at worst, it is a popularity contest; usually, it is a formality.

> I've never had an interview that tried to evaluate my ability to work in a team or prioritize tasks - both of which are probably more important to doing well in a software engineering job than being able to find anagrams in O(n) time.

And I have never seen a test that could reliably evaluate your "ability to work in a team or prioritize tasks".

The hiring process is usually unscientific because making it reliably scientific is very expensive.
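
(As an aside, the anagram check the article holds up really is the easy part; presumably "O(n)" just means counting characters instead of sorting. A rough sketch in Python, purely for illustration:)

    from collections import Counter

    def are_anagrams(a: str, b: str) -> bool:
        # O(n): compare character counts rather than sorting (O(n log n)).
        return Counter(a) == Counter(b)

    assert are_anagrams("listen", "silent")
    assert not are_anagrams("listen", "silence")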


Yes, but what about the commonly trotted out line of "hiring mistakes are expensive so we'd prefer to not hire than make a mistake"? Perfect example of wanting to have your cake and eat it too... You have to eat the cost somewhere: either in the hiring process, in firing a bad employee, or in spinning your wheels waiting for the right candidate.


I think this is a bit of a slippery slope, in that if most employers really evaluated what skills they need, rather than cultural fit and their desire for _The Best_, I postulate that many of these companies would also realize they have absolutely no reason to be paying super high US/EU salaries, especially the salaries for tech talent in the tech hubs.

They would realize that they probably just need a kick-ass project management framework and access to a pool of low-paid but capable programmers (e.g. a remote team in India/China/etc.). Most startup tech isn't solving any huge tech problems; it's just building to spec/vision and iterating based on feedback, and maybe, if you're lucky, you get to do this under load/at scale. I know things have been outsourced with countless examples of bad results, but I think that's really just a project management problem, not a skills gap (again, for most startup tech we see today, but probably a significant portion of enterprise jobs too).

I have more of a bootstrap mindset for most things. So I almost never understand the $1M seed funded startups that get an office and hire a few programmers in SV and instantly only have 6 months of runway just to build a Tinder clone or something with little tech complexity. In my mind, they could easily outsource development and spend 90% of their money on sales.

Edit: I say "outsource" a lot above, I actually mean remote teammates that you manage


He almost gets it right. Hiring for some abstract "best" isn't so great. Hiring for "culture fit" is practically an invitation to discriminate. OTOH, people rarely do only one thing at a company, so it's important to remember that you're hiring for a job rather than a specific task. You need to evaluate fundamental skills, including collaboration or leadership skills (and styles), not just knowledge of specific subject matter.


I have mixed feelings on culture fit becoming a bad word.

I don't disagree with your point but I think there is a danger with ignoring its value.

For example --leaving aside your views on SV startup sweatshops-- if you're in a startup, you will at some point work late hours and you will at some point go through stressful times. Sometimes the thing that gets you through those times is that feeling of camaraderie and actually getting on with your colleagues.

Screening for culture fit is a good way to guard against losing that, and at least for me it is unwise not to assign value to it.

Now, it could be that your culture is inclusivity. But then isn't hiring people who fit that mould still hiring for culture fit? Isn't that still creating a monoculture? Isn't discriminating against a bigoted brogrammer (who may well be great at coding) still discrimination?


The problem is that "culture fit" is too broad and too vague. Often it includes things that are completely unrelated to job performance. If you want to make sure people are able/willing to work long hours, ask about that (and be aware that you're probably discriminating to some degree against people with families or health issues). If you want to know how someone will handle stress, ask about that. Not only will you be measuring actual job-related skills or qualities, but you'll also be telling the candidate something they should know about the work environment. On the other hand, hobbies and musical tastes and political views should not be part of a hiring process, and that's the kind of thing "culture fit" often seems to include.


> If you want to make sure people are able/willing to work long hours, ask about that (and be aware that you're probably discriminating to some degree against people with families or health issues).

Incidentally, discriminating on the basis of familial status or disability is illegal in various jurisdictions.


> Incidentally, discriminating on the basis of familial status or disability is illegal in various jurisdictions.

Indeed it is, and it should be, but many companies successfully use proxies to get around that. It's a lot harder to win a suit because a company demanded long hours than because they explicitly refused to hire people with families or above a certain age. I didn't mean to encourage such chicanery, which is why I already had the parenthetical comment about discrimination. My point was that if there are expectations or behaviors like this, then they should be addressed specifically instead of under a "culture fit" umbrella. Often that will make their il/legitimacy clearer, and help to prevent discrimination.


One of the better interview experiences I had included going through a project I had previously done: framing the problem, how I'd worked through it (with class/function-level whiteboarding), and what could be improved. It was a project from a previous job, but too high-level to be replicated. I could see, though, how going through work from a previous job could be problematic (no code was shown).


Note that if your company is small, it can be very hard to do any kind of meaningful statistical process control on hiring.

The "track careers of people you rejected" idea is an interesting one, but depends very much on how public people make their careers. I suppose you can rely on LinkedIn for 90% of people.

(What is HN's opinion of the Stack Overflow Developer Story?)


We run an early stage applicant tracking system and see ourselves broadly as a rejection engine. Something like 95% of the activity on our system relates to candidates who end up being rejected.

Since our assessment data is structured, our current iteration of the product uses it to create value for rejected candidates, informing them where they were strong or weak in comparison to the rest of the group and showing them some aggregate scoring.

It goes down very well with hiring managers, who typically don't have the time to respond in detail or at all to rejected candidates, and (for the most part) goes down very well with candidates.


>The "track careers of people you rejected" idea is an interesting one, but depends very much on how public people make their careers.

I really like this idea. This reminds me a bit of the "anti-portfolio" that some VCs track (aka a bittersweet regret of the unicorns they passed on years ago). I think the same could be said of great talent that an organization passed on that then goes on to build great products, invent, and/or otherwise lead great teams despite that.

However, employment is a tad different than investment so, to me, the only caveat to this is that it simply isn't the case that every super talented person will bring about great results and value creation at every single organization.

Much of a talented person's output depends on the very room full of other people that person works with. And while "fit" is such a cliche thing these days, this is what matters so much in terms of whether or not an otherwise talented person will be successful within one organization versus another.


Very good point. A nice blog post illustrating the problem with drawing conclusions on small sample sizes: https://jvns.ca/blog/2014/07/11/fun-with-stats-how-big-of-a-...

You need a lot more than a startup’s number of employees and their eventual job reviews to make strong conclusions on the quality of the interview process, and need to rely more on industry best practices instead.
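
(To make the sample-size point concrete: with, say, 20 hires, a 95% confidence interval on an observed "good hire" rate is enormous. A back-of-envelope sketch in Python, using the normal approximation and made-up numbers:)

    import math

    # 14 of 20 hires judged "good"; how confident can we be in that rate?
    n, good = 20, 14
    p = good / n
    half_width = 1.96 * math.sqrt(p * (1 - p) / n)  # 95% CI, normal approx.
    print(f"{p:.0%} +/- {half_width:.0%}")  # 70% +/- 20%

With error bars that wide, you can't distinguish a good interview process from a mediocre one.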


There are so many cognitive biases in play when it comes to hiring that it's basically a total crapshoot no matter what processes and shit are in place.

People hire people they like, period.


Disagree, we can basically eliminate the personal part of it. Blind interviewing. I don't need to know name/gender/school etc. Technical skills test as a filter given to current employees to track if it actually tests for necessary skills. Second part of interviewing can be assessed by people who don't see the candidate and we can obscure their voices.


The cognitive biases aren't just around appearance or voice...


Not sure what else there can be if you create a normalized test that has been designed to predict success by being applied to your current employee population. Take away personal interviewing and voila. I can't judge this person on my inherent bias towards a race, gender, etc. if I can't see them or determine where they are from. The test should be the only bar required to grant a person an opportunity, nothing else. No need for resumes. https://hbr.org/2016/04/how-to-take-the-bias-out-of-intervie...


> If you want to make a change to your interview process, give it to some of your current employees

I don’t think I would respond well to this as an employee.


Why not? I think it's a valid test for a company. If the majority of your employees wouldn't pass the hiring test, why use it for new employees (unless you're looking for someone more senior)? Employees can probably best give feedback on how to adapt it so that it mirrors the actual work.


Maybe it's a generational thing, but I'm old enough to have experienced "we're doing interviews for existing employees." It was not because they were trying to improve the process for hiring more people.


Many people seem comfortable allowing their companies to treat them as low value commodities and will do anything asked of them.

In general I don’t exactly think like this. I do the work within the scope of what I’ve been hired to do. I’m lucky, at least right now, that I have this choice because my skills are somewhat rare. Want work outside the scope of what I’ve been hired to do? Pay for it, or find someone else.

I don’t want to be asked to interview again, it’s not a pleasant experience for me, and critically it’s being presented as “something you have to do”. If you want to do this, offer additional compensation for doing something outside of the scope of my work.


I think a good work environment (and that includes competent coworkers) is in everyone's interest. What if you can take the test but the employer doesn't know the result?

I'd trust my employer enough not to use interview questions as my performance evaluation. If you don't, the level of trust appears to be very low.


>Many people seem comfortable allowing their companies to treat them as low value commodities and will do anything asked of them.

If that's how you feel about it, would you not agree that asking such questions in an interview is treating the candidate as "low value"?


Personally I think interviewees should be financially compensated for interviews. The typical view is that they’re being compensated with a “chance for a job”.


These posts always read to me as being a bit presumptuous. The fact that you're looking for a job at that particular company means they're doing _something_ right, with regards to hiring, or shipping quality products, or having a good work culture, etc, etc. If I was the interviewer, I would never hire a person with this sort of "know it all" attitude. If you think you can be a better interviewer, then you have to show you are, rather than coming up with a logical argument of "doing it this way makes sense". Basically, unless you yourself are in that position, or have intimate knowledge of it, you don't know why someone made those decisions.


In the original article the author said:

> I've never had an interview that tried to evaluate my ability to work in a team or prioritize tasks...

I found this interesting, because there are some good ways to look for this that I always ask when interviewing. I wonder if the author had been checked on these things and just didn’t notice.

One thing I always do in interviews is after I see the approach the candidate is taking, show them another approach and see if they can understand it and run with it. Many candidates struggle with wrapping their brain around any other approach than their own. This is about collaboration and teamwork.

It’s also very common to have behavioral questions that talk through what some challenges were and how you solved them, and to ask probing questions about how the candidate dealt with differing opinions on the best way to proceed. This also looks at teamwork skills.

Regarding prioritizing tasks, it’s routine for me to ask a candidate to break a problem down into components. Then evaluate which are the most difficult, and which they’d tackle first and why. I’m looking to see if they try to reduce risks early in their process. This is something more senior people do better.

I wonder if the author has indeed been interviewed about teamwork skills and just didn’t realize it.


I would like to see YC take over the initial chunk of the hiring process for their companies, and publicly document the heck out of how to do it best.

Posts like this one demonstrate that free access to the front page of HN for job listings isn't enough:

Don't work for CrateJoy | https://news.ycombinator.com/item?id=15486301


A side note, but the latter two answers (2. being a good fit and 3. doing a good job) sound very much the same to me.


I think "fit" is referring to ability to get along with the rest of the team, regardless of ability to do the job. They aren't quite opposites, but they are mostly unrelated.


It can mean that too, yes, but if the recruiter word-for-word says 2, then it could also mean 3. But then the rest of the argument loses quite some weight, since 3 is not rare anymore, as it's included in 2 too.


> I've heard "Hmm, I don't actually know what it is that we're looking for" from a few recruiters

Why do you even care what recruiters say? They're there to make as many hires as possible and get their commission. Don't get me wrong, this is actually a decent question, but you might as well ask a car salesman what he likes doing in his free time. It doesn't matter -- he's trying to sell you a car.

> How do you evaluate how well you're meeting your goal? -- The majority of the companies that I ask this to essentially answer "we don't" to this question.

I'm going to go ahead and call BS on this one. Literally every company (outside of early-stage startups) has some kind of performance review. Most of these are completely artificial and completely suck, but they're definitely there.


Performance reviews and their flaws are discussed in detail later in the article. In particular, they detect false positives but not false negatives.


The point was that performance reviews aren't used to evaluate or iterate on the hiring process.


Just as a spitball idea...

What about a structured "cohort" hiring process where you give potential employees short-term contracts, starting with literally one or two days (a group exercise like a "hackathon" where you make a sample product for demo), then moving up to one- or two-week rounds? That would allow you to evaluate and accept (pass on to future rounds) or reject a large candidate pool, and thereby filter your potential future employees. Given the current costs of recruiters, HR personnel, interviews, and headhunters, this might actually end up being cheaper than current "traditional" hiring methods and could generate some very effective development teams.


That means you can only hire people who don't currently have a job.


My goals as an interviewer:

1. Do you have a solid understanding of the foundations for your skill/discipline?

2. If you were given $100,000 VC capital and told to build a revolutionary new thing how would you build it?

If a candidate cannot adequately address the first objective then I will find another candidate who can. This is pure and simple elimination with simple non-biased questions on things the candidate should know to do their job without abstractions, tooling, and other trendy bullshit.

The second objective is where I can evaluate my bias against the candidate's, but only after the candidate has passed the technical evaluation in a non-biased manner.


No offense, but if a candidate has a credible answer to "2. If you were given $100,000 VC capital and told to build a revolutionary new thing how would you build it?", wtf would she/he work for you :)?


If your answer was anything about fundraising, marketing, or capital I would fail you. The ability to answer the question that was asked is an unstated implied task. The question is about building something if you had funding and no limitations.


100K is a huge limitation, especially for building a revolutionary product :)


You are completely missing the point of the entirely hypothetical question, which is to prompt an impromptu technology brainstorming session.

Another reason why this is an excellent question is that it forces people to follow a very simple single instruction. If a candidate can't even do that, such as getting into irrelevant discussions about financial burdens, why would I want to hire them?


If you tunnel-vision on expecting a specific answer when asking a vague, mostly irrelevant question, why would I want to work with you?

Frankly, I find your strategy for making new hires to be absolutely absurd.


The question was pretty specific. Here it is again for you:

> how would you build it?

Following simple instructions is hardly absurd.


You're reducing:

"You're a skinny little nerdy freshman in high school with no social circle. How do you get the hottest, most popular senior cheerleader to go out with you by the end of the year?"

to:

"How do you ask a girl on a date?"

Interviews are a two-way street Michael. I'll call you.


If that is how you perceive the question in your mind I would expect a wonderfully creative answer that addresses such a problem with a solution and either a plan of execution or organization of necessary functions.

The inability to understand a simple question and provide a valid response is a problem, but fortunately you don't have to hire that problem.


It's very easy to reverse that: an inability to formulate a clear question is a strong indication of an inability to manage a software dev team effectively :)


That's a very open-ended question and not specific at all. Expecting a specific answer to that is just going to lead to disappointment for you and waste the time of your candidates.


You are missing the point that software engineers pay close attention to all the inputs in a question. If you provide a specific funding level, that is a significant data point :).


This is an extremely general question with undefined scope, and is the absolute opposite of "a very simple single instruction".


You're at least making this an easy decision for the parent, in this hypothetical situation :)


There's often an inverse relationship between simple to ask and simple to answer.


It's because he wants to focus on building the thing, not focus on acquiring VC funding.


You are really asking to be bullshitted (even if people had a good answer, they wouldn't have an elevator pitch to tell you in an interview). Unless you are looking for people who can make an argument, you are selecting for the wrong trait.


> You are really asking to be bullshitted

The question is rhetorical, so what does this matter? It is part of a job interview. It is not a VC pitch. If this confuses a candidate and they answer a question not asked I would not hire them. Interpreting and following instructions are important skills.


> so what does this matter?

You are selecting for the "can bullshit people on short notice" ability. It is very likely not correlated with the other abilities required for the job; thus, if it is not required, you will very likely fail the stronger candidates because of it.


Creativity is directly correlated with problem solving.


And dishonesty correlates with what?


What dishonesty? The entire thing is a rhetorical exercise on the fly.


Communication is also an important skill. Your question poorly communicates whatever information you are trying to glean from the candidate.


> "how would you build it?"

Where is the complexity in the question and how is it poorly communicated?


It first requires an answer to the question "build what?" before the candidate can talk about "how". Unless they would build everything using the same tech, which would also be an odd signal. Your putting a specific amount of money on it adds additional concerns on top of that.

Do you expect the candidate to talk about some tech stack and/or development process? Do you expect the candidate to describe what they'd work on (then why don't you ask that)? Do you expect the candidate to treat the question as "if you could work on whatever you liked", or is it "what kind of product would you build", and then talk about the process? Do you expect the candidate to ask for clarification, or is that a bad sign already (since they were too dumb to understand your "simple question")?...


If the candidate is hopelessly stuck in a state of circular analysis paralysis, then the quality and thoroughness of their answer will ultimately suffer. It would demonstrate either poor decision making, weak communication skills, or limited comprehension.

There are a lot of candidates who need to have their hands held by a parent authority or are hopelessly incapable of forming original ideas. This is a means of identifying such people so that they can be removed from consideration.

> Do you expect the candidate to ask for clarification, or is that a bad sign already (since they were too dumb to understand your "simple question")?

If I had to answer the question for the candidate, then yes, I would consider that a bad sign. It is a simple hypothetical question demanding a hypothetical response.


As an open-ended question, you gain insight from the candidate's response into the scope at which they think about things (and whether that matches up with their experience)... also, 100K isn't much, so it should force some critical/creative decisions.


The key thing missing from the question is what the thing being built is. Any answer to the question of how to build must (to be coherent) address what is being built, and anyone who has a what to build and a how to build it for a revolutionary product that takes only $100,000 in VC capital should be shopping that for financing, not looking for a technical gig working on someone else's (probably not revolutionary, certainly more expensive to build, and even more certainly providing the candidate less equity) thing.


Presuming this is a technical interview and the skills are already identified and qualified, the "what it is" isn't particularly important. This isn't hiring for marketing or product management. Make anything up. In this case the production process and software organization strategies are what matter.

Attempts to bullshit past the spirit of the question with deflections and non-answers are easily identifiable failures.


> Presuming this is a technical interview and the skills are already identified and qualified the "what it is" isn't particularly important.

It may not be important to the person asking the question, but it's actually quite central to being able to have a coherent answer.

A better question, if the “what” isn't important, is to identify a concrete thing rather than a “revolutionary idea”; if “what?” isn't important, don't ask a question which fundamentally requires having an answer to “what?”.

> Attempting to bullshit past the spirit of the question

Asking a poorly framed question and blaming the person questioned for bullshitting for not correctly deducing the intended spirit certainly reflects negatively on someone, just not the interviewee.


That doesn't make any sense, the question presumes you've already acquired the funding.


No, the response "why would he work for you" presumes that knowing how to build a product after acquiring VC funding would include the skill set or interest in acquiring the VC funding.


No, it doesn't, since even without that they would be better served by looking for someone with that skill and being the technical side of the founding team.


Your second example is a lovely example of how terrible interviewers don't realise how fucking awful they are.


Indeed it illustrates how much we don't need human resource workers.

I have a reasonable amount of experience with scripted questions for phone calls and face to face conversations.

It of course should be "foundation of" not "foundations for", but that aside... ("..." for a short breathing break) your first question has a slash in it? The whole point of preparing questions is to not have such things. After a very tiny number of interviews one would choose between the two.

I would smile at the second question and politely explain why I thought the job was interesting in well prepared words, then ask "Why do I want to work here? How long have you worked here? Why do you work here?"

I would totally expect an answer (only) to the second question. To which I would respond by asking if they enjoyed it.

By that time the vc question will seem alien to the context of the conversation. If you bring it up again I will ask if you ever applied for jobs while working there.

Then I will openly mock you by arguing it would be interesting just to see what kind of questions they ask.

After "I have other interviews to do today" to crank some urgency into the conversation, I will explain why I thought this job was more interesting than the others (same as before only more contrasting than generic), how it fits with what I know, my expectations for further development of these skills and what interests me about the product.

If nothing serious comes out in response I will leave "expecting to hear from you soon".

You will be in awe at how fast that went.

Then I will set fire to your bikeshed. (haha)


:)


No interviewing system is perfect or even good. Having said that, interviewing, in my experience, isn't actually the biggest problem companies have with hiring. The biggest failures come when you have that realization that someone isn't working out; most companies don't have the stomach to deal with the problem decisively through candid feedback/conversation, or possibly letting that person go. This natural shyness around confrontation hurts everyone involved, including both the employee in question and all their team members.


Interviewing is subjective. It’s about sitting down in a specific context and extrapolating, based on incomplete information, about a future set of conditions that will likely change when time catches up to them.

If we can start with the idea that interviews are a very blunt instrument, the process becomes at least somewhat comprehensible: it selects for people who recognize the parameters and perform well within that context. At best, that pattern-matches to success. At worst, it filters out talented folks who don't interview well, but could code well.


"Best people" and "good fit" means they are looking for someone with both technical and soft skills. They don't want a skilled worker who is difficult to manage or communicate with, and they don't want a nice guy who is not skilled. It makes sense to me.


I disagree with the reasoning here. Just because the interviewer says 1 or 2, doesn't mean that they're not actually or also looking for 3. We humans tend to be imprecise in understanding and stating our own motivations.


The best interview I had (for a lead games programmer position) was a written test, I can't remember the exact details but perhaps 24 pages and 45 minutes to complete it. Before starting I flicked through the entire test to get a feel for what was being asked of me, and then for a little while debated just getting up and leaving the room. It felt a bit crazy, a test on all sorts of disciplines of games programming that I wasn't an expert in, and I didn't intend to become an expert in (and the job I was applying for didn't require me to be an expert in). But I calmed down, tackled the test in the best way I could, answered the questions in an order where my most confident answers (and quick to complete) answers were done first, then moving onto ones I had a pretty good idea of, then ones I knew less well, etc.

40 minutes in, the CEO of the company came in; it is a ~300-man company, so it was interesting in itself that he came in. Anyway, he wanted to go through the test. I said I hadn't finished yet, had made a note of the start time, and still had 5 minutes remaining. He said it doesn't matter, let's have a look. And he flicked through the test and didn't look at my answers; instead he found a section of the test which I had completely ignored (it was to do with AI and pathfinding, both topics I have done almost nothing on during my entire education and career). Of course, this was the question he wanted me to answer. I explained the reasons I hadn't answered it, and said if they wanted me to do things like this it really wouldn't be a suitable job for me. But he persisted: stop worrying about all of these things, just answer the question now. Again I had that slight feeling of wanting to just leave, but again I overcame it.

I started talking him through the things I did know about the question, pointing out areas which could cause problems, then started listing what I could do to limit the question to avoid some of these issues (it had a picture of a top-down level, the question was inside a box; I said let's initially forget about the box, for example). Then I started just talking about an initial algorithm which could route AI around the level. I said it would obviously be really bad (basically AI just walking into things, then working out where to go next, and then a bit later saying oh, I could keep track of where I have been in case I end up at the same point, etc.). We talked more about it, he asked some questions, and bit by bit I came up with a solution.

He said that's interesting, because during this process you have described parts of various algorithms which someone who has studied AI would know about, but you are approaching it from a brute-force perspective, not having any waypoints in the level, no extra level knowledge. I said I didn't realize I was allowed to do that; if I could, I could potentially come up with some nodes in the level and, rather than bumping into objects, go between nodes, work out the distance between nodes, and build up a graph so you know the best way to get between points. Again he was happy; I was describing, or partially describing, an actual solution. His next question was how I would work out where to put the waypoint nodes, and again I just started looking at the image, thinking about good positions for the waypoints, looking at the normals from each wall face, and started to see that putting waypoints as far away as possible from all normals had some advantages, and bit by bit came up with some sort of solution for how to do this.

By the end of it, there was a solution which involved preprocessing the level data offline, using this data to move around the level when the game runs, being able to handle paths becoming blocked (or new paths being opened), and strategies for running this on a separate thread or CPU, asynchronous to the main game. It demonstrated knowledge of the full pipeline for making a game, from artists making levels to the game running at 60Hz on a PC/console, and also took human resources into account, i.e. we could do things this way, which would give us a slight edge, but it would make the designers' lives a lot harder, so I'd probably not do that and take the performance hit, which is small and can probably be won back by the level designers having extra time to optimize anyway.

I was offered the job; ultimately I had a better offer elsewhere so I didn't take it, but for me the test made a lot of sense. My day-to-day job (Tech Director of a 12-man games development studio) is constantly having to solve problems which initially do not appear to have an answer. The ability to break down an issue, not panic, look from various different perspectives, build up a solution, and have a good feeling for what parts of the solution are weak and need further research, to me, makes a huge difference between an 'ok' programmer/developer and an exceptional one. I haven't used the same test approach, but I have certainly learned a lot from it when hiring people: never trying to trick interviewees, quite the opposite, trying to give them as many options as possible, and just urging them to show me how they would deal with day-to-day issues and come out on top. I certainly worry that whilst I got through, some people really would have just walked out, and in some cases you could argue those people wouldn't work well when faced with tough problems and lots of pressure; but on the other hand, outside of the testing environment they'd be fine. Anyway, I personally wouldn't want to create quite as much stress/pressure if I were to use this technique in future.
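
(For anyone curious, the core of what we converged on is just shortest-path search over a waypoint graph. A minimal sketch in Python, with made-up waypoint positions and edges; the real solution precomputed the graph offline from the level geometry:)

    import heapq, math

    # Hypothetical waypoints; in practice these would be generated
    # offline from the level geometry, not hand-written like this.
    waypoints = {"A": (0, 0), "B": (5, 0), "C": (5, 5), "D": (0, 5)}
    edges = {"A": ["B", "D"], "B": ["A", "C"], "C": ["B", "D"], "D": ["A", "C"]}

    def dist(a, b):
        # Euclidean distance between two waypoints, used as edge weight.
        (x1, y1), (x2, y2) = waypoints[a], waypoints[b]
        return math.hypot(x2 - x1, y2 - y1)

    def shortest_path(start, goal):
        # Dijkstra over the waypoint graph; at runtime the AI just
        # walks node to node along the returned path.
        pq = [(0.0, start, [start])]
        seen = set()
        while pq:
            cost, node, path = heapq.heappop(pq)
            if node == goal:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt in edges[node]:
                heapq.heappush(pq, (cost + dist(node, nxt), nxt, path + [nxt]))
        return math.inf, []

    print(shortest_path("A", "C"))  # (10.0, ['A', 'B', 'C'])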


When interviewing potential employers, I ask them

1. Do you conduct regular peer reviews of your documentation and code?

2. Do you have a bug tracking system?


I've accumulated so much material about interviewing I've thought about writing a book. No one would read it, of course, because everyone thinks they're an expert at it already.

In my experience, 9 out of 10 interviewers are terrible. They come up with assumptions before talking to a candidate, then come up with even more assumptions during the interview. These assumptions are almost always unchecked, meaning that they never ask a single question to verify their guesswork.

Perhaps the most critical failure, which this article touches on, is that people can't adequately describe why their hiring successes were successes and why their hiring failures were failures. If they could, they'd understand just how much of their interview process (such as it is) is a crapshoot. They have no conscious understanding of what it takes to be successful at their company, because they've never thought about it in any serious way.

They ask for code samples, but never look at them. What's really important is that you know all of their current technologies and use them in exactly the way that they currently do. So to determine that, they sometimes give out poorly-conceived coding assignments that are either testing something completely irrelevant or so large in scope that you are basically rebuilding half their product.

They ghost candidates, even candidates that have gone through multiple rounds of interviews, forgetting that people who interact with their company go out into the world and form opinions and things to say about them. This is especially true of candidates who are really engaged with your company's mission. (Besides, it's just rude.)

The best predictor of future behavior is past behavior, except when being interviewed, where anything you accomplished at a previous company is viewed with the detached impatience of someone eager for you to stop talking so they can get to their questions. Listening to what people actually say, instead of just your impression of what they say, means listening to the motives, emotions, and body language surrounding what they're saying and asking questions that get at the heart of those. It's the single most effective interviewing method, but almost no one does it.

Because I hire others, when I am being interviewed, a popular question is "How do I hire?" Everyone is looking for a solution with the appearance of a rigorous methodology. If I wrote that book, I'd have to sell a snake oil solution that guarantees great hires, because no one wants to hear the reality: that it requires experience, empathy, and active listening, and there's no one-size-fits-all solution for each person. People don't like nuance; they want a simple, step-by-step guide, not some amorphous process that might take a lot of self-work.


Using proxies for competence is a slippery slope towards misdirection and irrelevance - and as a result, incorrect hiring decisions.

In another thread today, someone mentioned they reject every CV that has two typos or more. It's a proxy, and it is irrelevant for the job.

Once you start going down the path of proxies, you add more and more far removed ones. To keep hiring performance high, resist adding the first proxy. Actually ask questions relevant to past work experience that maps to the actual job. Maybe do work samples.

To circle back to the two questions in the blog post, no interviewer can answer them because they use proxies too far removed.


Think of the resume as a deliverable. This potential employee sent out the deliverable without reviewing it for spelling and grammatical correctness. They almost certainly did not ask someone else to proofread it and suggest corrections.

Would you want to put this person in a position where they may have to deliver binaries, code, or other documents to a customer? Would they ignore or skip steps in your delivery process, too?


Agreed. Work samples are the most predictive form of assessment, you just need to be able to isolate them from other factors such as confidence bias...

i.e. no whiteboard interviews (unless hiring for someone whose job it will be to solve problems on whiteboards in front of strangers, in situations that affect their financial security and sense of worth)


If there is a gross lack of diversity within your company and you routinely use "culture fit" as an excuse to turn down PoC candidates, what conclusion can be drawn about your organization?



