When hiring senior engineers, you’re not buying, you’re selling (hiringengineersbook.com)
973 points by ashitlerferad on Jan 20, 2019 | 662 comments



A lot of this post seems pretty reasonable. But:

In my experience, it’s fairly easy to judge technical skill. A friendly conversation about technical interests and recent projects can often be enough.

Bullshit. Sounding credible in technical interviews is a skill, not the same skill as actually being a good programmer, and might even (statistically, in the large) be close to orthogonal to it.

We found this out the hard way. At Matasano, we started our work-sample hiring process[1] as a way of filtering out the smooth-talkers. Before work-sample tests, we'd spent loads of time on carefully designed interview questions; interview design was something close to a hobby for some of us.

Of course, it was only after we started doing work-sample challenges that we discovered that not only were a lot of excellent-seeming candidates actually not capable of delivering, but an even greater fraction of the candidates our "friendly conversations" were selecting out were in fact perfectly capable. It was a bad deal all around.

Whatever you do, don't fast-path "senior" developers. Everyone should run the same process for the same job. Not only do you risk hiring people who won't work out, but you're also depriving yourself of the most important data you need to iterate on your hiring process.

[1]: https://sockpuppet.org/blog/2015/03/06/the-hiring-post/


I disagree. If you can’t tell the difference between a smooth talker and strong technical competence you are probably interested in the wrong qualities. I have interviewed enough now to see why some companies cannot figure it out. Ask yourself if you really actually want a senior or a strong junior.

It comes down to interest. A good senior got that way because they like solving challenging problems. They are not interested in trendy framework bullshit. In other words talk about the problems and potential solutions. Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

When I hear companies try to sell me with frameworks and process I know they are blowing smoke. At the very least they are boring and at worst you will be working with incompetent people who are as self-deluded as the company. I agree that filtering candidates is a good idea though.

The reason why some companies cannot figure this out is because they don’t value the problems at hand. They need bodies to put fingers on keyboards and are willing to pay more for people who don’t completely suck. Experience and competence are not the same as excellence but considering the candidate pool I can see why companies compromise on quality.


I hear what you're saying, but I think you're maybe creating a straw man argument here. A smooth talking engineer isn't one who sounds like a used car salesperson or is just hyped about the latest trendy framework.

There are people who can talk through challenging problems at their former companies and how the problems were solved. They can tell you everything you'd want to hear because it's true. Except…they didn't implement it. Maybe they are best friends with the person who did and understand in detail the tradeoffs and the neat hacks and the insights learned along the way, but couldn't build it themselves.

Those are the "smooth talkers" of the engineering world. Those are the people you can't catch just through a verbal interview.

On a related note:

> If you can’t tell the difference between a smooth talker and strong technical competence you are probably interested in the wrong qualities. I have interviewed enough now to see why some companies cannot figure it out. Ask yourself if you really actually want a senior or a strong junior.

My dude.

Look at who you're replying to.


> Those are the "smooth talkers" of the engineering world. Those are the people you can't catch just through a verbal interview.

I agree with this. I was a hiring manager, and there are those that can really talk technical, in detail. You really think they know what they are doing, how to solve complex problems, how to come up with solutions. You put a keyboard in front of them (or pencil and paper), and they go "uhh, errr, ummm" and fail miserably.

I think until you have interviewed a LOT of people, it can be hard to quickly spot this. Some people are masters at telling you how someone else solved the problem as if they solved it, but they can not solve it themselves.


and don’t forget the reverse problem mentioned originally: false negatives.

you can have someone who is a whiz at practical and specific solutions, who thinks critically and analytically and just gets an enormous amount done WELL. And empowers those around them to boot!

they have the reverse problem to speaking about other people’s work as their own. instead, they speak of their own work as teamwork.

this affects many great people. also women and poc are particularly likely to do this because they have been socialized to not speak too highly of themselves. “model minority” etc.

if you as an interviewer are already skeptical of what someone says, you will increase false negatives with people who you are asking to verbally “prove” their work and yet have cultural memories of being penalized for “bragging”. they’ll describe a solution and downplay how challenging or hard it was because women aren’t liked when they’re the smartest person in the room, etc.

an interview process should seek to understand many skills: practical, implementation, execution, problem solving, design, high level, communication skills.

a varied process that focuses on a few specific skills, one at a time, is likely to convey the most accurate signal.


The problem with hiring is that a false positive is much more damaging than a false negative. Getting the group of people together to vet a candidate is expensive; recruiters are expensive; for the candidates, taking the time off is typically pulling from a very limited bucket of just a few weeks every year; flying people in to interview is expensive; and ultimately, to go through all that and hire someone bad makes you go through the whole process again. If it takes you a few months to figure out it's not going to work, it's unlikely anyone desirable from your original candidate pool is still available.

False negatives are expected, and honestly probably good overall and in aggregate, because they decrease the odds of a false positive. One of my first bosses who involved me in the hiring process told me one day that the point of interviewing is not to find reasons to say yes, it's to find a reason to say no.


The "find a reason to say no" can be very damaging as well if taken to the extreme.

I've seen people that were entirely qualified for the position be rejected at the company I work for because they made some totally understandable mistake - I'm talking about people that took the time off to take a 5-hour on-site coding assignment, and made a mistake but would have had a passing (and possibly good) grade on an academic evaluation.

And now we have 5 open positions and nobody hired for them.


> I think until you have interviewed a LOT of people, it can be hard to quickly spot this.

What is your threshold for "a lot"? I have given a few dozen interviews. I always ask at least a couple of background questions. Those who even have a polished delivery for that part are a minority, and the couple that tried to bullshit me were painfully transparent. Maybe I just haven't done enough yet.


These problems are very simple to solve. Get them to go into detail. Get them to explain their thought process while looking to solve the issue.

If you feel that they're talking about a problem someone else solved, ask them that directly. (Did you work with others? etc) If they're lacking on the technical details either it's been a long time ago or they didn't do it.


> explain their thought process while looking to solve the issue

I would think they and their colleagues discussed alternatives together, collaboratively, in for example a Slack chat — so an interviewee can give you good replies about the thought process and alternative solutions that were considered and discarded. I would assume. Or maybe the interviewee him/herself came up with ideas that his/her colleagues realized weren't going to work, and explained why, for him/her. Then s/he might be really good at describing the thought process.


If they understand their friend's work deeply, doesn't that imply they've done something comparable themselves?


No, let me give a specific example. Imagine you're interviewing a candidate and they're talking through how to design an analytics service. They begin talking about e.g. database architecture, and how this type of data is most appropriate for a star schema. They start talking about the tradeoffs of row versus column orientation. They mention they'll need to do indexing for performance and talk about the index space versus query speed tradeoff. They say they'll do joins on the x and y tables.

Basically, they volunteer technical challenges they're aware of while simultaneously telling you what the high level solution is. But then you put a terminal in front of them and ask them to set up Postgres in a star schema with some dummy data, and then to write a query joining the two tables they were talking about before. Despite Postgres being on their resume, they'll completely flounder and not even know they need semicolons to terminate commands. Their joins won't just be wildly inefficient, they'll be syntactically incorrect and refuse to run. They won't be able to create, insert, select, truncate, drop, etc. They don't know how to create an index and can't mention any of the options for indexing, let alone the default provided by Postgres.
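
To make that gap concrete, here's roughly the level of hands-on task I'm describing. This is only a sketch: sqlite3 (Python's stdlib) stands in for Postgres so it runs anywhere, and the schema and names are invented.

    # A rough sketch of the task: a tiny star schema, one index, and the join
    # the candidate was just describing. sqlite3 stands in for Postgres;
    # all table/column names are made up for illustration.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE dim_user  (user_id INTEGER PRIMARY KEY, country TEXT);
        CREATE TABLE dim_page  (page_id INTEGER PRIMARY KEY, url TEXT);
        CREATE TABLE fact_view (user_id INTEGER REFERENCES dim_user(user_id),
                                page_id INTEGER REFERENCES dim_page(page_id),
                                viewed_at TEXT);
        CREATE INDEX idx_fact_user ON fact_view(user_id);  -- index space vs. query speed
    """)

    conn.executemany("INSERT INTO dim_user VALUES (?, ?)", [(1, "US"), (2, "DE")])
    conn.executemany("INSERT INTO dim_page VALUES (?, ?)", [(10, "/home"), (11, "/pricing")])
    conn.executemany("INSERT INTO fact_view VALUES (?, ?, ?)",
                     [(1, 10, "2019-01-20"), (1, 11, "2019-01-20"), (2, 10, "2019-01-21")])

    # The join they were talking about: page views per country.
    for country, views in conn.execute("""
        SELECT u.country, COUNT(*) AS views
        FROM fact_view f
        JOIN dim_user u ON u.user_id = f.user_id
        GROUP BY u.country
        ORDER BY views DESC;
    """):
        print(country, views)

    conn.close()

If someone can talk star schemas but can't produce anything in the neighborhood of that, with syntax help allowed, that's the signal.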

Keep in mind this example is just meant to be illustrative. Thinking through how to fix the scenario might not generalize to all the ways this can manifest. The kernel of how this arises is a person like so:

1. They read a lot about technical solutions at a high level. They can follow that if you have problem A then you need, roughly, solution B.

2. They have no contextual flexibility or practical foundation for understanding their solutions. They might have read Designing Data Intensive Applications, but they can't actually code and have never administered a database. To the extent they understood the book, they only internalized low hanging fruit.

3. They are charismatic, or at least comfortable talking about technical topics. They will try to lead the conversation as much as possible, which is where you see them volunteering technical challenges and then offering solutions. But if you force them to answer heavy technical questions which drill deep into a specific area, they'll probably try to zoom back out.


I may be mischaracterizing your point but there have been many times that I’ve sat down at a terminal and not been able to remember how I did something a few months ago. On the job the important thing is knowing the high level goal and the fundamentals of what you’re doing. The syntax can be easily googled.

edit: yeah upon re-reading you’re talking about completely obvious lack of practical experience... fair point


I have come across individuals like the ones you have just described multiple times. They are very good at talking in detail about a tech subject, charismatic, defensive and argumentative all day long over their solution even though it has lots of holes. But when it comes to actually implementing a task, they usually fall short. They will quickly grab an existing/similar solution from e.g. GitHub, spend many hours trying to understand it, then copy-paste it and present it as their work (a half-baked solution).

Even though the whole thing could be very simple. Another thing is, they will come up with reasons that the issues with the user story/task or code/solution are due to the environment or some other reasons like tools, frameworks, scalability and "bs".


Wow, I feel like I'm the sort of person this comment is calling out. What advice would you give me so I can be the real McCoy? The only solution I can think of is to keep writing as much code as I can, so I can get real experience instead of just hot air.


Turn off your internet when doing work. Buy a stack of Postgres books. Memorize them. Never, ever, ever use Stack Overflow. EVER. Don't use Google. Memorize everything.

That's what that asshole is looking for. :)

Though to be fair, if you come up with that strategy and can't do -anything- at a sql console, I'm going to ask how you normally interface with the database, because that's like a Linux expert not knowing how to use tar or ls or something.


Most humans use a GUI when they can (with their connections stored, autocomplete, and a few other niceties). Of course if you only ever use MySQL or Postgres through the command line, congrats - you know how to do basic connections (I assume using -h makes you an imposter).

As for the Linux/tar piece, I've used Linux on desktop for a few years (both Ubuntu & Fedora) & have used SUSE and CentOS for servers for much longer.

I can tell you tar means tape archive. I can tell you I mostly use it with gzip to compress it. I still google/reverse terminal search what flags to use with it both when archiving and unarchiving. I could probably remember some flags if I spent enough time thinking - why would my brain waste that much effort though?

I don't work as a backup administrator. I have better things to worry about knowing/having present at the forefront of my (admittedly human sized) memory.


> That's what that asshole is looking for. :)

Do you suppose you could have made this point without calling me a name?


I don't really intend to call out anyone, so please don't feel that way. Keep in mind what I'm trying to convey is someone who has a lot of broad (not deep) book knowledge, but who can't do even basic practical implementation.

If you can explain how to design a system and you can do it, but don't know the exact commands off the top of your head, my comment isn't describing you. I don't expect people to e.g. know awk like the back of their hand, or to write perfectly compiling code on their first try.

But even if you don't have perfect recall of the commands, it should be pretty clear whether or not you've ever opened an editor and done basic implementation. If the GUI is your thing that's fine. But your knowledge must have some practical foundation which demonstrates you can actually walk the walk.


Writing more code is, indeed, the solution. This may feel challenging or even impossible because it takes away time from doing the research of actually understanding why everything works. I.e., due to time constraints, you may need to implement things more often without understanding how they work. This might make you less good at talking about solutions, but on the other hand better at actually providing them.


> If they understand their friend's work deeply, doesn't that imply they've done something comparable themselves?

Imagine you're interviewing a candidate and they're talking through how to design an analytics service. They begin talking about e.g. database architecture, and how this type of data is most appropriate for a star schema. They start talking about the tradeoffs of row versus column orientation. They mention they'll need to do indexing for performance and talk about the index space versus query speed tradeoff. They say they'll do joins on the x and y tables.

Is this demonstrating deep understanding though?


Well, do you think my comment demonstrates deep understanding of database design? I don't feel I have deep understanding of databases, but I can certainly talk to you about very basic things like indices and joins.

Basically it's like someone else said. They read a book and know a lot of answers, but they can't do the most basic implementation of a solution.


> Well, do you think my comment demonstrates deep understanding of database design?

No, it seems about on the same level as being able to paraphrase the abstract of a paper about the system. I would not take it as showing that someone has read past the first page. A high-level overview just isn't enough for that. You have to ask your own probing questions too. Limiting the conversation to the particular problems they bring up is essentially taking them at their word when they claim to be skilled. I've seen lots of occasions where trying to drill down for a bit more detail on some part of what they talked about consistently came up empty (without going anywhere near sitting down at a computer to write a fizzbuzz equivalent).


How do you suggest getting into designing and working with distributed systems without job experience? Maybe virtualizing a data center on a home server?


There is a huge gulf between being able to follow a thought path and being able to find it oneself. I couldn't do the same work in the same timescale as them, I wouldn't make the same decisions or spot the same pitfalls. I am simply not as good. But I spent a bunch of pub/work time discussing it and could project an aura of authority on the projects if I wished.

Look at it this way. I was able to understand all of the maths proofs taught at my degree, but I could not have come up with them myself.


Yeah, that's a great analogy for the core problem.

Imagine putting a math problem in front of someone and asking them to solve it. They correctly identify it as a system of linear equations. They volunteer that they would solve it using x algorithm which has a time complexity of y.

Then you ask them to actually solve it, and they can't even make the first movement towards doing so. They mentioned LU decomposition, but they can't even do Gaussian elimination on paper. They don't know what elementary row operations are. They can't obtain an augmented matrix or put it into (reduced) row echelon form. They don't know anything about linear independence or the rank of a matrix. You put an inconsistent system in front of them and they keep banging away at it, determined to find a solution...etc.
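
To make the "actually solve it" half of the analogy concrete, this is roughly all you'd be checking for; a tiny sketch with invented numbers, using numpy:

    # Solving a small linear system, and recognizing an inconsistent one,
    # instead of just naming LU decomposition. Numbers are made up.
    import numpy as np

    # Consistent system: 2x + y = 5, x - y = 1  ->  x = 2, y = 1
    A = np.array([[2.0, 1.0],
                  [1.0, -1.0]])
    b = np.array([5.0, 1.0])
    print(np.linalg.solve(A, b))

    # Inconsistent system: x + y = 1 and x + y = 3 cannot both hold.
    A2 = np.array([[1.0, 1.0],
                   [1.0, 1.0]])
    b2 = np.array([1.0, 3.0])

    # Rouche-Capelli: a solution exists iff rank(A) == rank of the augmented matrix [A | b].
    augmented = np.column_stack([A2, b2])
    if np.linalg.matrix_rank(A2) == np.linalg.matrix_rank(augmented):
        print(np.linalg.lstsq(A2, b2, rcond=None)[0])
    else:
        print("inconsistent: no solution, stop banging away at it")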

That's what it's like interviewing one of these senior engineers. It's surreal - they confidently pattern match the problem using limited heuristics, and they toss away low hanging fruit to demonstrate knowledge. But when you ask them to do something practical and specific, they either refuse and zoom out into abstract-land again, or they hopelessly fail.


so for data scientists I guess you could ask them to hand calculate the variance and standard deviation for a sample and see how they do on that
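
for what it's worth, the hand calculation is small enough to spell out; something like this (sample numbers made up) is all you'd be checking:

    # Sample variance and standard deviation by hand, with the n - 1
    # (Bessel-corrected) denominator. The data is an arbitrary example.
    xs = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

    n = len(xs)
    mean = sum(xs) / n                                  # 5.0
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)    # 32 / 7 ~= 4.571
    std = var ** 0.5                                    # ~= 2.138

    print(mean, var, std)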


Unless they were pair programming through that problem. They can't and won't. They won't have the hard lessons that it brought.


"Principal engineer"? Do you actually have an engineering degree? Have you formal education in engineering?

What is system engineering?

It's not just you; the entire IT industry is suffering from systemic curriculum vitae bloat. That's why the working conditions are so bad in the professional sense.


The subject of what constitutes an engineer or what entitles a professional to the "engineer" title has been discussed ad nauseam here on HN.

Spoiler: "engineer" as a title for someone who does computer programming and software development without a license is perfectly fine and acceptable in the USA. In Canada, however, it's probably not.


You say the USA as a whole, but IIRC there are a couple states that do require it to be able to use the title "Software Engineer".


Those states you mention do require it, but what I was getting at is that in general terms Americans don't blink when someone refers to an engineer without reference to licensing.


That's why? So if we merely fixed job titles we wouldn't have this problem of fraudsters?


It would be a good first step. There is lots to clean up in the IT industry.


Hate to break it to you, but I worked for a well-known civil engineering consultancy and some of our pitches were economical with the actuality about our engineers' experience :-)


> "Look at who you're replying to."

> "Principal engineer"

"Look at who you're replying to." meant "tptacek" or "tyre"? Or both - a general advice?


I do; in my home country it is not legal to sign off projects as an engineer if not certified as such by the Engineering Order.

A certain level of quality is expected, not labeling oneself an engineer after a 1-month bootcamp.


I have no idea if there are rules in my country about calling yourself an "engineer", but you can only use the title 'ir' or 'ing', both of which mean ingenieur, which translates to engineer, if you've graduated from a technical university or high-level technical school (technical college? higher trade school? 'ing' is from a level lower than a university degree).


> What is system engineering?

https://www.incose.org/systems-engineering


I know what it is as I am an actual system engineer; the question was for our self-proclaimed "principal engineer" to see if he has a clue.


> Have you formal education in engineering?

Good point. In many countries getting a degree in engineering takes way, way more effort than in computer science.

Then it takes 10-15 years of work to be called "senior engineer".


And from family experience (UK) it's more a case of who you know, not what you know.


This is a good post. Having been in the UK startup environment, it was crazy how many companies got funding but didn't have a strong senior in their software department.

Because of this their hiring process was aimed at hiring more "strong juniors" even though they didn't realize it. A strong senior is a person who completely changes how you are even approaching the problem or someone who shows you problems you hadn't seen before. They often see tech in the context of business as well.

As you said, usually they can't be bothered about learning the new framework of the month or new backend language of the month, but they have a set of battle-proven tools to solve tech problems for businesses.

P.S: I do find it hard to distinguish between bullshit talkers and people who actually get stuff done in the interview process.


There are times when I think I'm coming across as the bullshitter. "Tell me of an accomplishment you're proud of". I struggle with this one, but I've given this example years ago.

Worked at a company which did nightly data imports. Things worked until 'companyx' became a client, and the imports were huge. They would take 18-20 hours. Then longer. Eventually they were touching the 24 hour mark - unacceptable. The client's data would be more than a day behind. Granted it was a moderate amount of data, but it shouldn't have taken that long.

I was 'new' there - only started a month before - and the rest of the team who'd put this together had been there a year or more. I reviewed what was in place, took a couple of days, and got it down to an hour. Then worked with the existing team and we got it down to under 30 minutes with some tweaking.

I do see some eyes rolling when I tell that, as I know it can sound terribly self-aggrandizing. However, I had a decade of experience at this point, and the rest of the team was just out of college; they'd never faced this problem before. I basically just took the data and imported it in small chunks into in-memory tables (to avoid hitting the disk), copied those to disk every X rows, and dropped indexes until everything was done. It wasn't rocket science, but it did take someone who had a deeper understanding of DB mechanics.
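
Sketched very roughly (the real system wasn't sqlite; that and the table/index names just stand in here), the shape of it was:

    # Drop the index up front, stage rows in an in-memory table, flush to the
    # real table every N rows, rebuild the index once at the end.
    import sqlite3

    CHUNK = 10_000

    def flush(mem, disk):
        disk.executemany("INSERT INTO orders VALUES (?, ?)",
                         mem.execute("SELECT id, payload FROM staging"))
        disk.commit()
        mem.execute("DELETE FROM staging")

    def bulk_import(rows, db_path="imports.db"):
        disk = sqlite3.connect(db_path)
        disk.execute("CREATE TABLE IF NOT EXISTS orders (id INTEGER, payload TEXT)")
        disk.execute("DROP INDEX IF EXISTS idx_orders_id")   # no per-insert index maintenance

        mem = sqlite3.connect(":memory:")
        mem.execute("CREATE TABLE staging (id INTEGER, payload TEXT)")

        buffered = 0
        for row in rows:
            mem.execute("INSERT INTO staging VALUES (?, ?)", row)
            buffered += 1
            if buffered >= CHUNK:
                flush(mem, disk)
                buffered = 0
        if buffered:
            flush(mem, disk)

        disk.execute("CREATE INDEX idx_orders_id ON orders(id)")  # rebuild once at the end
        disk.commit()
        disk.close()
        mem.close()

    bulk_import((i, "row %d" % i) for i in range(25_000))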

As I'm telling this, I always realize they have no way of verifying this, and essentially I'm just another bullshitter. The more believable I sound, there's an equally high chance I'm either really good, or just a really good bullshitter, and nearly every time, the person I'm talking to has no idea how to tell the difference. It's worse as you get older, because the younger folks just think you're waxing nostalgic about the 'good old days'.


It doesn't help that in our field the context for many of these past achievements is quickly lost and forgotten.

You can't just say "I did X in Y by using Z": you need to begin by explaining that once upon a time there used to be a thing called Y, and on that thing it used to be very hard to achieve X, but in those ancient days there was also a tool called Z, etc.


This could be an example of the start of a competency-based interviewing question. These questions usually start with something like "tell me about a time when". You ask for an example of them displaying some trait you care about, and then dig into the details of what happened, why they did what they did, etc.

CBI is a fairly effective technique for general interviewing, because you uncover how people actually behave rather than how they like to think they behave. Most of the gold is in the follow up digging questions, which should separate the bullshit answer from a real one.


I think it's also important for seniors to look at new ideas on a regular basis. People who don't tend to overrate the "good old [insert language/framework here]". You should not stop improving yourself just because you are already able to get stuff done. (Example: Java introduced lambda functions in Java 8 and many Java seniors are still not using them)


I think the difference a senior person can make is to see what things the new language/framework/paradigm still doesn't solve and think of ways to evaluate their risk to a particular project's requirements and how to mitigate that risk. Things like downtime for upgrades or migrations, handling dependencies on ecosystems of community-developed libraries without quality and feature change controls, the ability to update the system when it's been in production for 3 years, few changes have been made, and the original developers have all moved on, etc.


> A strong senior is a person who completely changes how you are even approaching the problem or someone who shows you problems you hadn't seen before.

I think this is overstated. Disruption for the sake of it is often not that helpful in the context of the business (although, in fairness, you do go on to state they often see tech in that context).

I much prefer people who are delivery focused to those who are overly idealistic or want to change everything out of the gate: a good senior understands priorities and knows when to make a trade-off to live with a sub-optimal situation or solution in one area in order to deliver greater value elsewhere.


> I much prefer people who are delivery focused to those who are overly idealistic or want to change everything out of the gate

Do you have problems to solve or not? Often the problem isn't really a problem, except that your current 'solution' is making it so.

I've untied a lot of Gordian knots in my career. It really is a thing.

And if you don't have big problems to solve, why do you want a senior developer then? Just hire a junior and keep on going as usual.

Delivery focus is good, till you realise that it's often just delivering status quo for years on end.


Great post


The higher you go the more compromises you have to make.

That said, nobody is talking about disruption, just wisdom.

It just so happens, sometimes that wisdom will tell you that shipping shit out the door in the name of delivery focus is going to cost you more than it's worth in the long run. Calling them overly idealistic to justify your laziness just makes you look bad.

I'd say a truly good senior can tell the difference between a startup and a mature company and adapt accordingly. It takes one set of skills to ship shit out the door as quick as possible and an entirely different set of skills to come in and clean up the mess the kids left behind.

This is why you are selling. Not buying. Because nobody wants to clean up your delivery focused mess.


or... nobody wants to pay to clean up the mess. I know people who will do it. I will do it, but I can't clean it up, and deliver loads of shiny new features, and do it all while trying to justify every 30 minute block of time and asking if xyz is 'really' necessary. oh, and you also need to be able to answer some real questions about your own business process, because often your existing practices are just 3000 lines of crap in a function file. when you say "it's broken" and I ask "what's the correct behaviour?" and you don't answer... it'll never get 'fixed'.


I think this falls under the "why are you hiring a senior?" question.

Because hiring a senior to ignore the problem is a good way to waste a lot of cash.


Who's talking about shipping "shit"? You are overreading by a very wide margin.

I also don't appreciate being called lazy, and especially by somebody who knows nothing about my business or my team.

Please try to keep your tone civil in future.


I'm not overreading; you said you prefer shipping sub-optimal solutions. Sounds like it to me.

I didn't call you lazy, I said that shipping in the name of delivery focus was lazy, or at least your argument about idealism was.

Fact is, shipping quality is the far more optimal solution and always will be. Making the trade-off and adding technical debt is never a worthy trade long term. The only people who gain from it are you and your team. The rest of the company eventually grinds to a halt and begins taking more and more shortcuts around the code which just reinforces everything in a vicious cycle.

You might find this useful: https://youtu.be/DngAZyWMGR0


> I'm not overreading, you said you prefer shipping sub optimal solutions. Sounds like to me.

I was talking about prioritisation. I am very wary of (or perhaps jaded by) people who want to fix everything all at once with no regard to the wider effects on the business of doing so.

I didn't mention anything about quality with respect to what we do choose to deliver, although I can assure you that user experience - of which quality is a key facet - is our utmost concern.

I thought my intent was clear, but sorry if not, and hence my comment about overreading. I wrote very little from which you (and you're not the only one) appear to have extrapolated quite a long way.


No. The difference between talking (very intelligently) and doing is huge, and a sports analogy might be illustrative. If you ask detailed questions about football to hire a NFL quarterback, your most intelligent responses will probably be from a coach. “How should you change the position of your right shoulder if you see that a fast edge rusher is approaching you from the left side and you have two open receivers?” A coach like Bill Belichick might have a fantastic, detailed answer demonstrating a thorough understanding of every aspect of football past and present, but he could never make the throw himself.


> hire a NFL quarterback

That analogy highlights what’s irritating me about this post and this entire thread discussion - there are 1,696 players in the NFL, each with an average salary of $1.9 million. When people talk about hiring “senior engineers”, they behave as though they’re auditioning an NFL quarterback - the 1 out of 100,000,000 people who can actually perform at that level. For a salary of, on average, about $100,000. When you start out - offering comparatively little but looking for Tom Brady, you’ll pass over pretty much everybody, because the person you’re looking for won’t just not interview with you, they probably don’t exist. After a few months, you’ll relax your standards, and after a few more, you’ll relax them even more and end up hiring a comparative retard like myself - somebody with only a mere 25 years of hands-on development experience and a bachelor’s and master’s degree in CS along with a couple of publications. But if I presented in the first few rounds of interviews, back when you were looking for the guy who could derive the tortoise and the hare algorithm in 30 seconds in front of five people in a boardroom, you would have passed.


> The difference between talking (very intelligently) and doing is huge

To be fair it is environment dependent more than anything else. Forget competence, charisma, intelligence, and everything else about the candidate that could bias their selection and instead look at the processes and code already in place that the new candidate is jumping into. Does the environment strongly favor original ideas/solutions or does it dictate acceptance only within the narrowest of boundaries?

I have been on both ends of this as well. It is common in software for shops to define success in extraordinarily shallow terms such as whether you are using spaces versus tabs and indenting properly. Another example I experienced last year is whether you should write a giant monster configuration or a small function that receives a single argument. The reason shops coat themselves in blankets of code style and configuration is that they typically don't trust their developers and instead strive for a normalized baseline. They are not looking for wonderful solutions, but rather task completion.

The lack of conformance isn't necessarily an indication of lower capability, but it is an indication of incompatibility. Competence and conformance are wildly out of sync when the candidate is misjudged relative to the work available. That is completely an assignment failure as opposed to a candidate failure. Having gone through this myself, it has taught me to ask very probing questions, as a candidate, during the interview. If, as a senior, I can determine I will not be a good fit I will happily disqualify myself.

> A coach like Bill Belichick might have a fantastic, detailed answer demonstrating a thorough understanding of every aspect of football past and present, but he could never make the throw himself.

Would you really hire a coach to be your quarterback? Is that a thought you would really entertain? Even if that coach could do that job he/she would be more valuable doing other work. I would consider this a solid example of interviewer/assignment failure.


> Having gone through this myself it has taught me to ask very probing questions, as a candidate, during the interview.

Like what questions? I'm assuming the work you're looking for is more along the lines of "wonderful solutions" as opposed to "task completion" - is it as simple as asking, "How anal are you guys about code syntax and whatnot?" and, "How hard are your problems actually?"

Or is it more subtle than that?


Here are some I thought of off the top of my head. These are based upon things I actually encountered.

What if I were to provide a solution that executes much faster, requires less documentation, passes test automation, and is a quarter of the code but ignores the framework or standard code style?

The standard DOM methods perform thousands of times faster than other options for interacting with markup. I can prove this with numbers. Code like that is not popular. Will I be allowed to write unpopular objectively superior code?

If I can reduce the application build from 5 minutes to 5 seconds will you let me rewrite the build from Java to Typescript?

A/B testing is a powerful way to determine preferential user behavior and a measured increase in conversion. Will I be allowed to write inward facing experiments to test developer behavior?

What if I provide a function as a solution that makes use of scope and nested functions but offers no support for inheritance?

Is it better to complete a task in 1 hour with original code plus tests or extend existing components with risk of regression and 4 days of effort?


A couple of these questions would be at least yellow flags for me as an interviewer or hiring manager, as they indicate a strong bias for throwing away existing systems ("ignores the framework", "rewrite the build", "original code").

There's a great quote from Lou Montulli[1]:

> I laughed heartily as I got questions from one of my former employees about FTP code that he was rewriting. It had taken 3 years of tuning to get code that could read the 60 different types of FTP servers, those 5000 lines of code may have looked ugly, but at least they worked.

Those frameworks and existing components most likely have a lot of hard-won experience embedded in them, and I would be uncomfortable hiring someone who did not appear to understand or appreciate that.

See also: Chesterton's fence[2].

[1]: https://www.joelonsoftware.com/2000/05/26/20000526/

[2]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence


That is why I would ask those questions... to excuse myself out of your organization. I have found it painful to be at organizations that repeatedly and intentionally make really bad decisions so that their developers deliberately don't have to solve problems (invented here syndrome). If that is what I were looking for I wouldn't have developed the skills that I have.

https://en.wikipedia.org/wiki/Invented_here


Perhaps it is better to leave the menial tasks to the machines. They're far better at it - "on file save, reformat code to this template."


Maybe this is unique to sports? Do other fields exist where a BB-esque expert is totally incapable of practicing? The greatest movie critics probably can't direct or write for crap, but they're not commenting on how the movie was produced, just its output


Could Albert Einstein have actually built a particle accelerator or an atomic bomb?


From what I have seen, most of the academic research principals in computer science are not actually practitioners anymore. They are more like coaches. The actual code is written by graduate students, and most of it isn't even very good because the code isn't the point in most cases.


Not unique. One could convincingly argue that this effect exists in many fields. Art (ie art history major vs artist). Music (ie violin player vs film score composer). Restaurant reviewing (ie critic vs chef). I'm sure there are many more.


You obviously haven't seen Mark Kermode's series on the BBC where he covers exactly this.


Why talk to someone so condescendingly? I mean, what percentage of people do you expect to HAVE seen this bit of obscure media you seem to hold as seminal?


If the parent makes the unjustified statement that all cinema reviewers know nothing about how film is made, why not.

And I was not being condescending BTW.


Parent made no such statement. Read it again:

> The greatest movie critics probably can't direct or write for crap

They may know a lot about how a film is made. It does not follow that they could sit in the director's chair, or write a decent screenplay. (And yet, they could still be great at critiquing films!)

This is a valid point. Just knowing how something is made doesn't necessarily mean you know how to make it. It's the difference between a designer and an implementer. Or architect and builder.


I have been amazed at how many "data science" interview questions can be answered with "look it up in the hash table" or "look it up in the literature".

That last one is a real problem-solving strategy: I can't tell you exactly how to balance a Red-Black tree, but I know what a Red-Black tree is and where to look up the algorithm for balancing it, and I think that's good enough.
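
An invented example of the first pattern, just to show how little is usually being asked for:

    # "Given a stream of user IDs, find the first one that repeats."
    # The whole answer is "look it up in the hash table."
    def first_repeat(stream):
        seen = set()                 # the hash table
        for user_id in stream:
            if user_id in seen:      # O(1) average-case lookup
                return user_id
            seen.add(user_id)
        return None

    print(first_repeat([7, 3, 9, 3, 5]))   # -> 3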

One of the master skills for interviewing is "don't choke." Often good people will make mistakes under the pressure of interviewing and will drop out.

When it comes to a practical session there is the same issue that some good people will choke. Just the sample of who chokes in which environment when is different.

Issues like this turn up in all cases where people try to measure merit. For instance, it is known that some low-SES (socioeconomic status) people will choke on standardized tests like the SAT or IQ tests. It is also known that some high-SES people are as dumb as posts and will have that revealed by standardized tests which are harder to bribe or bullshit.

The hostility towards testing is explained by the combination of those two groups: primarily the Emperor doesn't want you to see they have no clothes, but people have sympathy for poor people.


> The reason why some companies cannot figure this out is because they don’t value the problems at hand. They need bodies to put fingers on keyboards

Exactly this.

Too many companies are availability focused instead of ability focused.

The reason? When your processes and engineering are weak, you need people available 24/7 to put out fires.

When I interview for a position, I'm interviewing them as much as they are me.

One of the biggest asymmetries in the whole hiring process, is that of course every company will tell you they have the best development processes, frameworks and code to work on. They may even believe it because they don't know better.

Then you take the job and are stuck fighting fires on a big ball of mud codebase that takes over your life.

If you want to stand out as a candidate, try and probe and really find out how good their 'culture' is. Interview them back.

One of the best questions for getting to the real answer is: "What are your expectations for availability?" If they expect availability from you after hours, then their stack is likely unstable because they need availability to keep it running.

If you are lucky enough when you are senior and good at what you do, you don't have to put up with that.

I often tell them about a third of the way through the interview: "I'm not an availability guy, I'm an ability guy." They are often surprised by the statement. And the discussion that follows it usually tells me whether I want to work for them or not.


> Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

What? It requires a special craft to design simpler components. Most people don't even see it. This is where your senior skills will shine, when you make it work like a charm compared to the previous state.

If you don't see the friction or can't reduce it then please take the first exit out.


I think you are conflating simple and easy. They aren't the same. Simple suggests less code, fewer pieces, and a shorter path between code and solution. Easier suggests least effort and less to look at. Simple isn't easy.

The primary difference between simple and easy are decisions. A good senior will spend more energy on considerations for appropriate decisions than the actual work.


One of the best engineering talks is about this notion that simple!=easy : https://www.infoq.com/presentations/Simple-Made-Easy

This is surprisingly often not understood, even by people I showed the video to. And I am not sure why. But I do think it's necessary in our field to start understanding this much more deeply, especially for senior engineers.


Good definition of 'Simple'. Simple and easy may not be the same but there are situations where they are similar.

E.g. - for production deployment, I could either turn the entire office upside down. Fires across the departments. Broken applications. Rollbacks. Or, I can automate all of our QA test cases and have a one-click deployment. No muss or fuss. I'm sure after having this experience, someone will definitely say - well, this was easy.

Anyway, good discussion.


> I think you are conflating simple and easy.

I like to state it as complex vs. difficult.

Finding a bug in ten thousand lines of crappy VB code is 'difficult', but not fun.

Writing a space/time efficient implementation of level set topology optimization is complex.

Senior developers usually love complex problems, but hate merely difficult problems.


> Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

What? I feel like you have a very narrow-minded idea of what sr engineers want.


Well, unambitious seniors are a good bet on who not to hire. They’ll only work what you pay them for, those lazy bastards!


I think I've read in quite a few places that most great programmers are essentially lazy.


"We will encourage you to develop the three great virtues of a programmer: laziness, impatience, and hubris." -- LarryWall

Laziness: The quality that makes you go to great effort to reduce overall energy expenditure. It makes you write labor-saving programs that other people will find useful and document what you wrote so you don't have to answer so many questions about it.

Impatience: The anger you feel when the computer is being lazy. This makes you write programs that don't just react to your needs, but actually anticipate them. Or at least pretend to.

Hubris: The quality that makes you write (and maintain) programs that other people won't want to say bad things about.

(That quote's originally from Programming Perl 1st edition in 1991; the explanations I think didn't show up until edition 2 in '96 or so...)


A special kind of lazy where they will spend inordinate amounts of effort not to do the same routine task again.


I differentiate between simple and easy. They aren't the same.


He does and he is correct.


I agree with your comments, but let me share my 2c on "trendy framework bs".

Most of the interviews I've done with candidates ended up in discussing technology trends and googling around for cool open source projects, libraries and so on. This, to me, is a good indicator - as long as you bring them up when discussing relevant problems, it means you thought about a problem and researched prior art to avoid re-inventing the wheel.

Also we tend to end up discussing the pros and cons of any given technology and so on. This, in the end, is a key indicator you're talking to a passionate developer. And passion usually makes someone good at programming.

When it comes to soft skills, having such a discussion you can usually also figure out someone's behaviour in most work scenarios (to me, having a good "discussion" mode means you're likely to be a fit for team work).


Reinventing the wheel is a cliche I have grown to detest. I have found that cliche is abused 9 times out of 10 to avoid the deep dive into a problem. On the other hand, approaching the problem again allows the developer to consider the precision of current solutions or consider a simpler solution. Good seniors aren’t afraid of making decisions.


I don't know how much experience you have as a professional developer but I can barely remember the problems I solved last week, let alone any interesting ones I've solved in the past 10 years.


I have maybe half a dozen really interesting stories, and maybe a dozen or two interesting but only to the right crowd stories - from a career spanning a bit over 24 years. And a bunch of war stories about why we shouldn't do things that way, because of the problems we discovered trying that in 1998 or 2003... (And occasionally the "we should try that again because it didn't work back when our colocated production servers had 486's and 320Mb disks - but it'll probably work just fine as a mobile app now"...)


The question to ask is "why" they did something a particular way. If they start rattling off something that sounds like the marketing page of shiny new tech, then they are probably doing resume-driven development rather than actually solving the problem in the best way that they can. But then some people seem to value that - I don't.


You recognize a good senior by their not being interested in new technologies and things that make the job easier? That is a bit dubious.


To be honest, I don't think your questions were good enough to filter out the smooth talkers. I was in a senior position at a successful agency and was hired before they changed their interview style to match the eye-rolling "technical interview" that is now the gold standard. I repeatedly argued they were getting too many false negatives and only a certain kind of developer with this style, but they believed "if Google and Apple do it, it must be good".

Then I was put in a position to hire backend devs and, even though I couldn't entirely change the format of the interview, I came up with questions I believe could filter out the BS artists/not-yet-good-enough from the talent. And it worked.

Example questions that are truly hard to BS:

1. I want to build a new service that crawls the internet for used bikes and presents them for sale. Roughly sketch out the architecture you would use and how all the parts fit together.

- Why this question is good: it is impossible to successfully answer this question if you don't have experience building systems. Even if you try to BS it, your answer will come off shallow and break under any sort of probing. It also reveals how the dev's mind works when approaching problems.

2. Name your favorite tech stack. What is your favorite thing about it and what is your least favorite thing about it?

- Why this question is good: any dev knows that every tech decision comes with good things and terrible things. I love Python, but GIL, circular imports, shitty deployment/package management, 2.x vs 3.x nonsense all suck. If you haven't been in the trenches, you can't answer this _specifically_... you can only answer it _broadly_. And it's very apparent right away to interviewer.

I had about 4 - 5 of these questions. None of them required a single line of code to be written.


I gave an example of one of our erstwhile technical interview questions below. Batting around interview questions is a hacker sport and I've met dozens of developers who delighted in relaying and evangelizing their preferred questions. I'll admit, candidly, that "what's your favorite tech stack and what don't you like about it" --- a question you can literally just look up blog posts on, and a question I feel I could answer credibly for several languages/frameworks I don't even know --- is a uniquely bad one. But even if I felt it was a good one, I wouldn't trust it. Interview questions don't work.


"a question you can literally just look up blog posts on, and a question I feel I could answer credibly for several languages/frameworks I don't even know"

Really? You mean you could memorize a blog post for random languages, go into an interview and answer a question and not completely fall apart from follow up question 1, 2, n?...

You make it sound like I'm asking this question, getting a monologue response and just moving on. These questions facilitate a dialogue between interviewer and interviewee, which quickly reveals your deeper understanding of a subject.

Now, I think your above approach (lookup, memorize, pretend) works great for the kind of technical interview questions that I've been refuting this whole time.

Finally, I don't think my questions are perfect, nor do I think the interview process in general is that great. I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps, then executing. This is regrettably hard to do for most people and companies.


> I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps

How do you get people to do a trial run for a week or two?

Most people start looking for the next job when they are already doing their current one. So how can we ask them to do a trial run for a week or two?

Also, what about the remuneration (what kind should be given, if at all, for this period), and does this work for freshers and absolute newbies too?


> Really? You mean you could memorize a blog post for random languages, go into an interview and answer a question and not completely fall apart from follow up question 1, 2, n?

To me, this is the crux of why I agree with you, rather than the questions themselves. Anyone can rehearse an answer to any question, but a dialogue needs to be created in order to actually find out anything about a prospective hire's competencies.

To be useful, an interview needs to be a conversation, not a monologue.


>I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps, then executing.

Except even if you could get someone to do that, it doesn't filter for the right candidates. Being able to quickly get up to speed in a foreign domain is not the same skill as being able to build great systems/code/etc. in a domain you're familiar with. The day-to-day of a programmer after the first X months is the latter and not the former.


Aren't you two hiring for completely different positions? One is looking for a web system developer and the other is looking for the kind of coder who will write exploits and smaller tools. In one position you expect the senior to have a general idea about a wide range of relevant technologies, because you expect the senior to choose between them occasionally (plus the hiring manager knows that stuff too). The second position does not require it at all.

Plenty of people can do both or can change from one to another. But it still seems to me that the interview focus should be different, as both positions require a somewhat different focus.


Does one require actually writing code and the other just require talking intelligently about code? If so: I concede.


Writing code is a trivial part of what I would expect from a senior - making correct decisions about architecture and talking intelligently about code are the less trivial parts that make the difference between senior and junior. Ultimately, when a senior can't write code, that can be found out relatively quickly. When he fails at decisions it takes more time and causes more damage. When he can't talk about code, then he is suitable for some positions but not for many others (so it all depends on the position at hand).

I don't mind the interview also having a part where code is written. Be it a dummy feature or breadth/depth-first search or fizzbuzz or any other simple, reasonably sized assignment. But in the case of a senior, who is partly a decision maker with not much direct supervision in the long term, you need those other capabilities too.


Ask them about the challenges which they've faced when using x, y framework.

Ask them why they prefer one framework to another.

Ask them how they view testing.

Walk through with them on a simple whiteboard problem and ask them where they would write test cases.

Watch for the amount of detail they give you. That will give you an indication of what kind of developer they are and how deeply they go into problems.


It's even simpler: "tell me about a challenging bug you've vanquished, what it was and how you solved it." People in the trenches love to tell war stories. I do.


For me it tells me a lot about how a person goes about solving the problem, how they're going to avoid it later (if they are), context about the conditions that the person worked in, where their interest lies, and what motivates them.

If their answer involves "well I searched around stackoverflow a lot and asked there" that's a no go for me.


Does it have to crawl the whole internet or can it just scrape the top 5 or so most important sites?


Asking clarifying questions is a good interviewee practice. Well done.


Their response reminds me of the Monty Python sketch about the speed of an unladen swallow.


Please define what is "the internet" first ^^


Define if you mean "an internet" or "the Internet" before that :-)


I really wish more interviews asked these questions. They're not only good questions for the reasons you mention, but for the interviewee (at least for me) they tend to be pretty low-stress to answer. Those sort of questions are, after all, something that I might be asked to do or think about during my everyday duties at a new company.

I always sigh in relief when a company I'm interviewing with asks me something like that vs something like "implement x algorithm to solve y niche computer science problem you probably learned in college at some point".


Just to add some thoughts here: I’ve interviewed people who were Google L7+ (IC) a couple of times who weren’t very good engineers, at least on the work sample stuff.

I’ve found that the highest correlation to performance in senior engineers is raw algorithmic skill and willingness to say “I don’t know” when you don’t know.

This is not true of hiring devops or SREs. For those positions, you want the gopher archetype. This is the person who loves hunting down weird behavior and doesn’t give up when debugging. Let’s be real, debugging is you versus the machine and you’re eventually going to win if you can just be honest and patient.

Hiring senior engineers is really hard, I wish everyone luck.


If you're asking questions related to "raw algorithmic skill" you're filtering for people who either: 1) have had a computer science education and happen to remember the algorithm at hand (this is also a function of recency, so senior engineers are less likely to remember any given algorithm), or 2) study algorithms so they can do well at job interviews.

Neither one is something you want to be selecting for. Some of the best engineers I've worked with haven't had a proper CS education. I've known extremely strong engineers with Neuroscience, Mathematics, Physics and Public Policy degrees. I've got a business degree.

Unless you're working in certain extremely hard (and extremely rare) areas you do _not_ need to filter for algorithmic skill. Most ML doesn't count. Neither does Data Science. In 99% of engineering jobs it's more important to be diligent, rigorous, and organized. (Of course, filtering for those is another issue altogether)


Spot on. The reasons are exactly right too; seniority and "ability to recall obscure minutia from college years" are inversely correlated, for obvious reasons.

Companies need to understand that not only are they mis-selecting, but they're broadcasting that they're doing so to all the candidates that go through that process.

Approaching candidates with textbook-style algo or data structure questions merely signals that they're going to be working with an educated but overall somewhat junior lot. That's not necessarily always a deal killer, but it's probably not the image that these interviews are hoping to project.

For well-qualified candidates not applying at an industry headliner like AppAmaGooBookSoft, the interview process quickly inverts itself, and it becomes more about the company selling the candidate on their offer than the candidate selling the company on their skillset. Tread carefully.


The only people that say that a CS education doesn't make you a better programmer are people without a CS education.


CS is not programming, just like 99% of musicians don't have a music theory degree.


It's probably a great analogy because the best musicians all know basic music theory, whether they learned it in school or on the bandstand. As for the advanced theory that they teach in graduate programs, it isn't even applicable to most genres of music.


> the best musicians all know basic music theory

Do you have any evidence for such a bold claim or is this just speculation?


They might know it instinctually, but the Funk Brothers / James Jamerson (the bass player on a lot of Motown) didn't go to uni to study music.


Interesting analogy.

I bet there's scope to twist it beyond all sensible bounds, and compare the ability of the 99% to the 1%.

I suspect there are top-level classical, jazz, and session musicians who're the industry equivalent of 10x programmers. (And all the other stereotypes probably exist too; I bet there are occasional untrained but gifted musicians who can produce 10x output but who're amazingly difficult to collaborate with compared to degree-level, music-theory-trained musicians... And I bet there are "10 year" musicians with one year's experience repeated ten times over.)

The other interesting point there is that probably 99% (or more; five, perhaps six nines) of "programming" doesn't actually require that much hard-core CS theory. You can get paid well playing covers in bars with a good ear and without being able to read a single note from a chart, just by listening to the originals and copying them over and over in your bedroom. Same as you can make a decent living building basic CRUD websites/apps without having written your own compiler that can compile itself or defended a PhD that advances humanity's state-of-the-art understanding of something fundamental.


The only problem is that at that level of musicianship you lose the fun and can end up with some very sterile music that's only of interest to other people who have degrees in music theory.

Btw years ago I did work with a top session guitarist (top 10 hits) who after an accident taught himself to program from his hospital bed.


Good thing parent didn't say that then.


> If you're asking questions related to "raw algorithmic skill" you're filtering for people who either: 1) Have had a computer science education and happen to remember the algorithm at hand.

Probably true. But perhaps that could be accounted for in the assessment process. After all, graduates with Neuroscience, Maths, and Physics degrees have got to be some of the smartest people around.


Agree. A lot of coding is simply banging your head against the wall, searching SO over and over, changing things around, until it does what you want.

Raw algo quizzing skill isn't necessarily the same thing, though you'd think it was somewhat related, because when you're learning to code up "find longest continuous run" you also need to change things around for a bit.

The difference is that in real life there's never an end. The algo quiz eventually leaves you at some optimum, because it's quite a small thing.
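
(For reference: one common reading of "find longest continuous run" is the length of the longest stretch of equal consecutive elements. A minimal Java sketch under that assumption:)

    class LongestRun {
        // Length of the longest run of equal consecutive elements.
        // (Assumes that's what the quiz asks; it could just as well mean
        // the longest increasing run.)
        static int longestRun(int[] a) {
            int best = 0, current = 0;
            for (int i = 0; i < a.length; i++) {
                current = (i > 0 && a[i] == a[i - 1]) ? current + 1 : 1;
                if (current > best) best = current;
            }
            return best;
        }

        public static void main(String[] args) {
            System.out.println(longestRun(new int[]{1, 1, 2, 2, 2, 3})); // prints 3
        }
    }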


Coding is the easiest part. Understanding the actual problem and solving it is the hard part.

> A lot of coding is simply banging your head against the wall, search SO over and over, changing things around, until it does what you want.

It doesn't look like programming to me. Yes, sometimes we miss something, so our code doesn't do exactly what we want it to do, but when we realize it we just fix the code. This view of coding resembles an improved version of monkeys typing out Shakespeare.


> A lot of coding is simply banging your head against the wall, search SO over and over, changing things around, until it does what you want.

I don't find myself in these situations nearly as often as I did back when I was a junior engineer. But damn, I'm sure I looked busier (and more stressed out) back then.


I was mostly through that phase of my career before any of those things were available. Toward the end of it, stack overflow had just launched. I mostly relied on printed books for help with languages and frameworks I was using.


The ability to prepare for an interview is likely correlated with the ability to do the required work, so that point is moot.


It's not correlated at all. I can ask you questions about implementing a b-tree and then give you a job to fix CSS. Which is the case in most jobs and job interviews.


If you were able to prepare for the b-tree stuff, the CSS stuff will be a joke to learn. That is my point.


Algorithmic ability has no correlation to the ability to write maintainable code, though. Most time at work is not spent demonstrating ability to regurgitate algorithms.


> Algorithmic ability has no correlation to the ability to write maintainable code, though.

This is not true in my experience. I usually see a strong correlation between algorithmic ability and writing maintainable code. At various organizations I have worked for, I have seen that the ones with strong algorithm skills also happen to be critical thinkers who put a lot of emphasis on simple, elegant, and robust design and code.

So I am very surprised to know that this correlation I observe may not be true in general. How did you come to this conclusion?


I don't have data but my impression is that there is an inverse correlation. My guess as to why is that people with ability to manage a lot of algorithmic complexity don't seem to suffer when code is complex so they see no need to simplify things.


I'm afraid I can also only offer anecdata. Mine is based on hiring and then working with dozens of SW engs since 2008 (my first tech lead role), plus our student Incubator and open source projects (dozens of more junior devs).

The correlation between emphasis on simple, elegant design and code and algorithmic chops is indeed uncanny.

And I'd add "clarity of articulation" to that -- being able to express your thoughts and the problem/solution structure clearly and succinctly is a great indicator as well. Huge overlap with both code maintainability and algo quality.


I haven't seen this ever. Algorithmic skills have never correlated with productivity or maintainable code at any of the 7 companies I've worked at.


No causation, I'd believe any day. But correlation? I'd want to see a statistically rigorous test done before I'd believe that. Correlations are everywhere, and pretty much all good traits are correlated with each other. Hell, even vocabulary size and reflex speed are correlated. And this is probably why even terrible selection methods usually kind of work.


Unfortunately, often (and maybe more so because I live in Europe) it's also the other way round:

> raw algorithmic skill

It's been ages that I've been asked anything remotely algorithmic. My interviews are mostly about frameworks, how you fit in a team and whether you know / can be "agile".

Not even a Fizzbuzz, much less so quicksort or more special algorithms.

> and willingness to say “I don’t know”

That never got me anything in any interview/company. To be fair, I found a few smart and cool friends because of this, but they themselves don't look as if they've found a good job either.

Being hired (valued?) as a senior engineer is really hard.


It really depends on your domain. If you're into low level hacking and distributed systems, there is a lot more algorithmic work. There is demand for software that's cheap to scale and/or low latency. Some fields are bottlenecked by hardware (machine learning, realtime rendering, etc.) and so benefit from better software. Some production systems still need a large amount of optimization to satisfy economic and product constraints.

I don't think the number of jobs requiring fairly deep systems or algorithmic knowledge has gone down, but the ratio has.


I’ve had trouble finding those systems or algorithmic jobs too, like the grandparent, where some kind of engineering quality matters, be it performance or correctness or something. Everyone wants to hire a “full-stack engineer” to write application code that a junior dev could write, but they want someone senior anyway.


I once had a FizzBuzz question in an interview. The interviewer started; I interrupted him: "We are not talking about FizzBuzz, are we?" He apologized afterwards, saying that there had actually been an applicant some time ago who wrote 100 printfs.


Why doesn't the gopher archetype also apply to development roles? Persistence pays off if you measure your own results. Knowing how to divide a problem is really important, even if you're great at algorithms.


While persistence is good, it has to be applied correctly. It can turn into the opposite problem, stubbornness: spending too much time on a small thing that is not actually that relevant, because the person really wants to get it done.

More important than persistence, IMHO, is knowing when to be persistent and when not, and the two qualities the OP mentions seem quite related to that: "raw algorithmic skill" (to know whether something is optimizable or not) and willingness to say "I don't know" when you don't know (seek help, get the right person for the job, etc.).

Edit: I know because I was like that in the beginning; it was okay to learn e.g. micro optimizations when learning programming for fun at university, but it'd have been a big issue if I had not been able to correct it.


I can't overemphasize how much persistence really paid off in past jobs I've had. Coworkers who had persistence were such a delight to work with because they didn't give up the first time they got stuck. They did the nitty-gritty work of tailing/grepping logs, using a debugger, endless Googling, print statements, and anything else to find the root cause of a bug... or even just to understand a legacy codebase.


It's probably such a delight in part because bright kids are frequently not persistent. They are used to things coming easily. If it doesn't come easily, many are quick to throw in the towel.

Someone who is both bright and persistent can move mountains. But it often has a big social downside. People don't like change. Being bright, persistent and also socially savvy enough to sidestep drama is practically a unicorn.


I've done that sort of work. Eventually you realize that nobody notices the guy who tracked down and killed that vexatious heisenbug in the legacy code base. You need to be working on the new, shiny, high-visibility projects or your career is going to stagnate.


Or change the culture.

Friday demo day (to sales and cust success and everyone else): "hey everyone, know that thing customer X keeps complaining about that we've never been able to solve? It's been super tricky. Just wanted to announce that James here figured it out and it's fixed forever. James, you're a hero."


I guess what you need strongly depends on what you are hiring for.

Being good only at algorithms is great if you work in a very isolated area.


Saying “I don’t know” is also very important for devops people, but for slightly different reasons; we’re likely to have a little bit of knowledge about everything, and it’s important for us to be explicit about where our expertise dries up.

I say a lot of things in the form “I’m a little out of my depth in this subject, but my best understanding is that the behavior should be such-and-such; does that sound right to you?”


i dunno. the machine has beaten me more times than i care to admit. some problems are hard.

i’m L7.


Honestly speaking, since it's an anonymous forum, what's the difference between L5 and L7? If I were asked the same question about the difference between L3 and L5, I'd say it's just experience with a few interesting, well-known projects that gets paid so much better, while the cognitive ability is the same.


Could you give a short explanation of what this L3/L5/L7 thing is?

I'm assuming you're not talking about spinal cord problems, which is the main search result.


I think [1] gives an explanation of it.

[1] https://levelsfyi.com/google-levels-salary/amp/

EDIT: found a better source


Hey I'm from www.Levels.fyi - this is exactly what our site was built for. The link above is a copycat site and you can see the latest leveling info / compare with other companies on ours.


Bound to happen when you call yourself techslave ;)


There’s a reason Google doesn’t hire into L7: its algorithm-heavy hiring process is garbage and produces lots of false positives and negatives when you are looking for real skills.

If saying “I don’t know” and being good at algorithms were enough, you could hire straight into L7 easily, or promote to it within a year. Neither of these things happens.


It always amazes me that literally what you learn in HR 101 in an undergraduate management curriculum is controversial. The r^2 of a ton of different methods for predicting job performance is something large companies are highly incentivized to have studied by academics (and they do).

Spoiler alert: work-sample tests are the only practical and generally tolerated one to use (for software engineers) that is actually correlated with job performance.

@tptacek: how do you feel about people who have large amounts of open source work that is easily reviewed? This obviously doesn’t add new data to your company’s test but is definitely some kind of work sample, albeit not necessarily in a similar environment to the one being hired for.


But it’s not generally tolerated. You end up optimizing for sub-prime candidates, because those are the only ones desperate enough to take 4-6 hours out of their free time for a company that hasn’t even bothered interviewing them yet.

If they want to turn the onsite into one big work sample, by all means, that sounds very effective (and something I’ve seen work well). But in my experience, you’re going to deter qualified candidates by forcing them to do take-home assignments.


Again: this is true of "work sample" tests in their mainstream implementation in our industry (spend 6 hours jumping through a hoop for the privilege of running a standard, nondeterministic interview gauntlet), and those hiring processes are a scourge.

But there's a right way to do it: give work sample challenges and then, at least for the most part, end the technical qualification part of your process there. You spend 4-6 hours at home instead of spending 4-6 hours in front of a whiteboard doing dumb coding challenges.

If I were looking for a job right now, I'd probably refuse to interview anywhere that gave me take-home problems but couldn't promise that a strong result on those problems would take me all the way through technical qualification. But a company that could offer me take-home problems and conclusively make the technical part of the decision on whether to hire me based on those problems would be a strong prospect.


We subscribe to the work sample test as your best option for technical validation, but we also limit ourselves to a small window of time, on site. We pick something we recently worked on, distill it down to something we can knock out in 15 minutes, and give the candidate about 45 minutes to work through the distilled problem; then we spend about 15 minutes exploring their solution and how they would make it production ready (as a conversation).

We then have another 45 minutes of designing a solution to something more complex we recently had to work on, again distilled down. It involves a whiteboard and boxes and arrows. The last 15 minutes is time for them to ask whatever they want of us.

We have two interviewers on this technical portion to level out false reads by either interviewer (I thought this part was a poor answer, but the other interviewer has a different view of it).

I really like it so far. The part I'm battling with is whether the current coding part selects against Java developers. Part of the code we want right now requires a unit test with dependency injection to match an interface. So many of our Java candidates simply can't set up a running unit test. They are used to layers of framework already set up in the IDE and just clicking on stuff. They have full access to Google. Maybe it is good we are filtering out these candidates, but I'm not sure. Still thinking on it.


> The part I'm battling with is whether the current coding part selects against Java developers.

Yes, it does. And let me put it this way: I've used C# and Java at most of my jobs, those theoretically would be my comfort languages, yes?

I do not use those languages in interviews. I often just use C++ (!), or Rust (free unit tests!) if the company's tooling allows for it, or worst case I'd learn some Python basics. C#/Java are very awkward and boilerplatey in a time frame as small as 45 minutes.


Interesting. To clarify, are you saying that with these boilerplatey languages, it's not fair to expect a unit test in a 45-minute time block with access to Google? I'm not a Java guy. In all the languages I've used, this would not be an issue. Again, I'm wrestling with it because I feel it should be easy, but something like 80% of our candidates who choose Java struggle with it.


It sounds like this is a pick-your-own-language type test. I'd suggest scoping down to one, or at most, two, simple allowable options that your team is already pretty comfortable handling.

I've been on the reviewer side of a handful of "choose your own language"-style take-homes recently and found that it's really not good for the candidate if they actually end up using something that wouldn't have been the interview committee's first or second choice. There have been cases where choosing a less-trendy-but-still-totally-viable toolkit has effectively disqualified a candidate, with several committee members not even considering it necessary to look at the code. This is very unfair and lame but an unfortunate reality. I asked that the test be changed to constrain the options at least to a list that wouldn't be immediately disqualifying.

You could advise the candidate ahead of the interview something along the lines of "you'll be asked to write a code sample in either Ruby or Python -- you'll have full access to Google, but you may want to brush up on the basics of these languages if you haven't used them recently".

This does two things: first, it prevents the issue you have, where you're essentially not sure if you're correctly scoring the samples produced. Secondly, it really requires you to constrain the problem to things that someone who has barely used language X or Y can do within 45 minutes.


The theory with pick-your-own-language is that they should be able to feel fully comfortable (interviews are already stressful enough). If they pick Rust, Scala, or a Lisp dialect (or anything the interviewers are unfamiliar with), it can even be a better interview, because we get more insight into how the candidate communicates and their ability to walk someone else through their solution. A potential other bonus is that fewer biases leak through from an interviewer along the lines of "that is a strange way to do that in language X."


> The theory with pick your own language is they should be able to feel fully comfortable (interviews are already stressful enough).

Ah, but there's the rub. Candidates are trying to please the interview panel. If you don't provide guidance, the odds that they'll just use whatever they think the interviewers most prefer are just as good, if not better, than the odds that they'll actually use whatever makes them most comfortable.

You said yourself that since some candidates pick a language that you don't know well, you can't really tell if the failure of a large number of those candidates is reflective of a bad test or just a mismatched candidate pool. IMO, if you're going to stick to the "pick any language" thing, you should at least find out and ensure that any language the candidate picks will have a fair shot.

> it can even be a better interview because we get more insight on how the candidate communicates and their ability to walk someone else through their solution.

You can still get the candidate to communicate and explain his choices if you give an option: "either Ruby or Python" or "either JavaScript or Visual Basic", etc. The problem with having this happen in a language that the interviewers don't know reasonably well is that they are much more vulnerable to the smooth talker who can present incorrect information confidently, and they won't have enough background/anchoring in the subject matter to know the difference.

> A potential other bonus is less biases leak through from an interviewer on "that is a strange way to do that in language X."

I would say that if you're worried that interviewers will load in biases toward their preferred shortcuts etc in a specific language, that you should be equally worried that some good candidates are being excluded for choosing the "wrong" language in an any-language-goes test.

Above, you mentioned that there'd be a positive response if a candidate used "Rust, Scala, or a lisp dialect" -- these are all relatively trendy. What if the candidate used nim, Pony, or some other language that hasn't pulled in to the hype superstation yet? What if the candidate used a language that's not-so-trendy anymore, like Visual Basic, Cobol, or bash? What if the candidate used a programming language of their own design, and brought a copy of the compiler with them on a flash drive?

I'm asking because I've seen this in practice. Candidates for a devops position who chose to use bash to implement the very simple take-home task they were given were laughed off by several other members of the interview committee, despite being potentially high-value senior people -- they were at least senior enough that they're more comfortable performing sysadmin-style tasks in a shell, rather than using a massive CM framework or an awkward amalgamation of Python scripts running os.spawn.

It feels like this type of thing happens a lot, in the same sense that very often, "unlimited PTO" just means "guess whatever amount of PTO is acceptable around here and hope you get it right".


I guess it depends how far back you're starting from. I suspect most Java developers don't spend that much time creating projects from scratch (I do, but the work environments I'm used to suggest I'm an outlier).

I tend to give candidates a simple project already to go, with junit and hamcrest, possibly mockito already available, and ask them to go from there using a provided IDE (which I attempt to get set up as near as they prefer to work as I can). This generally works out fine. I certainly don't feel the boiler-platiness of the language gets in the way, mostly because the IDE is generally capable of doing most of the lifting with that respect anyway, but also because over the timescale of an interview question, we're generally only talking about a couple of classes at most.


It starts from scratch. Familiarity with one's tooling is important. Setting up a project seems like it should be part of the basics. Would it not be unfair to others who choose a different language if Java gets hand holding in terms of initial classes?

For 20% of Java candidates, they do it just fine. Heck, a few eschew the IDE and are fine working completely from the terminal (these tend to be particularly solid at coding). Still wrestling with the idea.

The folks that already work with us (who wrote Java in a former role) see no problem with setting up a project nor do the folks that we hired recently (seeing as they likely passed that technical interview). But that all could be bias.

Maybe you are right. Maybe the next candidate or five in Java will get a base project and we can see how it goes.


Do you give them "their" tooling? I can set up a project of the type you describe in about five seconds, because I have a template for it in my IDE. The best defaults for this that I've seen are provided by IntelliJ (do you provide this in interviews? It seems legally challenging to do) and would probably take me 5-10 minutes to navigate.

I think Java depends much more heavily on powerful tooling to do the heavy lifting, and my experience of using that tooling when I haven't had a chance to configure it in advance has been pretty miserable.


If they're coming in, I'll hand them a laptop with the project set up and ready to go, with IntelliJ running. If they're normally an Eclipse user I'll swap to Eclipse shortcuts and help them manage the IDE as we go.

If it's remote, I'll ask them to share their screen with me with a project set up and ready to go, having emailed them a copy of the interface we're going to implement about 30 minutes before the interview.


Fair enough, seems like you're giving them a fair shake.


It depends on context as well. Where I currently work, they wouldn't have a chance. The corporate firewall will prevent them talking to Nexus, for example. It just wouldn't be fair to expect them to navigate that sort of thing in an interview.

In general, if I'm asking them to code on my (or the company's) hardware, I'd start from an existing project, if only because I wouldn't expect them to be familiar with the installed tools (oh, you use maven? I'm a gradle user ... etc.) On their own laptop, I'd expect them to be more comfortable.

I'm doing a remote interview on Wednesday. That'll be on the candidate's machine (because screen-sharing is easy; getting them inside the corporate network, not happening), and they've been told they'll need an IDE ready to go. I'll expect a project set up and ready to go before the call even starts.


What happens if they can't work from home? They're expected to take time off, AND prepare beforehand?

What round is this?


On your own hardware, I'd expect you to be able to have a blank project up in minutes, so yes, I expect somebody who's not travelling to our site to be able to take 5 minutes out of their busy day to create a blank project.


> Familiarity with one's tooling is important. Setting up a project seems like it should be part of the basics. Would it not be unfair to others who choose a different language if Java gets hand holding in terms of initial classes?

Also, how much of their day-to-day work is going to be setting up new projects? I sometimes feel a better test is to be thrown into an existing code base and asked to make a change. It's far more indicative of the sort of work somebody is likely to actually be doing.


>Part of the code we want right now requires a unit test with dependency injection to match an interface.

What exactly do you mean by this?


We ask that the code they write have a unit test. The nature of that unit test is that it has a dependency. They can mock however they like, but passing in an object that represents the dependency (dependency injection) is the easiest and most straightforward way to do that. That object should have a method on it (a known method, with a known signature, also known in some circles as matching an interface).
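
To make that concrete, here is a minimal sketch in Java of the shape we're after (the names are invented purely for illustration, and JUnit 5 is assumed; candidates are free to use a mocking library instead):

    import org.junit.jupiter.api.Test;
    import static org.junit.jupiter.api.Assertions.assertEquals;

    // The dependency is described by an interface: a known method with a known signature.
    interface RateSource {
        double rateFor(String currency);
    }

    // The code under test receives its dependency through the constructor (dependency injection).
    class PriceConverter {
        private final RateSource rates;

        PriceConverter(RateSource rates) {
            this.rates = rates;
        }

        double toUsd(double amount, String currency) {
            return amount * rates.rateFor(currency);
        }
    }

    class PriceConverterTest {
        @Test
        void convertsUsingInjectedRate() {
            // Hand-rolled fake standing in for the real dependency; no mocking library required.
            RateSource fake = currency -> 2.0;
            PriceConverter converter = new PriceConverter(fake);
            assertEquals(10.0, converter.toUsd(5.0, "EUR"));
        }
    }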


Have you ever seen or suspected fraud in a take-home? I can't imagine it doesn't happen, given that cheating is so prevalent in colleges.


If I was doing this at Google, I would spend a lot of time thinking about test fraud. At a 40-50 person company? Not so much. We do simple follow-up things that raise the amount of effort you'd have to put in to fraudulently submitting work sample tests, and we know pretty exactly how we'd randomize our work sample tests to make it hard to cheat (at least, hard to cheat without doing something we'd be interested in anyways) --- but it's just not worth it right now.

Work sample test fraud is one of those things that sounds like a huge deal on message boards, but when you game through what would be involved in doing it in real life, it makes very little sense.


To my mind, the simplest and most straightforward way to combat any fraud is also beneficial because it gives you even more information about the candidate:

Talk to them about the code they wrote.

Have a conversation, as if they were already your co-worker, with the exercise as the subject. Go through it, ask them -- non-adversarially -- why they did X, what they thought about requirement 2, how they could get better test coverage for Z. If you see something that seems to be a mistake, talk about it. If you see something awesome, discuss it. I can't imagine someone incompetent being able to bullshit their way through detailed discussion of code they were supposed to have written.

And this provides valuable info about how this person thinks and communicates about the work you want them to do.


That seems like a good idea, but does somewhat detract from the notion that once a candidate does the work-sample he is done with the technical part of the process.


True, it deviates from tptacek's recommendations; about scoring and identical questions for everyone as well.


A nice approach I liked (back when I was interviewed by Thoughtworks years ago) was that one of the interview stages was to take the assignment you'd done and apply some new requirements to the story. It rapidly makes it clear if the candidate actually did the assignment or not.


How can I sign up to take these tests for people? Sounds INCREDIBLY lucrative.


Are you saying most candidates commit fraud? Do you have any data on that, or are you just following common-sense nonsense?


> Are you saying most candidates commit fraud?

Nope.


If the majority doesn’t, why create a process that punishes everyone? "People are lying on their resumes" is a very common argument made by creators of insane interview processes. It feels weird to start a relationship with a company under the assumption that I’m a cheater.


Who said anything about punishing anyone? 1) tptacek advocated for a specific kind of interviewing, 2) I asked him if he had ever seen a problem that I thought might be an issue with that method, 3) he explained that he wasn't worried about it because of X, Y, Z but acknowledged that for other companies it could be a problem, 4) I thought his answer was very reasonable. Also wool_gather and swish_bob added some useful ideas.

I'm not sure why you felt the need to come out guns blazing.


I agree with that. Sounds like a good process. Although, I am too cynical to believe any company that tells me this will be the only technical part. Too often do recruiters lie/misrepresent the recruitment process. Some seem to operate on the sunk cost fallacy, where you just see it through because what's one more round after already doing several?


I understand your cynicism but: who's got two thumbs and can offer an existence proof that there are companies that really do hire this way? :)

My best guess is that there aren't many companies that will give you a definitive statement that the challenge ends the technical part of their process and then lie about how they digest work-sample responses. But I don't know and am prepared to be wrong about that.


I've only had one take home assignment task that really worked like that. I did pretty good at it and the interview.

I was co-owner of a startup at the time. The guy I interviewed with called me up and said, "You did really well, but I think you really don't want to work here; my wife said, 'Wow, that guy was really smart.'" Considering how some things went later, it might have been better if I'd said, "No, I really, really do."


The extremely obvious problem with this is that there's no way of preventing someone from completely blowing a hole in your interview process by simply paying for or hiring another developer to do the take-home problem for them. At that point they've gotten past the technical requirements and now only need the soft skills to execute on it once the rest of the interview process continues.

This is why take-home problems are almost completely irrelevant, except for filtering out good candidates. Eventually those problems optimize for perfection, which helps out those who 'cheat' at the process, while people who put in earnest efforts are rewarded with denials. This is something I've experienced in my own job search, where I would put in an honest effort and get 90% of the problem solved, but get denied because my solution wasn't flawless.

So allow me to call bullshit on your own claims that this is the right way of determining technical qualifications.


> get denied because my solution wasn't flawless.

Not all companies handle it like this. I had a take-home exercise as part of an interview last year. I hit a real snag on a fairly small part, I couldn't figure it out, and I ended up leaving a bug in my submission because I simply ran out of time. Very frustrating.

It was raised at the interview; I admitted that I knew it was there, and that I hadn't been able to figure it out. We discussed possible causes: it actually turned into a pretty interesting, though minor, technical conversation. The interviewer eventually told me that he had figured it out after a little investigation (and I expressed my gratitude for the explanation!)

I ended up getting an enthusiastic offer from them.


I'd couple the take-home with a substantive discussion following submission. Harder for a fraudster to talk about how they came up with or tested their solution and how they'd improve it in a real production version.


> I would put in an honest effort and get 90% of the problem solved, but get denied because my solution wasn't flawless

It's possible that the company stated the take-home work in terms of non-negotiable deliverables, and their baseline for allowing a candidate to move forward is 100% of those, and they prioritize candidates who take initiative and do more work beyond the base requirements.

... not that I would agree with such a process (it biases towards people with more free time, like people with fewer dependents and people who are currently out of a job / underemployed), but it's very possible that this is what you faced.


I’m really mixed about this.

As a new immigrant, my wife had trouble finding work as a designer until she was given a take home work sample test. After that, the company that gave her the test hired her quickly and she has been doing well with them ever since.

I’m currently doing the gauntlet thing myself, but none of the interviews I’ve done have asked any technical questions pertaining to the role that they actually want to hire me for. It’s all generic stuff that frankly I would have done better on 20 years ago.


The best interview I ever had was a 2-hour onsite work sample, followed by 1/2 an hour discussing what I'd come up with. I was offered the job the next day.

Surely most people would prefer this to whiteboard tasks?


The best interview I had was a few hours of friendly conversation about the details of my resume.

Then I was hired under probation, as everyone there was, and the understanding was that I could be easily dismissed if it was clear that I wasn't working out.


There is no way in hell I'd ever sign up for that. To be expected to quit your current job on the hope that the probationary period works out is insane.


Much of the States is at will employment, and probation is standard practice where I am.

It's rare that people fail probation; so long as you apply for work you are actually capable of doing...


Also, in the UK the first 2 years are effectively at-will.


Yes, I've learned that the hard way. Provided they give you payment in lieu of notice, they don't need much of a reason to let you go.

Myself and a co-worker were once let go for "performance reasons" at 10 months - just after (successful) project completion. It was beyond my probationary period, and no issues were raised in the 2 performance reviews.

Their notice period was just 1 month. We were effectively cheap contractors.

My advice now is to treat offers with a low notice period (of them telling you) as a red flag. The norm is 3 months, after probation.


Rather than just making you redundant on statutory terms? Dismissing you on performance grounds risks an industrial tribunal.



The norm where I am, in BC, is two weeks for notice. It increases slightly the longer you're employed.


That's common in Switzerland. But it goes both ways, i.e. you can leave the company any time if you don't like it without breaking the contract.


This would make me damn uncomfortable and stressed out once I started the job, and I'd likely take any other offer over this.


You've had this condition at every job thus far. The real interview is always the work you do.


This is only true in the most extreme case. Many companies have formal procedures around firing people for performance issues. Short of HR violations or literally refusing to do anything, I can't imagine someone being fired before 6 months.


I am confused by the world you live in.

Yes, above a certain size, companies typically have some formal procedures. But typically those are a fig leaf.

In many labour markets, there's a legal 90-day probation or equivalent. You bet your boots some people get dismissed at 80 days. Or the job was contract-to-hire, and the contract doesn't "get renewed".

But on top of that, literally every company I've worked at or any of my friends have worked at (including lotsa startups, two of FAANG, and some in-betweens) will terminate when they want to terminate. In most non-European labour markets that I'm aware of, there's a penalty for doing so, and the company just pays that penalty and gets on with it.

Sometimes there's more security than that, I've heard (but not experienced). And sometimes the company puts in large effort to cultivate the underperforming employee first (had that happen to me once; they tried and I tried but it didn't work out). But the overwhelming majority of cases of my first-hand and second-hand experience, dleslie's summary is about the whole story:

> The real interview is always the work you do


Most places I've worked at in the US have a 90-day probationary period. There's still red tape, but not as much. More common now, however, is to hire people on as contractors for 3 to 6 months. If you work out well, they'll fast-track you to becoming full-time without a second thought. Otherwise, your contract is up and they choose not to renew it. Which makes things less dramatic if there are issues.


Probationary periods are industry standard here in British Columbia.


Fair enough. It's to some extent a cultural thing (there's no need for explicit probation in the US since most employment is at will), but I too wouldn't necessarily like a probationary period, even though I don't foresee it actually being an issue.


In the US there often is a formal probationary period at larger companies which mainly accomplishes one thing: reduce the HR red tape if a new hire isn't working out. During the probationary period it's generally easier to make a case (i.e. little or no documentation needed) that 'they're not working out' and HR will be OK with it vs. after the probationary period, you typically have to 'document' them out of the company.


I'm in the USA, and this (probationary period) has been the case with every job I've had in the past 30 years. I've never heard of a company not doing this in fact.


Same here. Though many mid-size / smaller companies might not advertise this fact (their HR policies are often a bit more ad hoc than larger companies if they haven't been involved in as many labor lawsuits)... but pretty much if there's an HR department, the probationary period exists.


When done well it's great for everyone. A healthy employer wants you to succeed, after all your success is their success.

Thus, probationary periods can be a time of training and growth for the new employee.


I'm not sure how much this is considered, but in my state, which is an at-will employment state, being unable to perform job tasks due to lacking the knowledge or technical skill is explicitly defined as NOT a demonstration of cause for termination that would absolve the employer of their financial responsibility toward unemployment compensation.


This is how all hiring works by law where I'm from. It's fine.


Sounds like you may also be from a country with a decent social net along with it?


I guess you could say that.

I suppose you do sort of feel a little stressed during the trial period but I've never seen anyone fail it and it applies at every company, so there's no escaping it anyway. When it was introduced some people got quite upset but I can't really say I think it's had a bad effect.

I guess from the company's perspective, if they realize they made a grave mistake they can back out of the hire, but they are still very careful and rigorous in the hiring process just like always. It also allows the candidate to bail if they realize the company wasn't what it said it was. It goes both ways. Again, in practice it seems mostly harmless.

Perhaps the US wouldn't do so well with a similar policy, maybe even just due to the crazy healthcare situation going on over there. I couldn't say.


> I suppose you do sort of feel a little stressed during the trial period but I've never seen anyone fail it and it applies at every company, so there's no escaping it anyway.

I'm not sure what you mean by "it applies at every company". Getting hired and then fired a week later is virtually unheard of. This is not a fear I have, at all.

But if you told me it's probationary, that is totally a fear I'd have, I'd get paranoid, so I'd rather work somewhere else. You're basically telling me it's not a real offer in my eyes, and I should not expect stability.

> Again, in practice it seems mostly harmless.

It's extremely harmful in a place with poor labor protections that is the US, for reasons that I don't feel like expanding on and that you can educate yourself on if you wish.


By "it applies at every company" I mean it's part of the contract no matter where you work because by law the employer is allowed a 90 day trial period, so they all put it in the contract, so one offer is just as real as the next and it's just something you have to go through and yeah I guess it's probationary in nature.

Not everyone likes it or agrees with it, and I can only comment on the software industry here and not other industries but it's not the end of the world and the sky doesn't at all fall. When they introduced it a lot of people tried to make arguments like it would be abused etc and as far as I can tell there hasn't really been any drama. YMMV depending on country.


The best I had was a guy who asked me to bring in my laptop and show him some code. We talked about it, and he asked me to add a simple feature. I think it worked well for both sides. Not much of my time wasted. No gotchas from some configuration issue when setting up a project for the first time.


Yeah, I think 1-3 month probation is really the only way to do it.


I had perhaps the most amazing interview experience recently where it was an open discussion. It wasn't a technical drill down but more a Q&A where you discuss topics relevant to the area you claim to have knowledge of and how it relates to the job you're interviewing for. It was the complete opposite of code this technical problem and a missing semicolon will get you a flat out rejection.

What clicked was I was finishing their sentences and knew precisely what they were asking. It was an incredibly rewarding experience which led to a same day offer.


Sounds great for a first job out of college, not so great if you already have a good job.


As a junior I’d much prefer a whiteboard, where I just need strong CS fundamentals and reasoning, to a work sample where there are 10+ different dimensions I could be judged on: style, maintainability, whether I used the latest language features, how I solved the problem (use libraries or write it from scratch?). There are just way too many variables and too much potential bias in the judges, versus "did you correctly turn the problem into a DP algorithm?"


When I was a junior, I didn't have strong CS fundamentals! But I had about 5 years more practical coding experience than most juniors.

Ideally you would be given guidance on what they will be judging you on.


Hah, I had a sample test that I had to do after first being met on-site, which I thought was great.

Then they said my work sample was amazing, and they’d like to do an on-site Q&A about it, but when I arrived the engineer hadn’t even seen my work, and proceeded to just quiz me on obscure JS trivia.

Which I promptly failed.

They then rejected me even though they were happy about my work :/


In addition to covering all interview expenses such as travel, food, and lodging, candidates need to be paid for their time interviewing by the companies interviewing them.

A six-hour take-home test is fine, but I want $1350 for that in advance as a consultation fee, and if I hit 6 hours and it's not done yet, they can keep paying until I am done or we can just end it, no refunds.

The idea that I should spend six hours doing free programming for random companies that say they are desperate to find anyone qualified is absolutely absurd and insulting. No one should put up with that, particularly anyone with an established and verifiable career.


It's even worse when you're a consultant. At the moment I'm faced with scheduling my fourth round with a company, and of course it's 5 hours onsite. I have the option of skipping the free lunch, they said. It's nice of them, but my monetary loss for that time (including commute) is enough to pay lunch for a number of people. All that, and they can still can you a few weeks or months into the job if you're a dud.

It's employment, not marriage, guys; lighten up with your strenuous and time-draining processes. We senior employees aren't as excitable (is that the word?) as the entry-levels about joining your workforce.


That does sound excessive. I've never had more than 2 rounds of interview, and even at that the first round was typically over the phone and the second on-site.


This is pretty silly. Candidates routinely spend integer multiples of six hours running interview gauntlets for tech companies that are notorious for negging candidates. None of them expect to get paid for interviewing. An at-home work sample challenge is strictly less onerous than an interview gauntlet, but because it has the appearance of something people have heard other people get paid for, it's commonly suggested that they should be paid, too.


why are you even interviewing for these "random companies desperate to find anyone qualified" if you are so disgusted by them?

The elitism of some engineers is mind boggling, instead of being grateful working in an industry that has so much demand that you can easily find a job at anytime, you complain about the process being insulting to your oh-so-important persona.

If I really want to work for a certain company because what they are doing excites me, then yes, I'd do a 6-hour take-home test where I can probably also learn a thing or two, and I am willing to bet a lot of other developers would too. As an interviewer, someone charging >$1000 for a take-home test would be an immediate red flag, and our mindsets probably don't match up.

If this works fine for you, kudos.


It's good to see a full gamut of opinions on here. I'll say that mentalities like yours are why I'm a contractor in the first place -- you should be grateful for the opportunity, etc and so forth. No. I exchange my time for your money. If that's entitlement to you, we come from opposite worlds.

I'm not the guy who says he should be compensated for take-home tests above. I am however a senior engineer. I'm spurning any long interview processes because they cost money. I don't think it's entitlement, but if that's what you want to call it, so be it. It's called hours worked, hours paid, and in the West it's been a concept since at least the 18th century.


They cost money for both parties though so in that sense it's fair.


I agree with the statement "they cost money for both parties" but I disagree with "in that sense it's fair."

Individual consultant: loses 5 hours of interview time (and commute time) or take-home exam. Let's call it $800 for the sake of argument.

Company: loses 5 hours of interview time, plus the time it takes to "quiz" the exam.

Individual loses money that he / she uses to pay their mortgage.

Company loses profit because the time spent interviewing the candidate could have been spent working on feature Y of the application. So shareholders / VC's lose.

So you're saying it's fair that the individual contributor who loses half a day's pay in your interview process is equivalent (you DID use the word "fair", so that's an equivalence argument you made) to a company's loss of a few hours out of the many thousands of man-hours they rely upon? It's 0.0001% of their profit, assuming the employees don't work a few hours more to make up for the lost time, because they will (they're salaried!)


Consultants also do not get paid vacation, or paid sick leave. They also don't get any of the benefits that a lot of regular employees get. But that's part of the game, they have to account for all of that which is why they earn much more per hour. A lot of those "customer acquisition" tasks cost time and may not necessarily yield a return, but that's part of the extra risk consultants have to assume and why not everyone is willing to do it.


Maybe all true with regards to cost considerations, but it doesn't support the notion above that a candidate's wasted time and money (vacation time and sick time costs money for an FTE) and a company's wasted time are somehow .. 'equal'.


That sounds like disorganization and poor internal communication, two traits of the organization probably not just evident in their evaluation of you. You may have dodged a bullet.


I don’t have an opinion on whether or not it is done synchronously or asynchronously and I hope I didn’t imply work-sample = take home test.


Gotcha, yes that makes more sense. I find a lot of unicorns are already doing this sort of interview, anyway. While I have never observed studies, it seems to provide better signals than white-boarding.


Well, I'm actually willing to spend time on a problem, because it is interesting and I can do it in a fun way and learn something new.

But I have a zero-tolerance policy towards puzzle whiteboarding tests. They are a total waste of time.


>It does always amaze me that literally what you learn in HR 101 in an undergraduate management curriculum is controversial. The r^2 of a ton of different methods for predicting job performance is something large companies are highly incentivized to get studied by academics (and they do).

Large companies are not "highly incentivized" to optimize their hiring process. Once a certain throughput is achieved, there's very little reason to revisit it, because if they revisit it, "they're lowering the bar", or not "optimizing for recall" or whatever.

Mindlessly copying large companies isn't all that useful. Microsoft famously loved brain teasers. Can't get the fox, chicken, and grain to the other side of the river? You must not be able to code either. Google loved asking for your SAT scores because "obviously" some arbitrary score on a standardized test, taken a minimum of 6 years earlier, certainly means something today.

Large companies aren't immune from bullshit. In fact, they have a habit of metastasizing bullshit, because of "Well, X does it, so it must be good." X only does it because they had a stupid idea, became successful by completely unrelated means, and then fooled themselves into thinking their process was good: "Well, I've been hitting candidates with a ball-peen hammer for years, and I'm successful, so screw you."

Interestingly enough, eventually both Microsoft and Google abandoned these interview questions, because eventually, they realized that one had nothing to do with the other, but only after years doing it, and others copying them.


Microsoft and Google do not collectively hire that many people and are not who I am talking about. “Brain-teaser” type questions are explicitly not work-sample tests and, yes, have no correlation with job performance.


The number of people that they collectively hire is irrelevant. They have an outsized influence on interview practices industry wide.


IQ does correlate with work performance, and the SAT is an IQ test. Brain teasers are an attempt at seeing how well/quickly your brain works to solve problems, i.e., a rudimentary IQ test.

The problem is that requiring IQ tests for employment introduces liability that employers do not want.


No. None of this is true.

Neither IQ nor SAT scores correlate with job performance. SAT scores were requested at Google for years. Explicit aptitude tests have been used in the past, and continue to be used. They are quite legal, as long as they are used for their intended purpose.

https://www.forbes.com/sites/theemploymentbeat/2014/03/07/em...

https://www.shrm.org/resourcesandtools/tools-and-samples/too...



Allow me to provide a link from the NY Times article your shared.

The TL;DNR: Nuh-uh!

https://www.nytimes.com/roomfordebate/2011/12/04/why-should-...

In the future, you really shouldn't link to dueling articles in an opinion section. It makes this all too easy.


One of those opinions is from a professor of psychology, the other sells test prep services for a living.


SATs are IQ tests? A large portion is math, and the rest is written language, both taught disciplines. You obviously haven't taken a real IQ test, which is more abstract and has a large portion of spatial-geometry-type questions.



And isn't one large chunk of the SAT a written language test? That's going to suck for dyslexic / neurodiverse candidates.


At one point, MENSA would accept a high SAT score as a reasonable proxy for a high IQ.


HR isn't exactly known for rigorous scientific research, and I don't recall it being HR that introduced this trend of hazing candidates.

Also, have you got any examples of the research that actually proves this correlation? I haven't heard of any, and I would have, as I have spent decades semi-involved in IR (industrial relations) in the UK.


Hazing candidates is mostly negative reinforcement introduced to keep people from wanting to go to interviews and change jobs. It also selects for those willing to do a lot and put up with a lot for a corporation.


> highly incentivized to get studied by academics (and they do)

Why do you think they are incentivized to get studied? If anything, power dynamics in large hierarchical organizations keep away any studying that could threaten the power. Sometimes studies do happen, but a large organization can never apply the results to anything at their scale. They are more worried about where to even get such a massive stream of professionals to hire.


You’re getting too philosophical here for something that isn’t that complicated. This isn’t related to power dynamics. Asking someone if they can flip burgers is obviously not as effective as asking them to flip a burger and seeing if they can do it correctly. Getting employees who are unquestioning robots with no ambitions for power is a different interviewing issue.


"Everything in the world is about sex except sex. Sex is about power." -Oscar Wilde


Totally agree. I suck at interviewing and freeze up on the spot. Even on-site coding tests I'm not great at. Give me the time to sit back in my own comfortable environment and I'm confident I can solve 95% of coding problems (and reasonably quickly).

I've been involved in a lot of hiring and I always advocate a choice of a couple of fairly in-depth (maybe 4-6 hours) take-home coding problems. I think people should be paid for their time on these as well.


I’ve got a friend who is a lot like you. I’ve worked with him, so I genuinely know he’s good, but he sucks at interviews. He freezes up and can’t answer simple technical things, things he knows and has successfully done often. Some people simply suck at interviews.


I think practice helps with the "some people simply suck at interviews" problem.

If you practice at home after work, and try with other companies before applying to the ones you really want, you will improve your interview skills by a lot.


Absolutely. Pretty much everything we do can be improved with practice.


An equivalent problem: trying to find the best chess player through friendly conversation about chess.

We could talk all day, but I could not pick Magnus Carlsen over any other professional chess player in conversation. Worse: I can't even differentiate someone below my level from grandmaster level if they are huge chess enthusiasts.

If you want to find programmers who are better than you, or rank people close to your level, only the actual work shows the differences. I can recognize someone who is a better programmer than me only from the code they write.


Reading https://sockpuppet.org/blog/2015/03/06/the-hiring-post/#work..., it looks like your process is better than most!

However, I've walked away from interview processes that require a work sample that'll take more than a few hours or require lots of domain knowledge to get started. It just isn't worth my time, especially as a barrier to entry before talking to anyone.

Things I am happy to do to see if I'm a good fit:

- Pair with a developer on the team. I learn just as much about the team I'd be joining as the hiring company does about me.
- Talk through code that I've written: how I came to the implementation and how I might refactor/enhance it further. You can't really bullshit this without the interviewer being a nit.

Things I'll do as a 1099:
- Fix a real bug
- Refactor real code
- Do a project that isn't trivial, start to finish


I recall an interview I was brought into early in my career, to talk with someone being brought in to be a senior architect... in my mind this meant the dude must be crazy smart, right? Everyone in the room had basically made up their minds and loved the guy. I gave him a variable like

a = [1,2,4,8,9];

And asked if he could write a loop and print out the values of each element. He embarrassingly could not.
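For reference, a passing answer is only a couple of lines. A minimal sketch in Python (purely illustrative, since any language was acceptable; the snippet above is closer to JavaScript):

    a = [1, 2, 4, 8, 9]

    # print each element of the array on its own line
    for value in a:
        print(value)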

I always ask this question, then talk, and then have everyone solve a problem, a simplistic one, but it always involves an HTTP request and some loops...

I like to see how people type too... if you can’t type it’s usually easy to tell if you spend most of your time with a computer or not...


> I like to see how people type too... if you can’t type it’s usually easy to tell if you spend most of your time with a computer or not...

I disagree. When nobody's looking I type pretty fast, but when the whole interview panel is looking at my screen I suddenly forget all the shortcuts and mistype a lot due to anxiety.


I work with a fantastic developer who types with his index fingers only, hunt and peck style. It boggles the mind.


Being a fast programmer has nothing to do with how fast you type.


But are architects supposed to write code? The example you give is embarrassingly simple and anyone should be (expected to be) able to write code for it. But aren't architects designing architecture? I don't think the architect or anyone in his team at my previous organization's IT dept could write code, but he could design system architecture.


How can you design something that needs to be implemented in code if you yourself can't write code?

If you can't produce in the trenches, what makes you qualified to dictate what those that are in the trenches are doing?


This reply applies to several here.

I know a few architects (as in for buildings and civil structures) who wouldn't be able to pour a foundation, frame a wall, or run plumbing properly. They're working at a different level of abstraction and are concerned with different problems.

Or to put it another way, if you can code does that mean you should be able to design a CPU, even a very basic one? After all, how can you write a program if you don't understand how it works at the instruction, transistor, etc level?


I think this is a linguistic strawman.

Civil Architect is to Builder as Software Engineer is to Datacenter Technician. Software architect is to software engineer as partner at an architecture firm is to architect, or something.

You don't expect a plumber[0] or a building constructor[1] or a civil engineer[2] to become an architect[3]; they're different degrees or certifications. A software architect is normally considered a software engineer who works at a grander scope of design. That doesn't map correctly.

[0]: http://dlca.vi.gov/businesslicense/steps/mppcrequirements/

[1]: https://bc.gatech.edu/master-science-building-construction-a...

[2]: https://arch.gatech.edu/bachelor-science-architecture

[3]: https://ce.gatech.edu/academics/undergraduate/bsce


Civil Architect is to Builder as Software Engineer is to Compiler.


Exactly (from experience working at a top-5 consulting engineering firm): an architect is nothing like a chartered civil engineer.


I wouldn't expect a building architect to do a great job pouring a foundation, framing a wall, or running plumbing. But I would expect them to be able to do it, at least to a basic level.

Architects that don't understand how to do the fundamentals of implementing the designs they make tend to make unrealistic designs, that are expensive to implement and may look pretty, but function poorly for the owners and occupants.

This applies to architects of buildings, architects of software, and architects of hardware. Probably to most high level design and supervision type positions.

Edit to add: many of the best architects of buildings are able to do amazing things specifically because they've studied the fundamentals and are able to do innovative things with materials. Of course, time will tell if the innovation worked out well or not.


> I wouldn't expect a building architect to do a great job pouring a foundation, framing a wall, or running plumbing. But I would expect them to be able to do it, at least to a basic level.

waaaaat? That is completely utterly 100% unreasonable. Unless you mean that you expect every adult member of society to do those things?


I agree with GP. If the civil architect doesn't know that you need to put up plywood to pour the concrete in, they will make impossible-to-build designs.

And I'm not saying that everyone needs to do everyone's jobs. But you should be able to do at least a bumbling job of the layers of abstraction you work on top of if you want to make decisions about how those things should work.


If an architect who hasn't coded in 5 years or more dictates technological decisions to you, when they are out of touch with current best practices, how do you bridge that gap?

Building codes change. Architects are required to design buildings according to the new codes. There is no such thing in software.

When I first started in industry, storing plaintext passwords in the database was what everyone did. We have moved on from that. I've worked on J2EE web apps where the EJB tier absolutely had to be on a separate server or servers. Times have moved on from there. I've met very few "software architects" that have kept current without writing code for at least 5-10% of their job.


Times do change, but it doesn't mean you need to know every single detailed implementation. If the standard practice was plain text, and then became hashing... the architect dictates that passwords need to be hashed. They don't necessarily need to know how to code it.

In my company my 'enterprise architect' oversees at least 40 applications in my space, across various languages and platforms (and he works in other spaces too, so I can't give an exact number). I need him to give good direction on which apps are needed, how apps talk to each other, and certain detailed design. He also pitches to management and the business to defend how/why we do certain things.

The last thing he would be required to do is remember how to write basic code (though I would hope he could take a swag at it).


Most software architects aren't working at a level of abstraction that's as much higher as that. They're making decisions like "we should use this framework" or "we ought to have 3 9s of reliability", which can't be meaningfully analyzed without knowing the details of how the coding will be done.


> Or to put it another way, if you can code does that mean you should be able to design a CPU, even a very basic one? After all, how can you write a program if you don't understand how it works at the instruction, transistor, etc level?

You'd be surprised, but this is what computer engineering actually is. From p-n junctions and transistors and all the way to distributed systems.


Difference of terminology I guess, but I worked at a Pharma company and I am talking about the Enterprise Architect, so maybe not the same kind of architect you are talking about. He and his team made decisions related to tech stack. They never made software design decisions. Over at the Pharma company IT higher ups weren't technical anyway.


If he can architect worth a damn, he should be able to code his way out of a paper bag.


I don't understand how one could possibly design a system that they couldn't even begin to implement. Not framework/tool-specific stuff, but can't write code? At all?


I'm not sure if this is a common sentiment, but wouldn't an architect who does not know how to code have a very hard time convincing the other engineers (seniors, but in particular juniors, I guess) to follow their designs? I guess it is possible to fake it for a while, but only for a while...


I don't expect my architect to know the ins and outs of Go; I expect them to have a strategy that is forward thinking a few years out. Plans for cloud integration, CI/CD requirements, updating legacy systems to be containerized, managing Kubernetes clusters, etc. They should know the pros and cons involved. They don't need to write a for loop.


Hi, author here, thanks for your feedback. I think you might be right, thinking about rewriting that paragraph a bit.

For me it's been really obvious in interviews whether someone knows what they're doing or not. But probably I've just been lucky in this so far.

Looking forward to reading your post about it. Haven't read it yet, but if it's not in the post, could you give a few examples of what people said, that turned out to be bullshit? And the other way around, too? Not questioning you, just genuinely curious.


I think your key point is less that testing is easy than that you should let the candidate prove their worth in their own way. This sounds key to me. Your original quote is… yeah, “easy” happens, but it’s not always elementary.

Testing a candidate is actually very candidate-dependent, I found out; however, it is a lot less hard when you let the candidate pick their own approach (and you supplement it subtly). I found it easier than most interviewers who tend to do scripted, one-size fits all tests.

I always ask candidates to talk about what they’ve done (I’m in data science, so university projects typically can be relevant for day-work; standard CS might not be so lucky). If there are omissions, ellipses, I might prod a bit.

The best candidates can easily juggle between business objective, method, architectural and technical choice, how they scaled a release into marginal improvements from prototypes to a full-fledged platform. It typically takes me one question, two follow-up and 10 minutes to know they are strong. From there, I can shelter them from more technical tests, or at least apologise for the process; ask if they are comfortable with white-boarding, etc.

The worst candidates are not that different: they can sweet-talk their way, but might not be able to scale abstraction up or down. You can probe whether they are comfortable receiving feedback (based on how they react to you re-framing the question) and their learning style (based on how excited they get when I ask for details about a specific implementation).

If you are uncertain, which is still the case for most candidates, I like to put them in front of what I have been working on recently. It gets Legal a little nervous (it really shouldn’t) but it puts them in a rare but effective situation: I might not know as much as they do. I probably have had more time to explore, but I can’t dismiss any idea that they suggest: either I had the same one, or it’s new to me and worth trying. I’m not the most structured worker, so I often have a bug, something that could be improved but isn’t a priority, some high-level conceptual concern, etc. I make sure that they go for something that excites them, or the problem that I identified earlier.

I rarely need more than an hour to have a clear idea, including data preparation.


Hey, thanks for your detailed response. I agree with a lot of what you said, really helped me put my thoughts together.


I'm really not comfortable talking about specific patterns of questions and answers that turned out not to be predictive, because I feel like they'd be traceable (or at least appear, to the people involved, traceable) to particular candidates.

Happy to answer any other questions you might have about our erstwhile process at Matasano (which is more or less our current process at Latacora).


Right, totally understand.

Thanks for the offer! I'll read your post some time tomorrow and might send you a PM, if that's ok.


I have recently gone through interviews at a few companies, and in almost all of them I was allowed to skip the first small technical test. I have a strong online profile [0][1][2] so I was very glad not to have to spend several hours doing some simple fizzbuzz. Don't get me wrong, you could say that I even enjoy doing these small puzzles, but with a fulltime job and trying to choose between several companies I just didn't want to spend a lot of time on that now, and I'm glad they all skipped it.

However, later in the process they all had a more technical test with people. If there was no technical test at all, I'd feel the companies didn't know what they were doing and were hiring people who didn't know how to code; but if there was too much BS initially, I'd feel that they also didn't know what they were doing, in that they were missing out on senior devs with many options because they were making it difficult. This just seemed like the perfect balance.

[0] https://francisco.io/

[1] https://github.com/franciscop/

[2] https://stackoverflow.com/users/938236


> spend several hours doing some simple fizzbuzz

When I see people mentioning fizzbuzz, it's always to filter out people who really can't program. If you just give me ten minutes to solve a fizzbuzz-like problem, that should be plenty of time for me to solve it given that I do know how to program, and just that single test would be enough to filter out those who don't. I don't see why it would have to take several hours.
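For anyone unfamiliar with it, the whole exercise really is tiny. A minimal sketch in Python of the classic version (assuming the usual multiples-of-3/5 rules):

    # fizzbuzz: multiples of 3 print "Fizz", of 5 print "Buzz", of both print "FizzBuzz"
    for n in range(1, 101):
        if n % 15 == 0:
            print("FizzBuzz")
        elif n % 3 == 0:
            print("Fizz")
        elif n % 5 == 0:
            print("Buzz")
        else:
            print(n)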


Fizzbuzz is an example; I've never seen it on its own. A company doing a pre-screen test would normally set 2-3 small tests, the easiest one fizzbuzz-like, for a total of 1-1.5 hours.

Multiply that by the number of companies you are interviewing and that can easily be 8-15h in total. I'm glad I didn't have to do that.


I think this might be the best thing I've read on hiring.

As an engineer, I love work-sample tests. And I also like the insight you had, that a test should test code-reading as well as code-writing.

Some people complain about the amount of time that they take, but if you're good at what you do then you should only need to complete one work-sample. Whereas you could easily need to attend many interviews that accurately assess your skill. Some companies even have many-stage interview processes, and these definitely take longer than creating a work sample.


The most annoying thing is when you show yourself perfectly capable of doing the technical challenge and still get passed over in the interview because they completely ignore the results and/or then check your ability to script on a whiteboard. I’d say the work sample is a waste of time if I didn’t view it as a learning exercise.


I couldn't possibly agree more. The current cargo-culted trend of take-home problems is a scourge. If you can't hire almost directly off the results of your challenges, you're wasting time and shouldn't do them at all.


> The current cargo-culted trend of take-home problems is a scourge

I hire based on take-home problems. But then again I don't have candidates whiteboard.

I give candidates time to produce something in the comfort of their own environment that goes beyond solving puzzles on the fly and is actually a reflection of the work people will do day-to-day.


I think something people tend to miss is when these problems are appropriate. If you are hiring from a pool of recent graduates for a junior position, or any such situation where you have plenty of applicants for the position, then this might be a good idea as a first step.

For more senior people this kind of test might even be more appropriate, because they are not always working on exactly the same type of thing in recent memory but in their own time will come up with a good solution quickly. However, for more senior people this type of test should certainly not be one of the first steps of the interview process.

You should ideally have much shorter and simpler technical tests during the first interviews, to try to weed them out while also giving them exposure to what kind of environment they can expect to be working in, but also tell them this kind of task will be coming and why you are doing it. Once they're more committed to actually wanting the position, they will be more likely not to mind spending their personal time doing this kind of work exercise, or you could even offer to pay for their time on it to show you are not the kind of company that expects them to do work for free.

If you don't do this, you'll only come across more senior people who are desperate enough to jump through these hoops, which is rare, and you had better know for sure why they are desperate.


>I give candidates time to produce something in the comfort of their own environment that goes beyond solving puzzles on the fly and is actually a reflection of the work people will do day-to-day

100% this. I find whiteboarding to be the actual scourge, as it involves unrealistic pressure in an unrealistic scenario. It reminds me of tests that inadvertently just test whether people are good test-takers versus assessing their command of the actual subject matter.

I believe devs who do hiring and are also good at whiteboarding have a bias (even a blind spot) where candidate whiteboarding is concerned. They oddly believe it to be a prerequisite for a job that infrequently or never requires it.

As such, they may miss out on good devs who simply thrive in an environment more akin to reality.


I see whiteboarding as a skill that is very useful in presenting or defending design ideas. I've had several cases where I used a whiteboard to lay out a design interactively to an audience in a design review. I find I am at my best when I have a marker in hand, as it gives me control of the conversation.


Yes, but that's a different application of white-boarding.

In the case you describe, you've already done the more "valuable" dev work of thoughtful analysis/design, and you are merely presenting that work.

That's far different from being forced to devise that design at the whiteboard, singlehandedly, with onlookers.

That's not to say that devs don't hash out designs collaboratively via whiteboard (yet a different application). That certainly happens and is useful. But, you're likely to find that--even in those situations--team members have considered the problem offline prior to the collaboration.


> Bullshit. Sounding credible in technical interviews is a skill, not the same skill as actually being a good programmer, and might even (statistically, in the large) be close to orthogonal to it.

I don't agree with this. I think the issue here is that your interviewers aren't going deep enough in the conversation. If you ask someone to tell you about some interesting technical project they worked on and they give a good high-level answer, push a level deeper. Still get a good answer? Push another level deeper. Repeat. At some point one of two things will happen: the candidate is capable of talking about the project intelligently at a lower level than the interviewer, or the reverse. If the first happens, re-think who is doing the technical interview; if the latter happens, you've found their level of competence.


Strongest possible disagree. There are people who can talk all the way down to the point where they're basically dictating code to you, who nevertheless cannot deliver on a real project. There's a difference between knowing, intellectually, how to solve a problem with code, and actually being able to deliver working code.

In practice, though, what really seems to happen is that technical interviews are split between whiteboard-grade programming puzzles and "friendly deep discussions" about technical problems that both (a) can be prepped for with rote memorization and (b) are extremely game-able by people with strong interviewing skills (for instance: the skill to respond to questions with questions, or to redirect the course of a line of questioning back to safe ground for the candidate).

It's not like we were asking candidates "hey, how did you enjoy your last software security job?". We were doing something closer to "here's a piece of code with a heap overflow in it; talk us through how you'd write an exploit". It didn't work: there were people who could answer those questions indistinguishably from an exploit developer who nevertheless couldn't write a 1995-era stack overflow exploit if their life depended on it.


> It's not like we were asking candidates "hey, how did you enjoy your last software security job?". We were doing something closer to "here's a piece of code with a heap overflow in it; talk us through how you'd write an exploit". It didn't work: there were people who could answer those questions indistinguishably from an exploit developer who nevertheless couldn't write a 1995-era stack overflow exploit if their life depended on it.

Something about this example is not registering for me. I can't imagine how this situation is possible. Going from a detailed, step-by-step description to a code implementation is a trivial rote task in my mind. I must be missing something.


I mean, I've seen very technically proficient devs fail to deliver because there are too many cooks in the kitchen, or because they drown themselves in tooling or bikeshed their project to death. Maybe something like that happened?

There's definitely tons of senior devs out there that care way more about linting and unit testing than they do about writing the code that implements the spec.


Nothing hit quite as hard as the punch to the gut from the bit about linting and unit testing... spot on. I've already decided to move on and such, but my current company is so this that it hurts.


I don't have a perfect theory for it! It's just something I've repeatedly seen happen.


Happened to me. I hired someone onto my team who did great during the interview. The questions weren't puzzles, but live coding sessions. He got all the way to the point at which he had written a semi-functional web server.

This guy couldn't deliver a single project and would spend days debugging issues before he asked for help and I or someone else on the team would solve it in a couple of hours or less. I have no idea why this happens, but I've seen it twice.


> Strongest possible disagree. There are people who can talk all the way down to the point where they're basically dictating code to you, who nevertheless cannot deliver on a real project. There's a difference between knowing, intellectually, how to solve a problem with code, and actually being able to deliver working code.

I'm a little confused by this statement. Is your issue that they aren't _capable_ of writing the code, or that their job performance isn't there once hired? i.e. Are the technical chops not there, or is the work ethic not there? It seems like you are describing the latter to me, or we are talking about different levels of conversation.


Being able to talk intelligently, in depth, and at length about code is not the same skill as being able to write or debug code.


I haven't experienced this, not if you are digging into detail on how they solved previous issues. I'm not saying it doesn't exist, but I've never dug into detail and had coding be the problem. If you leave the conversation at a conceptual level, then I can see it happening much more often.


I'm not OP, but when I've experienced this, it's very similar to something like "I've seen a million heist movies, but absolutely could not steal shampoo from a CVS". There's a very big difference between talking about doing something and actually doing it; there's a level of executive function that's required.

More specifically, you make so many small executive decisions when coding. Naming, how functions and classes are designed, which file/module this goes in, is this already tested, oh I found an unrelated bug, do I make a ticket, a new PR, or include it in this one, etc etc etc. The only thing I've seen that can answer those questions is homework--maybe previous work, but homework is better.

But I think you're remiss if you're not asking your senior engineering candidates how they make those decisions, not because the answers to relatively trivial questions are important, but because you should know--and they should be able to articulate--how they make those decisions, and they should know the pros and cons of their approach and other approaches.


> It didn't work: there were people who could answer those questions indistinguishably from an exploit developer who nevertheless couldn't write a 1995-era stack overflow exploit if their life depended on it.

Do you think the problem was that they couldn't write that particular kind of code, or was it the more general problem of not being able to code well in general?


If you've never been around real bullshit artists it's hard to imagine, but there are people who can talk the talk so well and say all the right things, and yet their actual programming is terrible.

There is also the opposite: folks who are not great at verbalizing programming principles, but do deliver solid code in reality.

In summary, hiring is hell...


Bullshit artists are honestly easy to spot. They are easy to trip up because they think they are good at fooling people. If they said all the right things, the questions were probably low quality. I would throw in a zinger to see how they react to it. I honestly expect a candidate to say "I am not familiar" or "I'm not sure, I'd have to look that up" at least once during an interview.


Well... There are different kinds.

The people I talk about aren't even deliberately fake. They really think they're awesome. They can say all the right things. They're charismatic and sound real smart. They have great anecdotes, dripping with wisdom.

It's just that their work is mediocre at best.


> The people I talk about aren't even deliberately fake. They really think they're awesome.

I guess you don't have to be good at fooling people when you believe it yourself.

(cf. Theranos)


What happens to every candidate who gets flustered by the questioning and seems to bow out before reaching their real technical depth?

How do you hire people who are more technical in a subject area than your current staff, if your test bottoms out at the depth of the current interviewer in the room?


> What happens to every candidate who gets flustered by the questioning and seems to bow out before reaching their real technical depth?

I'm not sure I get this question - I'm not talking about a white-boarding exercise. I'm just talking to the candidate about a technical project they have worked on in the past, the pain points, how they resolved them, etc. If that results in the candidate being flustered it's not a good sign on numerous axes.

> How do you hire people who are more technical in a subject area than your current staff, if your test bottoms out at the depth of the current interviewer in the room?

That is a _very_ different hiring situation. If you have no one in the company who is equipped to properly technically interview the candidate, then obviously you cannot properly technically interview the candidate. You'd likely have to leverage the connections of the management team to find the right person to lead up unknown territory.


One of the very best people we ever hired was so flustered in his final interview with us (a formality, after work-sample challenges) that he was visibly shaking during it. If we had taken the "numerous axes" on which this was "not a good sign" seriously, we'd have missed that hire --- and, I think, an entire business unit in our firm wouldn't have been started.

I generally think software developers (I count myself among them) know far, far less about psychological assessment than they think they do.


Of course that happens, but you're taking a real risk in hiring someone who can't get through the interview. I'd rather miss out on a good hire than make a bad one. You only have so many ways to determine whether or not someone is competent.


I don't know what to tell you. We switched to pure work-sample hiring and hired several dozen people that way. We retained all of them. I can't say that about our interview process prior to that: not only were we not picking up the "moneyball" candidates I'm talking about here, but we were also hiring people we (painfully) ended up having to let go.


The only time people face the specific pressure of an interview is in an interview. I promise you, whatever generalization you're making doesn't hold.


My point is that you can't easily discern why they are performing poorly. All things being equal I'll take the person who could answer the questions.


I have an anxiety disorder, so in depth conversation with a stranger is going to cause me to be flustered. If you asked me to give the answer in writing I could. If you asked me to code on a whiteboard I could. If you asked me to do a takehome I could. But a verbal conversation with a stranger judging me in an interview setting is the worst out of all possible options for me.


100x this. Whiteboarding is so outside the typical engineering process that it just feels cruel, especially to people with any kind of anxiety. It's like if you were hiring construction workers and you asked them to perform an interpretive dance about concrete in the interview. It's just not at all the job and very difficult for many of the people who go into programming in the first place. It's hard for me to think of it as anything other than hazing.


> If that results in the candidate being flustered it's not a good sign on numerous axes.

Are you hiring sales reps or engineers? Why are you asking for an engineer to sell you on their past project(s)?

Interviews by their nature are about judgment, and knowing you're being judged by a complete stranger who holds the power over this potential path of your future is enough to get anybody flustered, especially a brainiac that just wants to work in relative peace behind their desk.


> Are you hiring sales reps or engineers? Why are you asking for an engineer to sell you on their past project(s)?

I don't ask them to sell me, I ask them to dive into the technical issues they faced and how they solved them. It tells me a lot more about how they will handle similar issues in the future than scribbling DFS on a white-board would.


Just to clarify, when you say "how they would handle similar issues" do you mean talking to a hiring manager about a past project during an interview setting?

I don't understand what you're calibrating for...


I'm a bit unclear on your question. I'm assuming a technical person with relevant knowledge performs the technical interview, if that is what you mean.


The interview setting, with someone who might or might not be technically capable, with your job / career / earnings at stake is a very different situation than explaining past projects to a colleague at work.


Programmers in a company work with other programmers. If they cannot explain some work they've done then they are useless. Or would you hire a programmer who does not speak English?


I think what the OP is hitting is the difference between being able to explain something and being good at it.

With a deep enough interview process you’ll find people who know what they are doing. But it won’t tell you anything about their practical coding skills; it won’t matter if you are hiring an architect, but if it’s for a main developer you’ll want a decent level of productivity, proficiency with the tools, application of best practices and reviewing skills.

Those aspects are a lot easier to understand with actual code produced.


There have been instances where I have forgotten vocabulary that makes me seem knowledgeable about a project, but I have nevertheless done the work. I sound incompetent but I actually am competent.


IMHO you're comparing two different selection processes.

The process you tried was to spend loads of time on carefully designed interview questions which then get used to filter out candidates.

The OP was not suggesting this approach, but instead suggested engaging in a friendly conversation about technical interests and recent projects.

I can see why the first approach might fail, if only because it isn't that difficult to memorize several answers for well-known questions and topics.

However, to hold a conversation on a particular topic would require a greater depth of knowledge, so I can see how this would be much harder to fake.

I suspect the second approach is a lot like how police catch out people who are lying. As the number of lies grows, it just gets harder and harder for the individual to keep all the lies consistent and coherent.


You’re assuming the friendly conversation doesn’t have a structure that shows viable experience. Get somebody to geek out in detail about something they worked on and you’ll be able to tell very quickly if this person legitimately knows their stuff or not. You also have to be able to translate and validate this type of conversation though.


How do you get senior candidates to spend any time and effort on submitting "work samples"?

I'm a senior engineer. Last time I interviewed, I received multiple offers from brand-name employers. None of them required any work samples, just hard technical interviews.

A few times I saw a small unknown company that piqued my interest. I talked to them. Occasionally one would require me to do some sort of a task.

Why would I do it?

I don't need to do any of that for the top employers in the market. Why would I spend hours of my time, and precious coding time, especially on some small company I never heard of before?

I haven't done that in years, and doubt I'll ever do it again. The only way you could get me to do it is if I was desperate for a job.

By requiring this sort of time and effort from candidates, you are screening out the best ones, those who can get multiple offers from great employers without jumping hoops for you.


This seems extremely off to me - from my POV, who the hell has time to study algorithms and implement data structures every time they need to find a job? Making sure I can whiteboard a random CS problem, from a massive set of potential ones, on the fly in front of observers is drastically more involved and requires much more time, and it serves no other purpose than facilitating interviewing, versus just spending an hour of my time programming a solution to a problem at home.


I think it may depend on how much your day-to-day work involves data structures and algorithms.

Almost any work on compilers will involve good working familiarity with graphs

Work with concurrency will tend to involve linked data structures, like treiber stacks, or modeling computations as directed acyclic graphs of data dependencies

Working in a memory constrained environment tends to involve data structures and algorithms tailored to that domain, etc.

Now it's totally true that many algorithms questions just reduce to brain teasers, or are so esoteric and interview-specialized (Find out if this linked list has a cycle in O(1) space!) that they only serve to stoke the egos of the interviewer. But I wouldn't say that using algorithms or implementing a data structure is a skill that only exists to facilitate interviewing - and being able to explain your thought process as you work is really important when making contributions to other teams' code. I try to focus my interview questions on miniature versions of problems I've actually had to solve in the course of my job. For example, the futex state machine I wrote here:

https://android-review.googlesource.com/c/platform/art/+/810...

If you subtract the interrupt and timeout handling, you have a real problem with a simple, approachable (for someone with some concurrency knowledge) solution, and you only need to know about two futex calls which have simple behavior and are easy to explain in limited time. I wouldn't ask this question of someone looking for a web development position; but then again, I don't interview those candidates because I know nothing about web development. But it's a nice question for someone who listed concurrency on their resume.


Most of the top employers in the market have similar types of interviews, so you have to study anyway unless you want to rule out all of them, and any time spent preparing for such interviews amortizes across all of them. Meanwhile, a take-home test is almost always a one-off, takes much longer on a marginal basis, and is rarely even used to make the final decision - every time I've done a take-home, I've also had to go through the full interview loop.

Also, "an hour of my time programming a solution to a problem" is likely not an actual work-sample/take-home - it's likely a replacement for a phone-screen that doesn't require an engineer to conduct, thus can be given to many more people to keep the top of the funnel large. The only time I've done a real take-home in less than 10 hours, it was rejected for not being polished enough.


> The only time I've done a real take-home in less than 10 hours, it was rejected for not being polished enough.

Because you were competing with folks who were unemployed and probably invested 30+ hours in it.


Not sure what you're arguing.

I'm fine going through tough technical interviews.

I'm not fine spending 3-5+ hours on some trivial coding task that can be gamed and was issued to me by folks who may not have even thoroughly checked my resume.

Given the current supply/demand curve, and the fact that top employers like FAANGs do not require this sort of time and effort (and for a good reason), I don't have to jump through that hoop.


I think what they're arguing is that since most engineers don't use the algorithmic knowledge on a day-to-day basis (and are thus likely to be rusty), the time they'd need to spend reviewing/prepping to pass a whiteboard coding interview is greater than the time it would take to do the work sample.


I kind of agree and kind of don't. I have a super simple initial question I always ask in interviews: write me a function, in any language, that takes an array (or list) of integers and returns the sum. 90% of people that can do that are decent, and sadly I'd have to say I'm shocked by how many people struggle with this question, or the follow-up of "ok, interesting implementation, now what can go wrong at runtime?"
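For context, the expected answer really is a few lines. A minimal sketch in Python (purely illustrative; any language is fine), with the runtime follow-up noted in a comment:

    def sum_of(values):
        # Follow-up fodder: values could be None, contain non-numbers, or,
        # in languages with fixed-width integers, the total could overflow.
        total = 0
        for v in values:
            total += v
        return total

    print(sum_of([1, 2, 4, 8, 9]))  # 24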

It's a great 15 minute question that I find often correlates highly with the people that get offers.


>90% of people that can do that are decent

I can't imagine this being an effective qualifier for any but the most junior of devs.


Honestly, just try it. Have a significant preface about how it’s simple by design, and you are not trying to trick them, if you are hiring, say, a VP or a CTO; call it a warm-up if you want. But absolutely try.

I’ve had a majority of candidates with five years' experience as a data scientist not be able to load a CSV file using pd.read_csv(). The problem wasn’t that it didn’t work (the CSV intentionally had a minor quirk that demanded a non-default option), but the reaction to it not working, or, in a majority of cases, the inability to search (on Google) for “what function allows me to import a csv file into a table”, was incredibly telling. From there, you can have a conversation about what they did, and it often leads to a more honest portrait of their previous roles. Sometimes that led to offering them a more suitable position.
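To illustrate the kind of "minor quirk" that trips people up (the actual file isn't described here, so the semicolon delimiter and file name below are made up): a non-default separator silently loads everything into one mangled column unless you pass the right option:

    import pandas as pd

    # hypothetical test file whose values are separated by ';' instead of ','
    df = pd.read_csv("candidates_test.csv")           # loads as a single ugly column
    df = pd.read_csv("candidates_test.csv", sep=";")  # parses as intended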


>if you are hiring, say, a VP or a CTO

If I were hiring a VP of engineering or CTO, my only goal in administering such a basic test would be to see whether they are offended and prepare to politely leave the interview. Not kidding here.

Sure, CTO duties can run a broad spectrum, depending on company size, etc. It could essentially mean "dev lead with a small team", all the way up to public company CTO with a good command of financials, managing budgets, large multi-level teams, etc. But, at either end and anywhere along that continuum, I would want to skip to probing higher-order thinking around architecture, strategy, people-management, etc.

So, if I've brought in people who I believe are adept at skills like these, I'd expect them to be confused and perhaps even insulted when I pull out a fizzbuzzy integer-summation problem.


It's a highly effective and quick disqualifier.

100% of those that couldn't do it are not qualified candidates.

Some of those that could do it are good candidates.

When you bring in someone that has a nice looking CV and experience and recommendations and you ask them a very basic question and they can't respond at all, it is a safe disqualifier. It also doesn't take six hours.


>Some of those that could do it are good candidates.

Looks like you edited your comment. In any case, that's a non-statement, as some is quite different from nearly all (i.e. 90%).


>100% of those that couldn't do it are not qualified candidates.

OP stated nearly the inverse: that 90% of those who can do it are qualified candidates.


So you discard 100% of those who can’t do it, knowing that 90% of the rest are qualified.


That's a different argument that gets us back to the original premise I disputed, so you're now just begging the question with numbers.

That is, your numbers are only valid if I accept the premise that I'm rejecting. I don't believe that 90% of people who can answer such a basic question are qualified.


The reality is that some jobs are social and some are heads-down technical. More and more companies have no problem keeping an engineer in 20 hours of meetings per week, which makes people feel good but offers little in technical accomplishment. The rare exception is the shop where a team lead participates in many meetings while leading the junior developers. This is an effective multiplier; most meetings are not.


I don’t know what your work-sample looks like, but unless it’s time actually working with your team on a real project, it gave you no indication of whether or not someone was “capable of delivering”

You may think you filtered out people who failed your test, but you have no indication that that’s the case – you could just as easily have filtered out people who don’t like tests or people who don’t test well

I trust you’ve found some decent hires, but I’d suspect you passed on many as well

And, I don’t know about your work environment, but I’d prefer someone who can have a friendly conversation and do good work... I’m surprised you bothered giving your work-sample test to people who didn’t pass the earlier test


> but unless it’s time actually working with your team on a real project..

I've ranted about this before on HN, but that's very much illegal in my neck of the woods. As soon as someone does useful work for you, they're an employee. YMMV but you can't expect new candidates to do actual work until you've hired them.

There are plenty of programming assignments, exercises or questions you can have them do instead.


So have a basic filter, hire based on who meshes well with the team and have a trial period during which they actually get evaluated for competence. They’re technically an employee during the trial period, but with the understanding (and contractual agreement) that they will be evaluated at the end of the period and that the employment will be terminated if they do not meet the criteria.


Trial periods, too, are no longer allowed over here. They used to be permissible for up to 6 months (which was an insanely long time to essentially live with the knowledge that you could basically be sacked any day). But now they're gone. Firing people isn't terribly easy either; you might need to give two months' notice!

Of course, the net result is that self-employed contracting is on the rise. Especially the kind where people are self-employed only in name, but really work for a single employer. The only difference is that they send an invoice at the end of the month, and the employer can terminate the contract at any point (depending on the terms in the contract, of course).

Hiring is hard. Especially so when hiring the wrong person can cost you two months' wages.


> Trial periods, too, are no longer allowed over here.

Ouch. That does make it really hard, impossible even, to make sure you get a good fit.


I will ask you about your projects, then drill down till I find something technically interesting and have a discussion about it. I will concentrate on failures, not on successes.

It gives me a better signal than a solve-a-problem-on-a-whiteboard question.


What do you do when the person you selected ends up being a wrong fit? Do you try to rehabilitate (make them fit, if that's even possible in a small team), or do you cut your losses? What's worked out for you?


Do you think it's always worth the effort to get that lower false positive rate? Other professions seem pretty happy to do a trial/probation period after less elaborate interviews.


Was there a specific attack vector you commonly saw from candidates in interviews? It would seem pretty difficult to fake having built a real product or made significant open source contributions. In fact, it would probably be as difficult or more difficult than faking having written a book, and as far as I know book publishers aren't constantly trying to weed out would-be authors who have faked their previous publications.

I'm not saying you're wrong, especially since we hear about this kind of thing all the time, but I just don't see anything about coding that would seem to make it uniquely conducive to this sort of deception.


Interesting examples.

I have interviewed candidates who:
* Have impressive products that they "designed and implemented".
* Co-authored a book.
* Founded and ran a {programming-language} user group.

In all 3 cases, the candidates came across very well when talking at the surface level. Digging deeper, they (in the first example) overstated their contribution, and none of the 3 could write the simplest of code.

There is a reason why coding tests exist.


When a guy tells you his experience, and you call bullshit, it's hard to tell whether you doubt his sincerity about having the experience at all or whether the experience itself was not representative of your reality. Either way, you're telling us not how it is, but how you are, and that's simply not useful.

In "The Last Starfighter", the recruiter/hiring manager Centauri is constantly accused of using "the Excalibur test" with derision, and which he denies. While I've never seen it explicitly defined, I'd imagine an on-the-spot work sample test to be what it means. "Pull this sword out of this stone and you get the job."

What proponents of the Excalibur test fail to realize is that their methods often spoil the data. Treat me with enough derision and pressure off the bat and I guarantee I'll fail your test even though I do fine at other companies.


I won't apply at, or respond to recruiters representing, those companies that have a reputation for a long and/or torturous hiring process.

While I have no interest in being interrogated with high-stress puzzle solving, it is not the interview process itself that causes my disinterest; it is the understanding, from experience, that how a company treats its potential candidates is often reflective of its general culture.


I was referred to a company that eventually handed me a "6 hour" (likely much more to complete well enough to advance) take-home project a few weeks ago, and I still can't bring myself to start it. I read their Glassdoor reviews, and the number of folks reporting either being ghosted or getting a negative response minutes after submitting, after putting days of effort into the exercise, is disturbing. It seems like you lose by not playing the stupid game of asking questions while working on this inane project, so correctness and effort aren't even enough; you must also pretend to be a moron who can't figure things out on their own.

I also submitted one of those exercises for a different employer — that one took me about 6 hours, but of course they estimated it at "about two". I submitted it a week ago. They acknowledged receiving it immediately. And that was the last time I heard from this employer.

I’ll follow up tomorrow, but I’m kind of ticked as I was pretty happy with my code, and it’s not even something I can open source if they don’t like it. I also just got an offer elsewhere, and I bet mentioning that fact will expedite their process more than me putting effort into dancing around like a monkey.

Anyway, I’m done ranting, and I’m done doing take homes. I used to be a fan, but only because I preferred them to whiteboard algorithm puzzle questions.


> it’s not even something I can open source if they don’t like it

What? Of course you should... make sure you create a repo with the company name and people can create issues to collaborate on a better ToDo sample code.


You’ll get hit with a DMCA takedown notice if you do that. Most companies protect the interview questions they ask, and the solutions to them, quite severely.


There is no way to copyright someone else's work. Whoever produced the solution owns the copyright.


That is probably incorrect in the United States if it is "Work made for hire".

https://www.copyright.gov/circs/circ09.pdf


It is mostly correct. Unless you sign a very carefully worded IP assignment agreement to that effect, even work you get paid for remains under your own copyright. And frankly, anyone who would sign that kind of thing on an employment test exercise is... naive.


I doubt they paid for the interview work.


To be a work for hire, the work must be produced by an employee within the scope of their employment. Consequently, you cannot do work for hire if you have not been hired.


This is not true. If the problem statement itself is a unique creation, and the solution is publicly viewable and gives details on the question itself, the question writer may have the copyright even if you signed no NDA when solving it. Even if not, if a company wants to sue you to try to keep their interviewing IP private (as they see it), they can tie you up in court at a trivial cost to themselves but a life destroying cost to a lone developer.

Source: I once had a dragged out legal dispute with a company that used a GitHub DMCA takedown to tell me to remove a public repo in which I solved their takehome challenge (I had signed no NDA or otherwise any type of agreement of any kind prior to solving it).


If it's not an original question, find the original source (a source that is older than the company would be especially nice), and post a solution to the question posed by the original source. The world is a big place; a lot of questions are not original.


Heads up, tomorrow is a holiday in many places in the US, so just expect at least a day of delay.


More people are getting the day off but it's still less than half at 45 percent.


Downvote if you want but according to Bloomberg 4 hours ago that's the official stat.


Downvote facts. I don't care about the fake worthless internet points. It is just pathetic that people downvote actual facts without having an actual rebuttal or intellectual counterargument. Nope just be a coward and downvote because I don't like it.


Not sure why you got downvoted, but I’m also not sure what point you are trying to convey. Facts are facts like 1 + 1 = 2, but it’s unclear what that fact brings to the conversation.


The top companies with these leetcode tests probably don't care that good people are being rejected or are avoiding them because of the amount of preparation required. Middle-sized companies and startups doing leetcode tests are missing good people and probably can't afford the same number of false negatives as someone like Google with an endless supply of candidates. It seems like smaller companies often lazily copy these processes despite it likely costing them good engineers.

Having gone through the process recently, I avoided companies with puzzle-solving hiring procedures because I was confident that I could find well-paid, interesting work without jumping through those hoops. As you get more senior, there are more people you have met and worked with who have employment opportunities. It gets easier to find work by asking around and finding someone who will hire you based on recommendations or having worked together before.


This also applies when you're actually interviewing. I once went through three rounds and they wanted a fourth. No is such a powerful and unexpected word. That they're still talking to you means they're interested, but that they haven't offered you the job means they're not 100% sure. In this one scenario I said no, and they came back with an offer. But it was rejected.


Same for assessment centers. Huge red flag.


Did you counter or flatly reject and it ended there?


I read that as "rejected by the candidate." He said "no," they offered, he said "no" a second time.


Would you do a 4-hour take-home assignment? What if I gave you a $100 Amazon gift card?

There's a big difference between making someone jump through countless, one-sided hoops and asking for a little effort. The challenge IME is finding the balance.


Personally, if I can get to an onsite with another company after a recruiter chat and phone screen, I’m not even going to do a “2 hour” project. In my experience, actually spending the recommended amount of time on one of these take home projects is a losing proposition when there are people who will spend more time on it.


I'd rather spend those 4h with my kids, and the gift card would be somewhat insulting.


He’s implying the author fell victim to a cognitive bias.* He’s not doubting their honesty; he’s just being harsh about the lack of discipline when it comes to battling your own brain’s tendency to lie to you.

Avoiding biases is hard and painful work, which goes well with being resolute and non compromising. When this accidentally bleeds through to others, because you’re human and you forgot to throttle back, it can seem blunt and disrespectful. But it’s often not meant that way, at all.

I think it’s fair to give him benefit of the doubt, here. From what little I know, Thomas is a man with great respect for data. :)

* Selection bias? Whatever, it’s a bias. Apply the “pick one to sound smart” rule. :)


> Apply the “pick one to sound smart” rule. :)

Confirmation bias! :p


> "Treat me with enough derision and pressure off the bat and I guarantee I'll fail your test even though I do fine at other companies."

If only more companies realized that who they are able to hire is a function of how they hire. And it starts with the job / opportunity / role description.

As a side note, it seems to me all this friction in hiring (senior-level talent) makes a good case for developing and promoting from within. I realize it's not that simple. Nonetheless, taking a known quantity and grooming that person isn't rocket science either.


If management is rude and naive enough to be bad at hiring, they can be expected to be equally bad at promoting and retaining internal talent.

For example, there's the simple pattern of ignoring employee careers and expectations until, after the least patient ones have left, suddenly there's a lack of personnel, and the best remaining people are needed in their current dead-end role.


The mistake I see __consistently__ when I'm discussing an opportunity / relationship with someone who is "hiring" is that they don't realize I'm interviewing them as well. Fairly often, those who don't realize this are the same ones who will hint at / complain about retention and such. They want better people, but they have no idea how better people think / feel, and how their approach will never get them the type of people they need.


Once I interviewed at a large fashion company for a Python programmer position. I asked to see the IT offices (no way, even if I had required a second interview expressly to see them; the building layout suggested some sort of overcrowded basement), I asked about salary (matching my current one would have been a stretch), about benefits like a car (oh no, only for top management!), about training (studying their software after hours, on my own time), about lunch (internal mess hall only, and nobody can leave), and about their software engineering approach (overtime, and proud of it).

They unconsciously went out of their way to demonstrate a corporate culture of paranoid secrecy and to show that IT had very little consideration within the company. I tried hard to find a reason to like them, and when I found none and refused their offer I got an angry call from the manager: I wasn't "motivated". He thought I had wasted their time.


Wasn't that "test" the video game itself, at least for humanoids like Alex?


This is a sendup, right? Down to quoting The Last Starfighter?


Ironically, Patrick, Erin, and I did a company after Matasano that was based (sort of) on work-sample hiring (really, more based on CTFs) that was called Starfighter.


(that was my point...)


>So, err on the generous side with your scoring, and down-select with interviews. Iterate until your interviews almost feel pointless. We got there!

I'm having a hard time parsing this. The first sentence suggests that I should be permissive in the scoring part and rely on the interview to weed out candidates if we have too many, so the interview becomes important. The second sentence says that interviews will stop being important. Can you clarify?


Funny. I agreed with the author. It's relatively easy to tell genuine technical ability from those who can merely pretend. I only use a simple take-home problem when I truly can't tell which type they are. The really great part is that I can usually tell a faker from the resume. Maybe I should blog about this.


I want to highlight a third possibility that isn't discussed: people who can talk about code but can't deliver (or can't deliver well), yet really want to learn how to deliver.

It's fine if you don't want to hire that person, but I'm really annoyed that they often immediately get labeled a bullshit artist in a lot of these threads.

Considering how many shitty positions there are, it's honestly surprising we don't reflexively go "oh, you never learned to do X well because you haven't been with a good company that can teach you to do X well".


I feel the same way. Obviously I can't measure false negatives, because they don't get hired, but I can't remember having been duped before. My process is to ask them to describe their role in some projects on their resume and then keep asking deeper and deeper questions until one of us hits the limit of our knowledge. I only ever ask people questions about things they claim to have done. This probably creates some false negatives with really shy people, but it's easy to suss out the bullshitters.


If you're right, demand 2x your salary, because the industry's track record on this is abominable.


There's a company that turns up here on HN sometimes that claims to find the best engineers. They were looking for part-time remote screeners / evaluators. I figured I'd apply. I filled out some stuff online, and then their system wanted me to take a test on JavaScript or something - not my language. It was then that I knew they don't really know how to interview people, so I stopped the application process.


Good to hear you say take-home problems, as I believe they're a much more realistic assessment of what the job actually entails than whiteboarding is.

Being a good dev is about being thoughtful in assessing a problem and designing/implementing a solution. This, versus standing up in front of a group of people and hastily trying to devise a solution to a random challenge.


Off topic, but:

> DSA/ECDSA is terribly easy to mess up

It’s kinda unfortunate that most crypto seems to be this way…
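
For the curious, here's a minimal sketch of the classic way it gets messed up: reusing the per-signature nonce. The numbers below are toy DSA parameters, purely illustrative and insecure; real DSA/ECDSA fails to exactly the same algebra.

  # Toy DSA parameters, deliberately tiny, just to show the algebra.
  p, q = 607, 101                # q divides p - 1
  g = pow(2, (p - 1) // q, p)    # generator of the order-q subgroup

  def sign(x, z, k):
      """DSA signature of message hash z under private key x, with nonce k."""
      r = pow(g, k, p) % q
      s = pow(k, -1, q) * (z + x * r) % q
      return r, s

  x = 57                         # private key
  k = 42                         # the blunder: one nonce for two messages
  z1, z2 = 17, 83                # stand-ins for two message hashes

  r1, s1 = sign(x, z1, k)
  r2, s2 = sign(x, z2, k)
  assert r1 == r2                # same nonce => same r, visible to an attacker

  # From two signatures sharing a nonce, anyone can solve for k, then for x.
  k_rec = (z1 - z2) * pow(s1 - s2, -1, q) % q
  x_rec = (s1 * k_rec - z1) * pow(r1, -1, q) % q
  assert (k_rec, x_rec) == (k, x)
  print("recovered private key:", x_rec)  # 57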


I've turned down roles at companies that took this approach to hiring. It raises a massive red flag in my mind about the organization's technical pedigree and the sorts of people I would be working with.


Sounds like the problem lies within your organisation. My guess is that you lack someone who can actually evaluate a candidate properly based on conversation and need to resort to other means.


How did you go about creating your work sample test?


It's a process similar to creating a minimum failing case for a regression: take a tightly scoped problem which you'd actually, reasonably, assign to (say) an intermediate engineer at your company. File off the serial numbers if you need to, then start reducing away all of the unnecessary scope from the production environment until you get to just the core bit that can reasonably be explored in $TIME_BUDGET.

Package that up in a self-contained Vagrant image / Docker image / whatever with all of the skeleton required to run a full answer to completion. Then, take out the full answer. Consider leaving a reduced test suite so that candidates can see if they're making progress in a positive direction.
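
To make that concrete, the skeleton you leave behind might be nothing more than a stub plus a couple of sanity tests, far smaller than whatever suite you grade with. The function name and cases below are hypothetical, not from any real challenge.

  # Hypothetical candidate-facing skeleton: a stub to implement plus two
  # sanity tests. Running it as-is fails with NotImplementedError by design.
  def rate_limit(events, window_seconds, max_events):
      """Return the event timestamps that are allowed through the limiter."""
      raise NotImplementedError("candidate implements this")

  def test_allows_sparse_traffic():
      assert rate_limit([0, 10, 20], window_seconds=5, max_events=1) == [0, 10, 20]

  def test_drops_burst_beyond_limit():
      assert rate_limit([0, 1, 2, 3], window_seconds=5, max_events=2) == [0, 1]

  if __name__ == "__main__":
      test_allows_sparse_traffic()
      test_drops_burst_beyond_limit()
      print("sanity tests passed")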

Now write the rubric by which you'll assess candidate answers. Generally, you'll want to pick ~5 areas which are important for you and write prose describing what 0 points through 4 or 5 points are worth in each of those areas. Then, determine what a passing score is, perhaps by calibrating through running it with existing engineers at the company. And then (this is the brutally difficult part): convince your organization that the rubric now makes hiring decisions.
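
As a sketch of what that looks like mechanically (the area names, point scale, and passing bar here are invented for illustration, not anyone's actual rubric):

  # Hypothetical rubric: ~5 areas, each scored 0-4 against written descriptions.
  RUBRIC = {
      "correctness":   "0 = doesn't run ... 4 = handles the edge cases",
      "code clarity":  "0 = unreadable ... 4 = a reviewer can follow it cold",
      "testing":       "0 = none ... 4 = meaningful coverage of the tricky bits",
      "design":        "0 = one giant function ... 4 = sensible decomposition",
      "communication": "0 = no write-up ... 4 = tradeoffs clearly explained",
  }
  PASSING_TOTAL = 14  # calibrate by running existing engineers through the exercise

  def decide(scores):
      """scores: area -> integer 0-4. The rubric, not the interviewer, decides."""
      assert set(scores) == set(RUBRIC), "score every area; no skipping"
      total = sum(scores.values())
      return ("advance" if total >= PASSING_TOTAL else "decline"), total

  print(decide({"correctness": 4, "code clarity": 3, "testing": 2,
                "design": 3, "communication": 3}))  # ('advance', 15)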

Then, create a way for people to submit these into your hiring process. This might be as simple as "create a private gist of this file and email me the link," or something with material tooling developed behind it.

(This answer may or may not be exactly the same as Thomas' answer, but we were co-founders at a company which was quite related to this problem.)


And then run several existing hires through it to validate.

It’s amazing how often hiring processes are built and not tested.


Wouldn't you also need to check that the person wasn't an asshole and could show up on time? Or is this only the technical part of the interview?


"if another engineer is doing the interviewing,..."


I would advise reading "The Software Craftsman" by Sandro Mancuso. It contains a chapter about hiring "craftsmen" (the best kind of developers).

And many things from the article check out in that regard.


I'm rather good at interviewing, and a few times I got myself into projects that were over my head.

But, well, I don't want to fail, so I don't get myself into such projects anymore.


This is the only correct answer. Whiteboarding is to coding what Foosball is to soccer.

In my last in-person interview they asked me a bunch of trivia that was a simple Google search away - so of course, why would I bother memorizing those things? I got a Dear John letter back saying I wasn't smart enough or some such, and it's like "oh great, being called an idiot by an idiot". And that's most of my interviews, if I can even get past headhunters who don't bother to do their homework. Do you know "go long"? How would you solve this "cash-ay" problem? Yuck.

Sometimes I just wish people had some common sense and were reasonable and could think. But by and large people just do not give a crap about being good at their job because, by and large, people are just lazy and suck. Call me a pessimist, but man, are there some stinkers out there.


  Whiteboarding is to coding what Foosball is to soccer
My preferred analogy is that whiteboarding is to coding as rotisserie baseball (playing GM) is to playing baseball.


That reminds me of the infamous "You, Jeepies?" programmer interview.

https://thedailywtf.com/articles/You,-Jeepies


Agreed. In my experience, senior engineers are even better at glibly skirting around what their specific contributions at a company actually were, or what their true technical skillset is. They need to be tested.

The fact that, a lot of the time, senior engineers will get pulled in to interview from other companies because someone knows someone and has worked with them for a long time only exacerbates this problem.


> because someone knows someone and has worked with them for a long time only

And this is the #1 thing that gets a near auto-hire from me. I'm not saying don't interview them, but most interviews don't really tell you what it's going to be like working with someone. On the other hand, WORKING with someone (for a long time) 100% tells you what it's like working with someone.

If someone has worked a long time with someone else and is willing and wanting to do so again, then hire them.

My experience is, most interviews suck at figuring out how someone is going to do on the job. Having worked with someone a long time and wanting to work with them again should weigh heavier than anything else.


I can't tell if you're saying "look at their resume and see if they've successfully worked on teams for extended periods of time" (don't do this; it's a debacle of a signal) or "hire them if you yourself have worked with them before" (this works fine but network hiring can lead to brittle teams that fall apart as soon as 1-2 people move on).


I read it as someone already with the company has prior work with the candidate and wants to work with them again.


Correct. The best way to know what someone is going to be like to work with, and how good they are, is to work with them. An interview isn't that helpful.


> WORKING with someone (for a long time) 100% tells you what it's like working with someone

Not sure how that's not clear...


Someone I've worked with before and whose work quality I respect may get an "auto-hire" from me, but that doesn't necessarily translate to an "auto-hire" from the team that is going to be working with them. I'm fine introducing good candidates I know into a recruiting process, but not forcing them through just because I know them. In that sense, I want them to be able to demonstrate their abilities in an interview to the rest of the team as well.


And yeah, I have strong opinions on this - I've been on both sides of the table for a long time, and don't think the average interview is a good indicator of how someone will do in the job.

I wrote a book about this :) https://www.amazon.com/dp/B07FJ6N8P1

