
A lot of this post seems pretty reasonable. But:

In my experience, it’s fairly easy to judge technical skill. A friendly conversation about technical interests and recent projects can often be enough.

Bullshit. Sounding credible in technical interviews is a skill, not the same skill as actually being a good programmer, and might even (statistically, in the large) be close to orthogonal to it.

We found this out the hard way. At Matasano, we started our work-sample hiring process[1] as a way of filtering out the smooth-talkers. Before work-sample tests, we'd spent loads of time on carefully designed interview questions; interview design was something close to a hobby for some of us.

Of course, it was only after we started doing work-sample challenges that we discovered that not only were a lot of excellent-seeming candidates actually not capable of delivering, but an even greater fraction of the candidates our "friendly conversations" were selecting out were in fact perfectly capable. It was a bad deal all around.

Whatever you do, don't fast-path "senior" developers. Everyone should run the same process for the same job. Not only do you risk hiring people who won't work out, but you're also depriving yourself of the most important data you need to iterate on your hiring process.

[1]: https://sockpuppet.org/blog/2015/03/06/the-hiring-post/




I disagree. If you can’t tell the difference between a smooth talker and strong technical competence you are probably interested in the wrong qualities. I have interviewed enough now to see why some companies cannot figure it out. Ask yourself if you really actually want a senior or a strong junior.

It comes down to interest. A good senior got that way because they like solving challenging problems. They are not interested in trendy framework bullshit. In other words talk about the problems and potential solutions. Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

When I hear companies try to sell me with frameworks and process I know they are blowing smoke. At the very least they are boring, and at worst you will be working with incompetent people who are as self-deluded as the company. I agree that filtering candidates is a good idea though.

The reason why some companies cannot figure this out is because they don’t value the problems at hand. They need bodies to put fingers on keyboards and are willing to pay more for people who don’t completely suck. Experience and competence are not the same as excellence but considering the candidate pool I can see why companies compromise on quality.


I hear what you're saying, but I think you're maybe creating a straw man argument here. A smooth talking engineer isn't one who sounds like a used car salesperson or is just hyped about the latest trendy framework.

There are people who can talk through challenging problems at their former companies and how the problems were solved. They can tell you everything you'd want to hear because it's true. Except…they didn't implement it. Maybe they are best friends with the person who did and understand in detail about the tradeoffs and the neat hacks and the insights learned along the way, but couldn't build it themselves.

Those are the "smooth talkers" of the engineering world. Those are the people you can't catch just through a verbal interview.

On a related note:

> If you can’t tell the difference between a smooth talker and strong technical competence you are probably interested in the wrong qualities. I have interviewed enough now to see why some companies cannot figure it out. Ask yourself if you really actually want a senior or a strong junior.

My dude.

Look at who you're replying to.


> Those are the "smooth talkers" of the engineering world. Those are the people you can't catch just through a verbal interview.

I agree with this. I was a hiring manager, and there are those that can really talk technical, in detail. You really think they know what they are doing, how to solve complex problems, how to come up with solutions. You put a keyboard in front of them (or pencil and paper), and they go "uhh, errr, ummm." and fail miserably.

I think until you have interviewed a LOT of people, it can be hard to quickly spot this. Some people are masters at telling you how someone else solved the problem as if they solved it, but they can not solve it themselves.


and don’t forget the reverse problem mentioned originally: false negatives.

you can have someone who is a whiz at practical and specific solutions, who thinks critically and analytically and just gets an enormous amount done WELL. And empowers those around them to boot!

they have the reverse problem to speaking about other people's work as their own: instead, they speak of their own work as teamwork.

this affects many great people. women and POC in particular are likely to do this because they have been socialized to not speak too highly of themselves. "model minority" etc.

if you as an interviewer are already skeptical of what someone says, you will increase false negatives with people who you are asking to verbally "prove" their work and yet have cultural memories of being penalized for "bragging". they'll describe a solution and downplay how challenging or hard it was, because women aren't liked when they're the smartest person in the room, etc.

an interview process should seek to understand many skills: practical implementation, execution, problem solving, design, high-level thinking, and communication.

a varied process that focuses on a few specific skills, one at a time, is likely to convey the most accurate signal.


The problem with hiring is that a false positive is much more damaging than a false negative. Getting the group of people together to vet a candidate is expensive; recruiters are expensive; for the candidates, taking the time off is typically pulling from a very limited bucket of just a few weeks every year; flying people in to interview is expensive; and ultimately, to go through all that and hire someone bad makes you go through the whole process again. If it takes you a few months to figure out it's not going to work, it's unlikely anyone desirable from your original candidate pool is still available.

False negatives are expected, and honestly probably good overall and in aggregate, because it decreases the odds of a false positive. One of my first bosses that involved me in the hiring process told me one day that the point of interviewing is not to find reasons to say yes, it's to find a reason to say no.


The "find a reason to say no" can be very damaging as well if taken to the extreme.

I've seen people that were entirely qualified for the position be rejected at the company I work for because they made some totally understandable mistake - I'm talking about people that took the time off to take a 5-hour on-site coding assignment, and made a mistake but would have had a passing (and possibly good) grade on an academic evaluation.

And now we have 5 open positions and nobody hired for them.


> I think until you have interviewed a LOT of people, it can be hard to quickly spot this.

What is your threshold for "a lot"? I have given a few dozen interviews. I always ask at least a couple background questions. The number who even have a polished delivery for that part at all are a minority, and the couple that tried to bullshit me were painfully transparent. Maybe I just haven't done enough yet.


These problems are very simple to solve. Get them to go into detail. Get them to explain their thought process while looking to solve the issue.

If you feel that they're talking about a problem someone else solved, ask them that directly. (Did you work with others? etc) If they're lacking on the technical details either it's been a long time ago or they didn't do it.


> explain their thought process while looking to solve the issue

I would assume they and their colleagues discussed alternatives together, collaboratively, in for example a Slack chat — so an interviewee can give you good replies about the thought process and the alternative solutions that were considered and discarded. Or maybe the interviewee him/herself came up with ideas that his/her colleagues realized weren't going to work, and explained why for him/her. Then s/he might be really good at describing the thought process.


If they understand their friend's work deeply, doesn't that imply they've done something comparable themselves?


No, let me give a specific example. Imagine you're interviewing a candidate and they're talking through how to design an analytics service. They begin talking about e.g. database architecture, and how this type of data is most appropriate for a star schema. They start talking about the tradeoffs of row versus column orientation. They mention they'll need to do indexing for performance and talk about the index space versus query speed tradeoff. They say they'll do joins on the x and y tables.

Basically, they volunteer technical challenges they're aware of while simultaneously telling you what the high level solution is. But then you put a terminal in front of them and ask them to set up Postgres in a star schema with some dummy data, and then to write a query joining the two tables they were talking about before. Despite Postgres being on their resume, they'll completely flounder and not even know they need semicolons to terminate commands. Their joins won't just be wildly inefficient, they'll be syntactically incorrect and refuse to run. They won't be able to create, insert, select, truncate, drop, etc. They don't know how to create an index and can't mention any of the options for indexing, let alone the default provided by Postgres.
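For concreteness, the level of basic fluency being tested here is small. Below is a minimal sketch of the two-table star-schema join the candidate claims they could write, using Python's built-in sqlite3 as a stand-in for Postgres; the table and column names are hypothetical:

```python
import sqlite3

# Minimal star schema: one fact table referencing one dimension table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_user (
        user_id INTEGER PRIMARY KEY,
        country TEXT
    );
    CREATE TABLE fact_pageview (
        view_id INTEGER PRIMARY KEY,
        user_id INTEGER REFERENCES dim_user(user_id),
        duration_ms INTEGER
    );
    -- The index/query-speed tradeoff the candidate mentioned.
    CREATE INDEX idx_fact_user ON fact_pageview(user_id);
""")
conn.executemany("INSERT INTO dim_user VALUES (?, ?)",
                 [(1, "US"), (2, "DE")])
conn.executemany("INSERT INTO fact_pageview VALUES (?, ?, ?)",
                 [(1, 1, 1200), (2, 1, 300), (3, 2, 900)])

# The join they said they'd do on the two tables.
rows = conn.execute("""
    SELECT u.country, SUM(f.duration_ms)
    FROM fact_pageview f
    JOIN dim_user u ON u.user_id = f.user_id
    GROUP BY u.country
    ORDER BY u.country
""").fetchall()
print(rows)  # [('DE', 900), ('US', 1500)]
```

Someone who has actually administered a database should be able to produce something of this shape, even with syntax slips along the way.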

Keep in mind this example is just meant to be illustrative. Thinking through how to fix the scenario might not generalize to all the ways this can manifest. The kernel of how this arises is a person like so:

1. They read a lot about technical solutions at a high level. They can follow that if you have problem A then you need, roughly, solution B.

2. They have no contextual flexibility or practical foundation for understanding their solutions. They might have read Designing Data Intensive Applications, but they can't actually code and have never administered a database. To the extent they understood the book, they only internalized low hanging fruit.

3. They are charismatic, or at least comfortable talking about technical topics. They will try to lead the conversation as much as possible, which is where you see them volunteering technical challenges and then offering solutions. But if you force them to answer heavy technical questions which drill deep into a specific area, they'll probably try to zoom back out.


I may be mischaracterizing your point, but there have been many times that I've sat down at a terminal and not been able to remember how I did something a few months ago. On the job the important thing is knowing the high level goal and the fundamentals of what you're doing. The syntax can be easily googled.

edit: yeah upon re-reading you’re talking about completely obvious lack of practical experience... fair point


I have come across individuals like the ones you describe multiple times. They are very good at talking in detail about a tech subject: charismatic, defensive, and argumentative all day long over their solution, even though it has lots of holes. But when it comes to actually implementing a task, they usually fall short. They will quickly grab an existing/similar solution from e.g. GitHub, spend many hours trying to understand it, then copy-paste it and present it as their own work (a half-baked solution).

Even though the whole thing could be very simple. Another thing is, they will come up with reasons that the issues with the user story/task's code/solution are due to the environment or some other cause, like tools, frameworks, scalability, and "bs".


Wow, I feel like I'm the sort of person this comment is calling out. What advice would you give me so I can be the real McCoy? The only solution I can think of is to keep writing as much code as I can, so I can get real experience instead of just hot air.


Turn off your internet when doing work. Buy a stack of Postgres books. Memorize them. Never, ever, ever use Stack Overflow. EVER. Don't use Google. Memorize everything.

That's what that asshole is looking for. :)

Though to be fair, if you come up with that strategy and can't do -anything- at a sql console, I'm going to ask how you normally interface with the database, because that's like a Linux expert not knowing how to use tar or ls or something.


Most humans use a GUI when they can (with their connections stored, autocomplete, and a few other niceties). Of course, if you only ever use MySQL or Postgres through the command line, congrats: you know how to do basic connections. (I assume using -h makes you an imposter?)

As for the Linux/tar piece, I've used Linux on desktop for a few years(both Ubuntu & Fedora) & have used Suse and CentOS for servers for much longer.

I can tell you tar means tape archive. I can tell you I mostly use it with gzip for compression. I still google/reverse-terminal-search what flags to use with it, both when archiving and unarchiving. I could probably remember some flags if I spent enough time thinking - why would my brain waste that much effort though?

I don't work as a backup administrator. I have better things to worry about knowing/having present at the forefront of my (admittedly human sized) memory.


> That's what that asshole is looking for. :)

Do you suppose you could have made this point without calling me a name?


I don't really intend to call out anyone, so please don't feel that way. Keep in mind what I'm trying to construe is someone who has a lot of broad (not deep) book knowledge, but who can't do even basic practical implementation.

If you can explain how to design a system and you can do it, but don't know the exact commands off the top of your head, my comment isn't describing you. I don't expect people to e.g. know awk like the back of their hand, or to write perfectly compiling code on their first try.

But even if you don't have perfect recall of the commands, it should be pretty clear whether or not you've ever opened an editor and done basic implementation. If the GUI is your thing that's fine. But your knowledge must have some practical foundation which demonstrates you can actually walk the walk.


Writing more code is, indeed, the solution. This may feel challenging or even impossible because it takes away time from the research of actually understanding why everything works. I.e., due to time constraints, you may need to implement things more often without understanding how they work. This might make you worse at talking about solutions, but better at actually providing them.


> If they understand their friend's work deeply, doesn't that imply they've done something comparable themselves?

Imagine you're interviewing a candidate and they're talking through how to design an analytics service. They begin talking about e.g. database architecture, and how this type of data is most appropriate for a star schema. They start talking about the tradeoffs of row versus column orientation. They mention they'll need to do indexing for performance and talk about the index space versus query speed tradeoff. They say they'll do joins on the x and y tables.

Is this demonstrating deep understanding though?


Well, do you think my comment demonstrates deep understanding of database design? I don't feel I have deep understanding of databases, but I can certainly talk to you about very basic things like indices and joins.

Basically it's like someone else said. They read a book and know a lot of answers, but they can't do the most basic implementation of a solution.


> Well, do you think my comment demonstrates deep understanding of database design?

No, it seems about on the same level as being able to paraphrase the abstract of a paper about the system. I would not take it as showing that someone has read past the first page. A high-level overview just isn't enough for that. You have to ask your own probing questions too. Limiting the conversation to the particular problems they bring up is essentially taking them at their word when they claim to be skilled. I've seen lots of occasions where trying to drill down for a bit more detail on some part of what they talked about consistently came up empty (without going anywhere near sitting down at a computer to write a fizzbuzz equivalent).


How do you suggest getting into designing and working with distributed systems without job experience? Maybe virtualizing a data center on a home server?


There is a huge gulf between being able to follow a thought path and being able to find it oneself. I couldn't do the same work in the same timescale as them, I wouldn't make the same decisions or spot the same pitfalls. I am simply not as good. But I spent a bunch of pub/work time discussing it and could project an aura of authority on the projects if I wished.

Look at it this way. I was able to understand all of the maths proofs taught at my degree, but I could not have come up with them myself.


Yeah, that's a great analogy for the core problem.

Imagine putting a math problem in front of someone and asking them to solve it. They correctly identify it as a system of linear equations. They volunteer that they would solve it using x algorithm which has a time complexity of y.

Then you ask them to actually solve it, and they can't even make the first movement towards doing so. They mentioned LU decomposition, but they can't even do Gaussian elimination on paper. They don't know what elementary row operations are. They can't obtain an augmented matrix or put it into (reduced) row echelon form. They don't know anything about linear independence or the rank of a matrix. You put an inconsistent system in front of them and they keep banging away at it, determined to find a solution...etc.
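To make the gap concrete: naming "Gaussian elimination" takes one sentence, but performing it is a short, checkable procedure. A rough sketch in Python (plain lists, partial pivoting, no handling of singular systems):

```python
def gauss_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        # Partial pivoting: bring up the row with the largest pivot.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Elementary row operations: eliminate entries below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the resulting row echelon form.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 3 and x + 3y = 5 has the solution x = 0.8, y = 1.4.
x = gauss_solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
print(x)  # approximately [0.8, 1.4]
```

Someone who only pattern-matched the problem can name the algorithm; someone with a practical foundation can write something like this on paper.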

That's what it's like interviewing one of these senior engineers. It's surreal - they confidently pattern match the problem using limited heuristics, and they toss away low hanging fruit to demonstrate knowledge. But when you ask them to do something practical and specific, they either refuse and zoom out into abstract-land again, or they hopelessly fail.


so for data scientists I guess you could ask them to hand calculate the variance and standard deviation for a sample and see how they do on that
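For what it's worth, the hand calculation in question is only a few lines. A sketch in Python using the sample (n-1) formula, cross-checked against the standard library:

```python
import math
import statistics

def sample_var_std(xs):
    """Two-pass sample variance (Bessel-corrected) and standard deviation."""
    n = len(xs)
    mean = sum(xs) / n
    # Divide by n - 1, not n, for the unbiased sample variance.
    var = sum((x - mean) ** 2 for x in xs) / (n - 1)
    return var, math.sqrt(var)

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
var, std = sample_var_std(data)
print(var, std)  # variance 32/7 ~ 4.571, stdev ~ 2.138

# Sanity check against the stdlib implementations.
assert math.isclose(var, statistics.variance(data))
assert math.isclose(std, statistics.stdev(data))
```

Whether you'd actually learn much from making a candidate do this by hand is another question, but it is the same "can you execute the thing you just described" test.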


Unless they were pair programming through that problem. They can't and won't. They won't have the hard lessons that it brought.


"Principal engineer"? Do you actually have an engineering degree? Have you formal education in engineering?

What is system engineering?

It's not just you; the entire IT industry is suffering from systemic curriculum vitae bloat. That's why the working conditions are so bad in the professional sense.


The subject of what constitutes an engineer or what entitles a professional to the "engineer" title has been discussed ad nauseum here on HN.

Spoiler: "engineer" as a title for someone who does computer programming and software development without a license is perfectly fine and acceptable in the USA. In Canada, however, it's probably not.


You say the USA as a whole, but IIRC there are a couple states that do require it to be able to use the title "Software Engineer".


Those states you mention do require it, but what I was getting at is that in general terms Americans don't blink when someone refers to an engineer without reference to licensing.


[flagged]


On what basis are you claiming software engineers are not engineers? What constitutes being an "engineer"?

For the record, the National Council of Examiners for Engineering and Surveying, which is the US body that regulates engineering licensure and "Professional Engineering", recognizes Software Engineering as a branch of the engineering disciplines:

https://ncees.org/engineering/pe/software/

"Professional (Licensed) Engineer" is not typically used in software development, but if you care about having this credential, you can become a credentialed Professional Engineer in Software Engineering.

Google defines engineering as "the branch of science and technology concerned with the design, building, and use of engines, machines, and structures". An engineer is "a person who designs, builds, or maintains engines, machines, or public works". Software engineers certainly fit that definition from my perspective, because software is a type of virtual or abstract machine.

Software engineering certainly qualitatively feels like other branches of engineering to practice, such as electrical engineering or computer (hardware) engineering. I've never designed a structure, but I imagine the principles are the same: understand the requirements of the customer or sponsor; devise a design that accomplishes those goals within the constraints of the medium you're working with, using principles of scientific reasoning to evaluate what is possible and whether a design will meet your needs.

Last but not least, software can be just as critical to human life and safety as the artifacts designed in other kinds of engineering. (But being critical to health or safety is not a prerequisite for something to be engineering. It's still engineering if you're building a rocket that only robots will fly on, after all)

There may be people whose work on computers does not constitute software engineering. Running some calculations in Excel is probably not engineering. But I think there are plenty of us who build large scale systems that need to operate with high availability under demanding requirements, the properties of which need to be painstakingly planned and analyzed and sometimes mathematically proven -- we who are trained as scientists and engineers, and who apply these principles in our work -- many of us consider our work to be engineering, and consider ourselves engineers. I certainly do.


An engineer gathers requirements through a formally defined process and writes a formal engineering specification which translates those requirements into actionable norms. For example: "server operating temperature shall be between -40 and +70 Celsius at 400 m above sea level, and the power supply shall operate at voltages between 110 and 240 Volts at either 50 or 60 Hz, with a peak load of 35 Amperes". Or: "the program shall support the following options", with a detailed specification of what those options do and how they will do it. Or: "the software shall respond within 25 ms of receiving the request on port 4096. The protocol used shall be TCP". RFC documents are often good examples of system engineering and architecture.

Simply knocking in a program as a code monkey or hacking haphazardly on a program until it works in some way which isn't formally defined isn't an application of scientific theories in computer science into a practical product, which is what defines engineering.


Most programmers I know aren't simple code monkeys; they solve actual problems based on the constraints of the problem domain. They model the data and devise algorithms to best process it. Sure, sometimes you haphazardly hack on a program to get it to do what you want, but a lot of the time you have to think critically about it, what you are trying to accomplish, and how to do it within the constraints you are given. That sounds like engineering to me, and since "engineer" isn't a regulated title where I am, I have no problem calling myself an engineer. I don't see how it's lying.

You have a very narrow definition of the term.


That's why? So if we merely fixed job titles we wouldn't have this problem of fraudsters?


It would be a good first step measure. There is lots to clean up in the IT industry.


Hate to break it to you, but I worked for a well-known civil engineering consultancy and some of our pitches were economical with the actuality about our engineers' experience :-)


> "Look at who you're replying to."

> "Principal engineer"

"Look at who you're replying to." meant "tptacek" or "tyre"? Or both - a general advice?


I do, in my home country it is not legal to sign off projects as engineer if not certified by the Engineering Order as such.

A certain level of quality is expected, not labeling oneself an engineer after a 1-month bootcamp.


I have no idea if there are rules in my country about calling yourself an "engineer", but you can only use the title 'ir' or 'ing', both of which mean ingenieur, which translates to engineer, if you've graduated from a technical university or high-level technical school (technical college? higher trade school? 'ing' is from a level lower than a university degree).


> What is system engineering?

https://www.incose.org/systems-engineering


I know what it is as I am an actual system engineer; the question was for our self-proclaimed "principal engineer" to see if he has a clue.


> Have you formal education in engineering?

Good point. In many countries getting a degree in engineering takes way, way more effort than in computer science.

Then it takes 10-15 years of work to be called "senior engineer".


And from family experience (UK), it's more a case of who you know, not what you know.


This is a good post. Having been in the UK startup environment, it was crazy how many companies got funding but didn't have a strong senior in their software department.

Because of this, their hiring process was aimed at hiring more "strong juniors", even though they didn't realize it. A strong senior is a person who completely changes how you are even approaching the problem, or someone who shows you problems you hadn't seen before. They often see tech in the context of the business as well.

As you said, usually, they can't be bothered about learning the new framework of the month or new backend language of the month, but, they have a set of battle proven tools to solve tech problems for businesses.

P.S: I do find it hard to distinguish between bullshit talkers and people who actually get stuff done in the interview process.


There are times when I think I'm coming across as the bullshitter. "Tell me of an accomplishment you're proud of" - I struggle with this one, but I've been giving this example for years.

Worked at a company which did nightly data imports. Things worked until 'companyx' became a client, and the imports were huge. They would take 18-20 hours. Then longer. Eventually they were touching the 24-hour mark - unacceptable. The client's data would be more than a day behind. Granted it was a moderate amount of data, but it shouldn't have taken that long.

I was 'new' there - only started a month before - and the rest of the team who'd put this together had been there a year or more. I reviewed what was in place, took a couple of days, and got it down to an hour. Then worked with the existing team and we got it down to under 30 minutes with some tweaking.

I do see some eyes rolling when I tell that, as I know it can sound terribly self-aggrandizing. However, I had a decade of experience at this point, and the rest of the team was just out of college; they'd never faced this problem before. I basically just took the data and imported in small chunks in to in-memory tables (to avoid hitting the disk), and copied those to disk every X rows, and dropped indexes until everything was done. It wasn't rocket science, but did take someone who had a deeper understanding of DB mechanics.
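The pattern described (chunked inserts, with index maintenance deferred until the end) looks roughly like this. The sketch below uses Python's built-in sqlite3 as a stand-in for the actual database, and the table and index names are hypothetical:

```python
import sqlite3

def bulk_import(conn, rows, batch_size=10_000):
    """Import rows in batches with the index dropped, rebuilding it once."""
    # Drop the index up front so each insert doesn't pay to maintain it.
    conn.execute("DROP INDEX IF EXISTS idx_orders_customer")
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= batch_size:
            conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", batch)
            conn.commit()  # flush once per batch, not once per row
            batch.clear()
    if batch:  # leftover partial batch
        conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", batch)
        conn.commit()
    # Rebuild the index once, after all the data is in place.
    conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, total REAL)")
bulk_import(conn, ((i, i % 100, 9.99) for i in range(25_000)),
            batch_size=10_000)
count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
print(count)  # 25000
```

None of it is rocket science, as the parent says; the win comes from knowing that per-row index updates and per-row commits are where the 18 hours were going.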

As I'm telling this, I always realize they have no way of verifying it, and essentially I'm just another bullshitter. The more believable I sound, the more likely it is that I'm either really good or just a really good bullshitter, and nearly every time, the person I'm talking to has no idea how to tell the difference. It's worse as you get older, because the younger folks just think you're waxing nostalgic about the 'good old days'.


It doesn't help that in our field the context for many of these past achievements is quickly lost and forgotten.

You can't just say "I did X in Y by using Z": you need to begin by explaining that once upon a time there used to be a thing called Y, and on that thing it used to be very hard to achieve X, but in those ancient days there was also a tool called Z, etc.


This could be an example of the start of a competency based interviewing question. These questions usually start similar to "tell me about a time when". You ask for an example of them displaying some trait you care about, and then dig in to the details of what happened, why they did what they did etc.

CBI is a fairly effective technique for general interviewing, because you uncover how people actually behave rather than how they like to think they behave. Most of the gold is in the follow up digging questions, which should separate the bullshit answer from a real one.


I think it's also important for seniors to look at new ideas on a regular basis. People who don't tend to overrate the "good old [insert language/framework here]". You should not stop improving yourself just because you are already able to get stuff done. (Example: Java introduced lambda functions in Java 8, and many Java seniors are still not using them.)


I think the difference a senior person can make is to see what things the new language/framework/paradigm still doesn't solve and think of ways to evaluate their risk to a particular project's requirements and how to mitigate that risk. Things like downtime for upgrades or migrations, handling dependencies on ecosystems of community developed libraries without quality and feature change controls, the ability to update the system when it's been in production for 3 years and see few changes have been made and the original developers have all moved on, etc.


> A strong senior is a person who completely changes how you are even approaching the problem or someone who shows you problems you hadn't seen before.

I think this is overstated. Disruption for the sake of it is often not that helpful in the context of the business (although, in fairness, you do go on to state they often see tech in that context).

I much prefer people who are delivery focused to those who are overly idealistic or want to change everything out of the gate: a good senior understands priorities and knows when to make a trade-off to live with a sub-optimal situation or solution in one area in order to deliver greater value elsewhere.


> I much prefer people who are delivery focused to those who are overly idealistic or want to change everything out of the gate

Do you have problems to solve or not? Often the problem isn't really a problem, except that your current 'solution' is making it so.

I've untied a lot of gordian knots in my career. It really is a thing.

And if you don't have big problems to solve, why do you want a senior developer then? Just hire a junior and keep on going as usual.

Delivery focus is good, till you realise that it's often just delivering status quo for years on end.


Great post


The higher you go the more compromises you have to make.

That said, nobody is talking about disruption, just wisdom.

It just so happens, sometimes that wisdom will tell you that shipping shit out the door in the name of delivery focus is going to cost you more than its worth in the long run. Calling them overly idealistic to justify your laziness just makes you look bad.

I'd say a truly good senior can tell the difference between a startup and a mature company and adapt accordingly. It takes one set of skills to ship shit out the door as quick as possible and an entirely different set of skills to come in and clean up the mess the kids left behind.

This is why you are selling. Not buying. Because nobody wants to clean up your delivery focused mess.


Or... nobody wants to pay to clean up the mess. I know people who will do it. I will do it, but I can't clean it up, deliver loads of shiny new features, and do it all while justifying every 30-minute block of time and being asked if xyz is 'really' necessary. Oh, and you also need to be able to answer some real questions about your own business process, because often your existing practices are just 3000 lines of crap in a function file. When you say "it's broken" and I ask "what's the correct behaviour?" and you don't answer... it'll never get 'fixed'.


I think this falls under the "why are you hiring a senior?" question.

Because hiring a senior to ignore the problem is a good way to waste a lot of cash.


Who's talking about shipping "shit"? You are overreading by a very wide margin.

I also don't appreciate being called lazy, and especially by somebody who knows nothing about my business or my team.

Please try to keep your tone civil in future.


I'm not overreading, you said you prefer shipping sub-optimal solutions. Sounds like it to me.

I didn't call you lazy, I said that shipping in the name of delivery focus was lazy, or at least your argument about idealism was.

Fact is, shipping quality is the far more optimal solution and always will be. Making the trade-off and adding technical debt is never a worthy trade long term. The only people who gain from it are you and your team. The rest of the company eventually grinds to a halt and begins taking more and more shortcuts around the code, which just reinforces everything in a vicious cycle.

You might find this useful: https://youtu.be/DngAZyWMGR0


> I'm not overreading, you said you prefer shipping sub optimal solutions. Sounds like to me.

I was talking about prioritisation. I am very wary of (or perhaps jaded by) people who want to fix everything all at once with no regard to the wider effects of doing so on the business.

I didn't mention anything about quality with respect to what we do choose to deliver, although I can assure you that user experience - of which quality is a key facet - is our utmost concern.

I thought my intent was clear, but sorry if not, and hence my comment about overreading. I wrote very little from which you (and you're not the only one) appear to have extrapolated quite a long way.


No. The difference between talking (very intelligently) and doing is huge, and a sports analogy might be illustrative. If you ask detailed questions about football to hire a NFL quarterback, your most intelligent responses will probably be from a coach. “How should you change the position of your right shoulder if you see that a fast edge rusher is approaching you from the left side and you have two open receivers?” A coach like Bill Belichick might have a fantastic, detailed answer demonstrating a thorough understanding of every aspect of football past and present, but he could never make the throw himself.


> hire a NFL quarterback

That analogy highlights what's irritating me about this post and this entire thread - there are 1,696 players in the NFL, each with an average salary of $1.9 million. When people talk about hiring "senior engineers", they behave as though they're auditioning an NFL quarterback - the 1 out of 100,000,000 people who can actually perform at that level. For a salary of, on average, about $100,000. When you start out - offering comparatively little but looking for Tom Brady - you'll pass over pretty much everybody, because the person you're looking for won't just not interview with you, they probably don't exist. After a few months, you'll relax your standards, and after a few more, you'll relax them even more and end up hiring a comparative retard like myself - somebody with only a mere 25 years of hands-on development experience and a bachelor's and master's degree in CS along with a couple of publications. But if I had presented in the first few rounds of interviews, back when you were looking for the guy who could derive the tortoise and the hare algorithm in 30 seconds in front of five people in a boardroom, you would have passed.


> The difference between talking (very intelligently) and doing is huge

To be fair, it is environment-dependent more than anything else. Forget competence, charisma, intelligence, and everything else about the candidate that could bias their selection, and instead look at the processes and code already in place that the new candidate is jumping into. Does the environment strongly favor original ideas and solutions, or does it dictate acceptance only within the narrowest of boundaries?

I have been on both ends of this as well. It is common in software for shops to define success in extraordinarily shallow terms, such as whether you are using spaces versus tabs and indenting properly. Another example I experienced last year is whether you should write a giant monster configuration or a small function that receives a single argument. The reason shops coat themselves in blankets of code style and configuration is that they typically don't trust their developers and instead strive for a normalized baseline. They are not looking for wonderful solutions, but rather task completion.

The lack of conformance isn't necessarily an indication of lower capability, but it is an indication of incompatibility. Competence and conformance are wildly out of sync when the candidate is misjudged relative to the work available. That is completely an assignment failure as opposed to a candidate failure. Having gone through this myself has taught me to ask very probing questions, as a candidate, during the interview. If, as a senior, I can determine I will not be a good fit, I will happily disqualify myself.

> A coach like Bill Belichick might have a fantastic, detailed answer demonstrating a thorough understanding of every aspect of football past and present, but he could never make the throw himself.

Would you really hire a coach to be your quarterback? Is that a thought you would really entertain? Even if that coach could do that job he/she would be more valuable doing other work. I would consider this a solid example of interviewer/assignment failure.


> Having gone through this myself it has taught me to ask very probing questions, as a candidate, during the interview.

Like what questions? I'm assuming the work you're looking for is more along the lines of "wonderful solutions" as opposed to "task completion" - is it as simple as asking, "How anal are you guys about code syntax and whatnot?" and, "How hard are your problems actually?"

Or is it more subtle than that?


Here are some I thought of off the top of my head. These are based upon things I actually encountered.

What if I were to provide a solution that executes much faster, requires less documentation, passes test automation, and is a quarter of the code but ignores the framework or standard code style?

The standard DOM methods perform thousands of times faster than other options for interacting with markup. I can prove this with numbers. Code like that is not popular. Will I be allowed to write unpopular but objectively superior code?

If I can reduce the application build from 5 minutes to 5 seconds will you let me rewrite the build from Java to Typescript?

A/B testing is a powerful way to determine preferential user behavior and a measured increase in conversion. Will I be allowed to write inward facing experiments to test developer behavior?

What if I provide a function as a solution that makes use of scope and nested functions but offers no support for inheritance?

Is it better to complete a task in 1 hour with original code plus tests or extend existing components with risk of regression and 4 days of effort?
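The question above about a closure-based solution with no inheritance can be made concrete. A hypothetical Python sketch of the kind of code meant, with all names invented for illustration:

```python
# A counter built from a closure rather than a class hierarchy:
# state lives in the enclosing scope, behavior is exposed as plain
# functions, and there is nothing to inherit from.
def make_counter(start=0):
    count = start

    def increment(step=1):
        nonlocal count  # rebind the enclosing scope's variable
        count += step
        return count

    def current():
        return count

    return increment, current

increment, current = make_counter()
increment()      # count becomes 1
increment(5)     # count becomes 6
print(current()) # 6
```

The design trade-off the question probes is exactly the one shown: the closure is shorter and encapsulates state tightly, but offers no extension point the way a class would.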


A couple of these questions would be at least yellow flags for me as an interviewer or hiring manager, as they indicate a strong bias for throwing away existing systems ("ignores the framework", "rewrite the build", "original code").

There's a great quote from Lou Montulli[1]:

> I laughed heartily as I got questions from one of my former employees about FTP code that he was rewriting. It had taken 3 years of tuning to get code that could read the 60 different types of FTP servers; those 5000 lines of code may have looked ugly, but at least they worked.

Those frameworks and existing components most likely have a lot of hard-won experience embedded in them, and I would be uncomfortable hiring someone who did not appear to understand or appreciate that.

See also: Chesterton's fence[2].

[1]: https://www.joelonsoftware.com/2000/05/26/20000526/

[2]: https://en.wikipedia.org/wiki/Wikipedia:Chesterton%27s_fence


That is why I would ask those questions... to excuse myself out of your organization. I have found it painful to be at organizations who repeatedly and intentionally make really bad decisions so that their developers deliberately don't have to solve problems (invented here syndrome). If that is what I were looking for I wouldn't have developed the skills that I have.

https://en.wikipedia.org/wiki/Invented_here


Perhaps it is better to leave the menial tasks to the machines. They're far better at it - "on file save, reformat code to this template."


Maybe this is unique to sports? Do other fields exist where a BB-esque expert is totally incapable of practicing? The greatest movie critics probably can't direct or write for crap, but they're not commenting on how the movie was produced, just its output


Could Albert Einstein have actually built a particle accelerator or an atomic bomb?


From what I have seen, most of the academic research principals in computer science are not actually practitioners anymore. They are more like coaches. The actual code is written by graduate students, and most of it isn't even very good because the code isn't the point in most cases.


Not unique. One could convincingly argue that this effect exists in many fields. Art (ie art history major vs artist). Music (ie violin player vs film score composer). Restaurant reviewing (ie critic vs chef). I'm sure there are many more.


You obviously haven't seen Mark Kermode's series on the BBC where he covers exactly this.


Why talk to someone so condescendingly? I mean, what percentage of people do you expect to HAVE seen this bit of obscure media you seem to hold as seminal?


If the parent makes the unjustified statement that all cinema reviewers know nothing about how film is made, why not?

And I was not being condescending BTW.


Parent made no such statement. Read it again:

> The greatest movie critics probably can't direct or write for crap

They may know a lot about how a film is made. It does not follow that they could sit in the director's chair, or write a decent screenplay. (And yet, they could still be great at critiquing films!)

This is a valid point. Just knowing how something is made doesn't necessarily mean you know how to make it. It's the difference between a designer and an implementer. Or architect and builder.


I have been amazed at how many "data science" interview questions can be answered with "look it up in the hash table" or "look it up in the literature".

That last one is a real problem-solving strategy: I can't tell you exactly how to balance a Red-Black tree, but I know what a Red-Black tree is and where to look up the algorithm for balancing it, and I think that's good enough.
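The "look it up in the hash table" answer really does cover a lot of interview ground. A hypothetical illustration in Python, using the classic two-sum question:

```python
def two_sum(nums, target):
    """Return indices of two numbers summing to target, or None.

    One pass with a hash table: for each value, check whether its
    complement has already been seen.
    """
    seen = {}  # value -> index
    for i, value in enumerate(nums):
        complement = target - value
        if complement in seen:
            return seen[complement], i
        seen[value] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # (0, 1)
```

The naive nested-loop version is O(n^2); the hash table makes it O(n), which is the whole trick most such questions are fishing for.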

One of the master skills for interviewing is "don't choke." Often good people will make mistakes under the pressure of interviewing and will drop out.

When it comes to a practical session there is the same issue that some good people will choke. Just the sample of who chokes in which environment when is different.

Issues like this turn up in all cases where people try to measure merit. For instance, it is known that some low-SES (socioeconomic status) people will choke on standardized tests like the SAT or IQ tests. It is also known that some high-SES people are as dumb as posts and will have that revealed by standardized tests which are harder to bribe or bullshit.

The hostility towards testing is explained by the combination of those two groups: primarily the Emperor doesn't want you to see they have no clothes, but people have sympathy for poor people.


> The reason why some companies cannot figure this out is because they don’t value the problems at hand. They need bodies to put fingers on keyboards

Exactly this.

Too many companies are availability-focused instead of ability-focused.

The reason? When your processes and engineering are weak, you need people available 24/7 to put out fires.

When I interview for a position, I'm interviewing them as much as they are me.

One of the biggest asymmetries in the whole hiring process is that of course every company will tell you they have the best development processes, frameworks and code to work on. They may even believe it, because they don't know better.

Then you take the job and are stuck fighting fires on a big ball of mud codebase that takes over your life.

If you want to stand out as a candidate, try to probe and really find out how good their 'culture' is. Interview them back.

One of the best questions for getting to the real answer is: "What are your expectations for availability?" If they expect availability from you after hours, then their stack is likely unstable, because they need availability to keep it running.

If you are lucky enough when you are senior and good at what you do, you don't have to put up with that.

I often tell them, about a third of the way through the interview: "I'm not an availability guy, I'm an ability guy." They're often surprised by the statement. And the discussion that follows usually tells me whether I want to work for them or not.


> Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

What? It takes a special craft to design simpler components. Most people don't even see it. This is where your senior skills will shine, when you make it work like a charm compared to the previous state.

If you don't see the friction or can't reduce it then please take the first exit out.


I think you are conflating simple and easy. They aren't the same. Simple suggests less code, fewer pieces, and a shorter path between code and solution. Easier suggests least effort and less to look at. Simple isn't easy.

The primary difference between simple and easy is decisions. A good senior will spend more energy on considerations for appropriate decisions than on the actual work.


One of the best engineering talks is about this notion that simple!=easy : https://www.infoq.com/presentations/Simple-Made-Easy

This is surprisingly often not understood, even by people I've shown the video. And I am not sure why. But I do think it's necessary in our field to start understanding this much more deeply, especially for senior engineers.


Good definition of 'Simple'. Simple and easy may not be the same but there are situations where they are similar.

For example: for a production deployment, I could either turn the entire office upside down. Fires across the departments. Broken applications. Rollbacks. Or I can automate all of our QA test cases and have a one-click deployment. No fuss. I'm sure after having this experience, someone will definitely say: well, that was easy.

Anyway, good discussion.


> I think you are conflating simple and easy.

I like to state it as complex vs. difficult.

Finding a bug in ten thousand lines of crappy VB code is 'difficult', but not fun.

Writing a space/time-efficient implementation of level set topology optimization is complex.

Senior developers usually love complex problems, but hate merely difficult problems.


> Are things in place to make the job easier? Seniors don’t need easier and this is a huge turnoff.

What? I feel like you have a very narrow-minded idea of what sr engineers want.


Well, unambitious seniors are a good bet on who not to hire. They’ll only work what you pay them for, those lazy bastards!


I think I've read in quite a few places that most great programmers are essentially lazy.


"We will encourage you to develop the three great virtues of a programmer: laziness, impatience, and hubris." -- Larry Wall

Laziness: The quality that makes you go to great effort to reduce overall energy expenditure. It makes you write labor-saving programs that other people will find useful and document what you wrote so you don't have to answer so many questions about it.

Impatience: The anger you feel when the computer is being lazy. This makes you write programs that don't just react to your needs, but actually anticipate them. Or at least pretend to.

Hubris: The quality that makes you write (and maintain) programs that other people won't want to say bad things about.

(That quote's originally from Programming Perl, 1st edition, in 1991; the explanations I think didn't show up until the 2nd edition in '96 or so...)


A special kind of lazy where they will spend inordinate amounts of effort not to do the same routine task again.


I differentiate between simple and easy. They aren't the same.


He does and he is correct.


I agree with your comments, but let me share my 2c on "trendy framework bs".

Most of the interviews I've done with candidates ended up in discussing technology trends and googling around for cool open source projects, libraries and so on. This, to me, is a good indicator - as long as you bring them up when discussing relevant problems, it means you thought about a problem and researched prior art to avoid re-inventing the wheel.

Also, we tend to end up discussing the pros and cons of any given technology and so on. This, in the end, is a key indicator that you're talking to a passionate developer. And passion usually makes someone good at programming.

When it comes to soft skills, in such a discussion you can usually also figure out someone's behaviour in most work scenarios (to me, having a good "discussion" mode means you're likely to be a fit for team work).


Reinventing the wheel is a cliche I have grown to detest. I have found that cliche is abused 9 times out of 10 to avoid the deep dive into a problem. On the other hand, approaching the problem again allows the developer to consider the precision of current solutions or consider a simpler solution. Good seniors aren’t afraid of making decisions.


I don't know how much experience you have as a professional developer, but I can barely remember the problems I solved last week, let alone any interesting ones I've solved in the past 10 years.


I have maybe half a dozen really interesting stories, and maybe a dozen or two interesting but only to the right crowd stories - from a career spanning a bit over 24 years. And a bunch of war stories about why we shouldn't do things that way, because of the problems we discovered trying that in 1998 or 2003... (And occasionally the "we should try that again because it didn't work back when our colocated production servers had 486's and 320Mb disks - but it'll probably work just fine as a mobile app now"...)


The question to ask is "why" they did something a particular way. If they start rattling off something that sounds like the marketing page of shiny new tech, then they are probably doing resume-driven development rather than actually solving the problem in the best way they can. But then some people seem to value that - I don't.


You recognize a good senior by their not being interested in new technologies and in things that make the job easier? That is a bit dubious.


To be honest, I don't think your questions were good enough to filter out the smooth talkers. I was in a senior position at a successful agency and was hired before they changed their interview style to match the eye-rolling "technical interview" that is now the gold standard. I repeatedly argued they were getting too many false negatives and only a certain kind of developer with this style, but they believed "if Google and Apple do it, it must be good".

Then I was put in a position to hire backend devs and, even though I couldn't entirely change the format of the interview, I came up with questions I believe could filter out the BS artists/not-yet-good-enough from the talent. And it worked.

Example questions that are truly hard to BS:

1. I want to build a new service that crawls the internet for used bikes and presents them for sale. Roughly sketch out the architecture you would use and how all the parts fit together.

- Why this question is good: it is impossible to successfully answer this question if you don't have experience building systems. Even if you try to BS it, your answer will come off shallow and break under any sort of probing. It also reveals how the dev's mind works when approaching problems.

2. Name your favorite tech stack. What is your favorite thing about it and what is your least favorite thing about it?

- Why this question is good: any dev knows that every tech decision comes with good things and terrible things. I love Python, but the GIL, circular imports, shitty deployment/package management, and the 2.x vs 3.x nonsense all suck. If you haven't been in the trenches, you can't answer this _specifically_... you can only answer it _broadly_. And it's very apparent right away to the interviewer.

I had about 4 - 5 of these questions. None of them required a single line of code to be written.
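For concreteness, here is a hypothetical Python sketch of one plausible shape a good verbal answer to the used-bike crawler question might take: a frontier queue feeding fetch, parse, and store stages. All names are made up for illustration, and the fetch/parse bodies are stand-ins; a real system would add politeness delays, dedup, retries, an HTTP client, and a proper job queue.

```python
from collections import deque

def fetch(url):
    # Stand-in for an HTTP GET of a listing page.
    return f"<html>bike listing at {url}</html>"

def parse(html):
    # Stand-in for extracting structured listings and outbound links.
    return [{"title": "used bike", "source": html[:20]}], []

def crawl(seed_urls, max_pages=10):
    frontier = deque(seed_urls)   # URLs waiting to be visited
    seen, listings = set(), []
    while frontier and len(seen) < max_pages:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        page = fetch(url)
        found, links = parse(page)
        listings.extend(found)    # would go to a datastore / search index
        frontier.extend(links)    # newly discovered pages
    return listings

print(len(crawl(["https://example.com/bikes"])))  # 1
```

The point of the interview question is the architecture (queue, fetcher, parser, store, and how they connect), which a candidate without real systems experience cannot improvise convincingly under probing.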


I gave an example of one of our erstwhile technical interview questions below. Batting around interview questions is a hacker sport and I've met dozens of developers who delighted in relaying and evangelizing their preferred questions. I'll admit, candidly, that "what's your favorite tech stack and what don't you like about it" --- a question you can literally just look up blog posts on, and a question I feel I could answer credibly for several languages/frameworks I don't even know --- is a uniquely bad one. But even if I felt it was a good one, I wouldn't trust it. Interview questions don't work.


"a question you can literally just look up blog posts on, and a question I feel I could answer credibly for several languages/frameworks I don't even know"

Really? You mean you could memorize a blog post for random languages, go into an interview and answer a question and not completely fall apart from follow up question 1, 2, n?...

You make it sound like I'm asking this question, getting a monologue response and just moving on. These questions facilitate a dialogue between interviewer and interviewee, which quickly reveals your deeper understanding of a subject.

Now, I think your above approach (lookup, memorize, pretend) works great for the kind of technical interview questions that I've been refuting this whole time.

Finally, I don't think my questions are perfect, nor do I think the interview process in general is that great. I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps, then executing. This is regrettably hard to do for most people and companies.


> I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps

How do you get people to do a trial run for a week or two?

Most people start looking for the next job when they are already doing their current one. So how can we ask them to do a trial run for a week or two?

Also, what about remuneration (what kind, if any, should be given for this period), and does this work for freshers and absolute newbies too?


> Really? You mean you could memorize a blog post for random languages, go into an interview and answer a question and not completely fall apart from follow up question 1, 2, n?

To me, this is the crux of why I agree with you, rather than the questions themselves. Anyone can rehearse an answer to any question, but a dialogue needs to be created in order to actually find out anything about a prospective hire's competencies.

To be useful, an interview needs to be a conversation, not a monologue.


>I think the best way to find talent is to do a trial run for 1 - 2 weeks and see how good they are at taking fuzzy problems and breaking them down into actionable steps, then executing.

Except even if you could get someone to do that, it doesn't filter for the right candidates. Quickly getting up to speed in a foreign domain is not the same skill as being able to build great systems/code/etc. in a domain you're familiar with. The day-to-day of a programmer after the first X months is the latter, not the former.


Aren't you two hiring for completely different positions? One is looking for a web systems developer and the other is looking for the kind of coder who will write exploits and smaller tools. In one position you expect the senior to have a general idea about a wide range of relevant technologies, because you expect the senior to choose between them occasionally (plus the hiring manager knows that stuff too). The second position does not require it at all.

Plenty of people can do both or can change from one to another. But it still seems to me that the interview focus should be different, as the two positions require somewhat different skills.


Does one require actually writing code and the other just require talking intelligently about code? If so: I concede.


Writing code is the trivial part of what I would expect from a senior - making correct decisions about architecture and talking intelligently about code are the less trivial parts that make the difference between senior and junior. Ultimately, when a senior can't write code, that can be found out relatively quickly. When he fails at decisions, it takes more time and causes more damage. When he can't talk about code, then he is suitable for some positions but not for many others (so it all depends on the position at hand).

I don't mind the interview having a part where code is written at all. Be it a dummy feature, breadth/depth-first search, fizzbuzz, or any other simple, reasonably sized assignment. But in the case of a senior as a partial decision maker without much direct supervision in the long term, you need those other capabilities too.
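The "reasonably sized assignment" class mentioned here, e.g. a breadth-first search, really is small. A generic Python sketch, not tied to any particular interview:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal order of an adjacency-list graph."""
    visited, order = {start}, []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)   # mark when enqueued, not when visited
                queue.append(neighbor)
    return order

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs(graph, "a"))  # ['a', 'b', 'c', 'd']
```

Depth-first search is the same sketch with a stack (or recursion) in place of the queue, which is why the pair makes a quick sanity check rather than a real differentiator for seniors.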


Ask them about the challenges which they've faced when using x, y framework.

Ask them why they prefer one framework to another.

Ask them how they view testing.

Walk through with them on a simple whiteboard problem and ask them where they would write test cases.

Watch for the amount of detail they give you. That will give you an indication of what kind of a developer and how deeply they go into problems.


It's even simpler: "tell me about a challenging bug you've vanquished, what it was and how you solved it." People in the trenches love to tell war stories. I do.


For me, it tells me a lot about how a person goes about solving a problem, how they're going to avoid it later (if they are), context about the conditions the person worked in, where their interest lies, and what motivates them.

If their answer involves "well I searched around stackoverflow a lot and asked there" that's a no go for me.


Does it have to crawl the whole internet or can it just scrape the top 5 or so most important sites?


Asking clarifying questions is a good interviewee practice. Well done.


Their response reminds me of the Monty Python sketch about the airspeed of an unladen swallow.


Please define what is "the internet" first ^^


Define if you mean "an internet" or "the Internet" before that :-)


I really wish more interviews asked these questions. They're not only good questions for the reasons you mention, but for the interviewee (at least for me) they tend to be pretty low-stress to answer. Those sort of questions are, after all, something that I might be asked to do or think about during my everyday duties at a new company.

I always sigh in relief when a company I'm interviewing with asks me something like that vs something like "implement x algorithm to solve y niche computer science problem you probably learned in college at some point".


Just to add some thoughts here: I've interviewed people who were Google L7+ (IC) a couple of times who weren't very good engineers, at least on the work-sample stuff.

I’ve found that the highest correlation to performance in senior engineers is raw algorithmic skill and willingness to say “I don’t know” when you don’t know.

This is not true of hiring devops or sre’s. For those positions, you want the gopher archetype. This is the person who loves hunting down weird behavior and doesn’t give up when debugging. Let’s be real, debugging is you versus the machine and you’re eventually going to win if you can just be honest and patient.

Hiring senior engineers is really hard, I wish everyone luck.


If you're asking questions related to "raw algorithmic skill" you're filtering for people who either: 1) have had a computer science education and happen to remember the algorithm at hand (this is also a function of recency, so senior engineers are less likely to remember any given algorithm), or 2) study algorithms so they can do well at job interviews.

Neither one is something you want to be selecting for. Some of the best engineers I've worked with haven't had a proper CS education. I've known extremely strong engineers with Neuroscience, Mathematics, Physics and Public Policy degrees. I've got a business degree.

Unless you're working in certain extremely hard (and extremely rare) areas you do _not_ need to filter for algorithmic skill. Most ML doesn't count. Neither does Data Science. In 99% of engineering jobs it's more important to be diligent, rigorous, and organized. (Of course, filtering for those is another issue altogether)


Spot on. The reasons are exactly right too; seniority and "ability to recall obscure minutia from college years" are inversely correlated, for obvious reasons.

Companies need to understand that not only are they mis-selecting, but they're broadcasting that they're doing so to all the candidates that go through that process.

Approaching candidates with textbook-style algo or data structure questions merely informs that they're going to be working with an educated but overall somewhat junior lot. That's not necessarily always a deal killer, but it's probably not the image that these interviews are hoping to project.

For well-qualified candidates not applying at an industry headliner like AppAmaGooBookSoft, the interview process quickly inverts itself, and it becomes more about the company selling the candidate on their offer than the candidate selling the company on their skillset. Tread carefully.


The only people that say that a CS education doesn't make you a better programmer are people without a CS education.


CS is not programming, just like 99% of musicians don't have a music theory degree.


It's probably a great analogy because the best musicians all know basic music theory, whether they learned it in school or on the bandstand. As for the advanced theory that they teach in graduate programs, it isn't even applicable to most genres of music.


> the best musicians all know basic music theory

Do you have any evidence for such a bold claim or is this just speculation?


They might know it instinctually but Funk brothers / James Jamerson (the bass player on a lot of Motown) didn't go to uni to study music.


Interesting analogy.

I bet there's scope to twist it beyond all sensible bounds, and compare the ability of the 99% to the 1%.

I suspect there are top-level classical, jazz, and session musicians who're the industry equivalent of 10x programmers. (And all the other stereotypes probably exist too; I bet there are occasional untrained but gifted musicians who can produce 10x output but who're amazingly difficult to collaborate with compared to degree-level, music-theory-trained musicians... And I bet there are "10 year" musicians with one year's experience repeated ten times over.)

The other interesting point there is that probably 99% (or more: five, perhaps six nines) of "programming" doesn't actually require that much hard-core CS theory. You can get paid well playing covers in bars with a good ear, without being able to read a single note from a chart, just by listening to the originals and copying them over and over in your bedroom. Same as you can make a decent living building basic CRUD websites/apps without having written your own compiler that can compile itself, or having defended a PhD that advances humanity's state-of-the-art understanding of something fundamental.


The only problem is that at that level of musicianship you lose the fun, and can end up with some very sterile music that's only of interest to other people who have degrees in music theory.

Btw, years ago I worked with a top session guitarist (top 10 hits) who, after an accident, taught himself to program from his hospital bed.


Good thing parent didn't say that then.


> If you're asking questions related to "raw algorithmic skill" you're filtering for people who either: 1) Have had a computer science education and happen to remember the algorithm at hand.

Probably true. But perhaps that could be accounted for in the assessment process. After all, graduates of Neuroscience, Maths, and Physics degrees have got to be some of the smartest people around.


Agree. A lot of coding is simply banging your head against the wall, search SO over and over, changing things around, until it does what you want.

Raw algo-quizzing skill isn't necessarily the same thing, though you'd think it was somewhat related, because when you're learning to code up "find longest continuous run" you also need to change things around for a bit.

The difference is that in real life there's never an end. The algo quiz eventually leaves you at some optimum, because it's quite a small thing.
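For reference, here's a minimal sketch of one common variant of that quiz question (my interpretation of "find longest continuous run": the length of the longest run of equal adjacent elements):

```python
def longest_run(xs):
    """Length of the longest run of equal adjacent elements in xs."""
    best = cur = 0
    prev = object()  # sentinel that compares unequal to everything in xs
    for x in xs:
        cur = cur + 1 if x == prev else 1  # extend the run or start a new one
        best = max(best, cur)
        prev = x
    return best

print(longest_run([1, 1, 2, 2, 2, 3]))  # -> 3
print(longest_run([]))                  # -> 0
```

Small enough that iterating toward the optimum really does terminate quickly, which is part of the point above.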


Coding is the easiest part. Understanding the actual problem and solving it is the hard part.

> A lot of coding is simply banging your head against the wall, search SO over and over, changing things around, until it does what you want.

It doesn't look like programming to me. Yes, sometimes we miss something, so our code doesn't do exactly what we want it to do, but when we realize it we just fix the code. This view of coding resembles an improved way to write Shakespeare with monkeys.


> A lot of coding is simply banging your head against the wall, search SO over and over, changing things around, until it does what you want.

I don't find myself in these situations nearly as often as I did back when I was a junior engineer. But damn, I'm sure I looked busier (and more stressed out) back then.


I was mostly through that phase of my career before any of those things were available. Toward the end of it, stack overflow had just launched. I mostly relied on printed books for help with languages and frameworks I was using.


the ability to prepare for an interview is likely correlated to the ability to do the required work, so that point is moot.


It's not correlated at all. I can ask you questions about implementing a b-tree and then give you a job to fix CSS. Which is the case in most jobs and job interviews.


if you were able to prepare for the btree stuff, the css stuff will be a joke to learn. that is my point.


Algorithmic ability has no correlation to the ability to write maintainable code, though. Most time at work is not spent demonstrating ability to regurgitate algorithms.


> Algorithmic ability has no correlation to the ability to write maintainable code, though.

This is not true in my experience. I usually see a strong correlation between algorithmic ability and writing maintainable code. At various organizations I have worked for, I have seen that the ones with strong algorithm skills also happen to be critical thinkers who put a lot of emphasis on simple, elegant, and robust design and code.

So I am very surprised to know that this correlation I observe may not be true in general. How did you come to this conclusion?


I don't have data but my impression is that there is an inverse correlation. My guess as to why is that people with ability to manage a lot of algorithmic complexity don't seem to suffer when code is complex so they see no need to simplify things.


I'm afraid I can also only offer anecdata. Mine is based on hiring and then working with dozens of SW engs since 2008 (my first tech lead role), plus our student Incubator and open source projects (dozens of more junior devs).

The correlation between emphasis on simple, elegant design and code and algorithmic chops is indeed uncanny.

And I'd add "clarity of articulation" to that -- being able to express your thoughts and the problem/solution structure clearly and succinctly is a great indicator as well. Huge overlap with both code maintainability and algo quality.


I haven't seen this ever. Algorithmic skills have never correlated with productivity or maintainable code in any of the 7 companies I've worked at.


No causation, I'd believe any day. But correlation? I'd want to see a statistically rigorous test done before I'd believe that. Correlations are everywhere, and pretty much all good traits are correlated with each other. Hell, even vocabulary size and reflex speed are correlated. And this is probably why even terrible methods for selection usually kind of work.


Unfortunately, often (and maybe more so because I live in Europe) it's also the other way round:

> raw algorithmic skill

It's been ages that I've been asked anything remotely algorithmic. My interviews are mostly about frameworks, how you fit in a team and whether you know / can be "agile".

Not even a Fizzbuzz, much less so quicksort or more special algorithms.

> and willingness to say “I don’t know”

That never got me anything in any interview/company. To be fair, I found a few smart and cool friends because of this, but they themselves don't look as if they've found a good job either.

Being hired (valued?) as a senior engineer is really hard.


It really depends on your domain. If you're into low level hacking and distributed systems, there is a lot more algorithmic work. There is demand for software that's cheap to scale and/or low latency. Some fields are bottlenecked by hardware (machine learning, realtime rendering, etc.) and so benefit from better software. Some production systems still need a large amount of optimization to satisfy economic and product constraints.

I don't think the number of jobs requiring fairly deep systems or algorithmic knowledge has gone down, but the ratio has.


I’ve had trouble finding those systems or algorithmic jobs too, like the grandparent, where some kind of engineering quality matters, be it performance or correctness or something. Everyone wants to hire a “full-stack engineer” to write application code that a junior dev could write, but they want someone senior anyway.


I once had a FizzBuzz question in an interview. The interviewer started, and I interrupted him: "We're not talking about FizzBuzz, are we?" He apologized afterwards, explaining that there had actually been an applicant some time ago who wrote 100 printfs.
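For anyone unfamiliar, FizzBuzz is the canonical trivial screening problem; a straightforward solution (as opposed to 100 printfs) fits in a few lines, e.g. in Python:

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:          # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

The point of the question was never difficulty; it was exactly to catch the 100-printfs answers.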


Why doesn't the gopher archetype also apply to development roles? Persistence pays off if you measure your own results. Knowing how to divide a problem is really important, even if you're great at algorithms.


While persistence is good, it has to be applied correctly. It can turn into the opposite problem, stubbornness: spending too much time on a small thing that isn't actually that relevant, because the person really wants to get it done.

More important than persistence, IMHO, is knowing when to be persistent and when not to, and those two qualities mentioned by OP seem quite related to it: "raw algorithmic skill" (to know whether something is optimizable or not) and willingness to say "I don't know" when you don't know (seek help, get the right person for the job, etc.).

Edit: I know because I was like that in the beginning; it was okay to learn e.g. micro optimizations when learning programming for fun at university, but it'd have been a big issue if I had not been able to correct it.


I can’t overemphasize how much persistence really paid off in past jobs I’ve had. Coworkers who had persistence were such a delight to work with because they didn’t give up the first time they got stuck. They did the nitty-gritty work of tailing/grepping logs, using a debugger, endless Googling, print statements, and anything else to find the root cause of a bug... or even to understand a legacy codebase.


It's probably such a delight in part because bright kids are frequently not persistent. They are used to things coming easily. If it doesn't come easily, many are quick to throw in the towel.

Someone who is both bright and persistent can move mountains. But it often has a big social downside. People don't like change. Being bright, persistent and also socially savvy enough to sidestep drama is practically a unicorn.


I've done that sort of work. Eventually you realize that nobody notices the guy who tracked down and killed that vexatious heisenbug in the legacy code base. You need to be working on the new, shiny, high-visibility projects or your career is going to stagnate.


Or change the culture.

Friday demo day (to sales and customer success and everyone else): "Hey everyone, know that thing customer X keeps complaining about that we've never been able to solve? It's been super tricky. Just wanted to announce that James here figured it out and it's fixed forever. James, you're a hero."


I guess what you need strongly depends on what you're hiring for.

Being good at algorithms only is great if you work in a very isolated area.


Saying “I don’t know” is also very important for devops people, but for slightly different reasons; we’re likely to have a little bit of knowledge about everything, and it’s important for us to be explicit about where our expertise dries up.

I say a lot of things in the form “I’m a little out of my depth in this subject, but my best understanding is that the behavior should be such-and-such; does that sound right to you?”


i dunno. the machine has beaten me more more times than i care to admit. some problems are hard.

i’m L7.


Honestly speaking, since it's an anonymous forum, what's the difference between L5 and L7? Like if I was asked the same question - the difference between L3 and L5 - I'd say that it's just experience with a few interesting known projects that's paid so much better, while the cognitive ability is the same.


Could you give a short explanation of what this L3/L5/L7 thing is?

I'm assuming you're not talking about spinal cord problems, which is the main search result.


I think [1] gives an explanation of it.

[1] https://levelsfyi.com/google-levels-salary/amp/

EDIT: found a better source


Hey, I'm from www.Levels.fyi — this is exactly what our site was built for. The link above is a copycat site; you can see the latest leveling info and compare with other companies on ours.


Bound to happen when you call yourself techslave ;)


There’s a reason google doesn’t hire into L7, it’s because their hiring process for algorithms is garbage and provides lots of false positives and negatives when you are looking for real skills.

If saying “I don’t know” and being good at algorithms were enough, you could hire straight into L7 easily, or promote to it within a year. Neither of these things happens.


It always amazes me that literally what you learn in HR 101 in an undergraduate management curriculum is controversial. The r^2 of a ton of different methods for predicting job performance is something large companies are highly incentivized to have studied by academics (and they do).

Spoiler alert: work-sample tests are the only practical and generally tolerated one to use (for software engineers) that is actually correlated with job performance.

@tptacek: how do you feel about people who have large amounts of open source work that is easily reviewed? This obviously doesn’t add new data to your company’s test but is definitely some kind of work sample, albeit not necessarily in a similar environment to the one being hired for.


But it’s not generally tolerated. You end up optimizing for sub-prime candidates because those are the only ones desperate enough to take 4-6 hours out of their free time for a company that hasn’t even bothered interviewing you, yet.

If they want to turn the onsite into one big work sample, by all means, that sounds very effective (and something I’ve seen work well). But in my experience, you’re going to deter qualified candidates by forcing them to do take-home assignments.


Again: this is true of "work sample" tests in their mainstream implementation in our industry (spend 6 hours jumping through a hoop for the privilege of running a standard, nondeterministic interview gauntlet), and those hiring processes are a scourge.

But there's a right way to do it: give work sample challenges and then, at least for the most part, end the technical qualification part of your process there. You spend 4-6 hours at home instead of spending 4-6 hours in front of a whiteboard doing dumb coding challenges.

If I were looking for a job right now, I'd probably refuse to interview anywhere that gave me take-home problems but couldn't promise that a strong result on those problems would take me all the way through technical qualification. But a company that could offer me take-home problems and conclusively make the technical part of the decision on whether to hire me based on those problems would be a strong prospect.


We subscribe to the work sample test as your best option for technical validation, but we also limit ourselves to a small window of time, on site. We pick something we recently worked on, distill it down to something we could knock out in 15 minutes, and give the candidate about 45 minutes to work through the distilled problem; then we spend about 15 minutes exploring their solution and how they would make it production-ready (as a conversation).

We then have another 45 minutes of designing a solution to something more complex we recently had to work on, again distilled down. It involves a whiteboard and boxes and arrows. The last 15 minutes is time for them to ask whatever they want of us.

We have two interviewers on the technical portion to level out false reads by either interviewer ("I thought this part was a poor answer, but the other interviewer has a different view of it").

I really like it so far. The part I'm battling with is whether the current coding part selects against Java developers. Part of the code we want right now requires a unit test with dependency injection to match an interface. So many of our Java candidates simply can't set up a running unit test. They are used to layers of framework already set up in the IDE and just clicking on stuff. They have full access to Google. Maybe it's good we are filtering out these candidates, but I'm not sure. Still thinking on it.


> The part I'm battling with is whether the current coding part selects against Java developers.

Yes, it does. And let me put it this way: I've used C# and Java at most of my jobs, those theoretically would be my comfort languages, yes?

I do not use those languages on interviews. I often just use C++ (!), or Rust (free unit tests!) if the company tools allow for it, or worst case I'd learn some Python basics. C#/Java are very awkward and boilerplatey in such a small time frame as 45 minutes.


Interesting. To clarify, you're saying that in these boilerplate-y languages, it's not fair to expect a unit test in a 45-minute time block with access to Google? I'm not a Java guy. In all the languages I've used, this would not be an issue. Again, I'm wrestling with it because I feel it should be easy, but like 80% of our candidates who choose Java struggle with it.


It sounds like this is a pick-your-own-language type test. I'd suggest scoping down to one, or at most, two, simple allowable options that your team is already pretty comfortable handling.

I've been on the reviewer side of a handful of "choose your own language"-style take-homes recently and found that it's really not good for the candidate if they actually end up using something that wouldn't have been the interview committee's first or second choice. There have been cases where choosing a less-trendy-but-still-totally-viable toolkit has effectively disqualified a candidate, with several committee members not even considering it necessary to look at the code. This is very unfair and lame but an unfortunate reality. I asked that the test be changed to constrain the options at least to a list that wouldn't be immediately disqualifying.

You could advise the candidate ahead of the interview something along the lines of "you'll be asked to write a code sample in either Ruby or Python -- you'll have full access to Google, but you may want to brush up on the basics of these languages if you haven't used them recently".

This does two things: first, it prevents the issue you have, where you're essentially not sure if you're correctly scoring the samples produced. Secondly, it really requires you to constrain the problem to things that someone who has barely used language X or Y can do within 45 minutes.


The theory with pick your own language is they should be able to feel fully comfortable (interviews are already stressful enough). If they picked Rust, Scala, or a lisp dialect, (or anything the interviewers are unfamiliar with), it can even be a better interview because we get more insight on how the candidate communicates and their ability to walk someone else through their solution. A potential other bonus is less biases leak through from an interviewer on "that is a strange way to do that in language X."


> The theory with pick your own language is they should be able to feel fully comfortable (interviews are already stressful enough).

Ah, but there's the rub. Candidates are trying to please the interview panel. If you don't provide guidance, the odds that they'll just use whatever they think the interviewers most prefer are just as good, if not better, than the odds that they'll actually use whatever makes them most comfortable.

You said yourself that since some candidates pick a language that you don't know well, you can't really tell if the failure of a large number of those candidates is reflective of a bad test or just a mismatched candidate pool. IMO, if you're going to stick to the "pick any language" thing, you should at least find out and ensure that any language the candidate picks will have a fair shot.

> it can even be a better interview because we get more insight on how the candidate communicates and their ability to walk someone else through their solution.

You can still get the candidate to communicate and explain his choices if you give an option: "either Ruby or Python" or "either JavaScript or Visual Basic", etc. The problem with having this happen in a language that the interviewers don't know reasonably well is that they are much more vulnerable to the smooth talker who can present incorrect information confidently, and they won't have enough background/anchoring in the subject matter to know the difference.

> A potential other bonus is less biases leak through from an interviewer on "that is a strange way to do that in language X."

I would say that if you're worried that interviewers will load in biases toward their preferred shortcuts etc in a specific language, that you should be equally worried that some good candidates are being excluded for choosing the "wrong" language in an any-language-goes test.

Above, you mentioned that there'd be a positive response if a candidate used "Rust, Scala, or a lisp dialect" -- these are all relatively trendy. What if the candidate used nim, Pony, or some other language that hasn't pulled in to the hype superstation yet? What if the candidate used a language that's not-so-trendy anymore, like Visual Basic, Cobol, or bash? What if the candidate used a programming language of their own design, and brought a copy of the compiler with them on a flash drive?

I'm asking because I've seen this in practice. Candidates for a devops position who chose to use bash to implement the very simple take-home task they were given were laughed off by several other members of the interview committee, despite being potentially high-value senior people -- they were at least senior enough that they're more comfortable performing sysadmin-style tasks in a shell, rather than using a massive CM framework or an awkward amalgamation of Python scripts running os.spawn.

It feels like this type of thing happens a lot, in the same sense that very often, "unlimited PTO" just means "guess whatever amount of PTO is acceptable around here and hope you get it right".


I guess it depends how far back you're starting from. I suspect most Java developers don't spend that much time creating projects from scratch (I do, but the work environments I'm used to suggest I'm an outlier).

I tend to give candidates a simple project already to go, with junit and hamcrest, possibly mockito already available, and ask them to go from there using a provided IDE (which I attempt to get set up as near as they prefer to work as I can). This generally works out fine. I certainly don't feel the boiler-platiness of the language gets in the way, mostly because the IDE is generally capable of doing most of the lifting with that respect anyway, but also because over the timescale of an interview question, we're generally only talking about a couple of classes at most.


It starts from scratch. Familiarity with one's tooling is important. Setting up a project seems like it should be part of the basics. Would it not be unfair to others who choose a different language if Java gets hand holding in terms of initial classes?

For 20% of Java candidates, they do it just fine. Heck, a few eschew the IDE and are fine working completely from the terminal (these tend to be particularly solid at coding). Still wrestling with the idea.

The folks that already work with us (who wrote Java in a former role) see no problem with setting up a project nor do the folks that we hired recently (seeing as they likely passed that technical interview). But that all could be bias.

Maybe you are right. Maybe the next candidate or five in Java will get a base project and we can see how it goes.


Do you give them "their" tooling? I can set up a project of the type you describe in about five seconds, because I have a template for it in my IDE. The best defaults for this that I've seen are provided by IntelliJ (do you provide this in interviews? It seems legally challenging to do) and would probably take me 5-10 minutes to navigate.

I think Java depends much more heavily on powerful tooling to do the heavy lifting, and my experience of using that tooling when I haven't had a chance to configure it in advance has been pretty miserable.


If they're coming in, I'll hand them a laptop with the project set up and ready to go, with IntelliJ running. If they're normally an Eclipse user I'll swap to Eclipse shortcuts and help them manage the IDE as we go.

If it's remote, I'll ask them to share their screen with me with a project set up and ready to go, having emailed them a copy of the interface we're going to implement about 30 minutes before the interview.


Fair enough, seems like you're giving them a fair shake.


It depends on context as well. Where I currently work, they wouldn't have a chance. The corporate firewall will prevent them talking to Nexus, for example. It just wouldn't be fair to expect them to navigate that sort of thing in an interview.

In general, if I'm asking them to code on my (or the company's) hardware, I'd start from an existing project, if only because I wouldn't expect them to be familiar with the installed tools (oh, you use maven? I'm a gradle user ... etc.) On their own laptop, I'd expect them to be more comfortable.

I'm doing a remote interview on Wednesday. That'll be on the candidates machine (because screen-sharing is easy, getting them inside the corporate network, not happening), and they've been told they'll need an IDE ready to go. I'll expect a project set up and ready to go before the call even starts.


What happens if they can't work from home? They're expected to take time off, AND prepare beforehand?

What round is this?


On your own hardware, I'd expect you to be able to have a blank project up in minutes, so yes, I expect somebody who's not travelling to our site to be able to take 5 minutes out of their busy day to create a blank project.


> Familiarity with one's tooling is important. Setting up a project seems like it should be part of the basics. Would it not be unfair to others who choose a different language if Java gets hand holding in terms of initial classes?

Also, how much of their day-to-day work is going to be setting up new projects? I sometimes feel a better test is to be thrown into an existing code base and asked to make a change. It's far more indicative of the sort of work somebody is likely to actually be doing.


>Part of the code we want right now requires a unit test with dependency injection to match an interface.

What exactly do you mean by this?


We ask that the code they write have a unit test. The nature of that unit test is that it has a dependency. They can mock it however they like, but passing in an object that represents the dependency (dependency injection) is the easiest and most straightforward way to do that. That object should have a method on it (a known method, with a known signature, also known in some circles as matching an interface).
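To make that concrete, here's a hypothetical sketch of the shape of exercise being described (in Python rather than Java for brevity, with invented names; the actual exercise isn't public): the code under test takes its dependency as a constructor argument, so the test can pass in a hand-rolled fake that matches the expected interface:

```python
class PriceService:
    """Code under test. The rate source is injected rather than constructed
    internally, so tests can substitute a fake."""
    def __init__(self, rate_source):
        self.rate_source = rate_source  # must expose get_rate(currency)

    def price_in(self, amount_usd, currency):
        return amount_usd * self.rate_source.get_rate(currency)


class FakeRateSource:
    """Hand-rolled test double matching the rate-source interface."""
    def get_rate(self, currency):
        return {"EUR": 0.5}[currency]


def test_price_uses_injected_rate():
    svc = PriceService(FakeRateSource())
    assert svc.price_in(10, "EUR") == 5.0


test_price_uses_injected_rate()
print("ok")
```

In Java the same shape needs a project, a test runner, and usually a mocking library on the classpath, which is presumably where candidates who've only ever clicked through pre-configured IDEs get stuck.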


You ever see or suspect fraud in a take-home? I can't imagine it doesn't happen, given that cheating is so prevalent in colleges.


If I was doing this at Google, I would spend a lot of time thinking about test fraud. At a 40-50 person company? Not so much. We do simple follow-up things that raise the amount of effort you'd have to put in to fraudulently submitting work sample tests, and we know pretty exactly how we'd randomize our work sample tests to make it hard to cheat (at least, hard to cheat without doing something we'd be interested in anyways) --- but it's just not worth it right now.

Work sample test fraud is one of those things that sounds like a huge deal on message boards, but when you game through what would be involved in doing it in real life, it makes very little sense.


To my mind, the simplest and most straightforward way to combat any fraud is also beneficial because it gives you even more information about the candidate:

Talk to them about the code they wrote.

Have a conversation, as if they were already your co-worker, with the exercise as the subject. Go through it, ask them -- non-adversarially -- why they did X, what they thought about requirement 2, how they could get better test coverage for Z. If you see something that seems to be a mistake, talk about it. If you see something awesome, discuss it. I can't imagine someone incompetent being able to bullshit their way through detailed discussion of code they were supposed to have written.

And this provides valuable info about how this person thinks and communicates about the work you want them to do.


That seems like a good idea, but does somewhat detract from the notion that once a candidate does the work-sample he is done with the technical part of the process.


True, it deviates from tptacek's recommendations; about scoring and identical questions for everyone as well.


A nice approach I liked (back when I was interviewed by Thoughtworks years ago) was that one of the interview stages was to take the assignment you'd done and apply some new requirements to the story. It rapidly makes it clear if the candidate actually did the assignment or not.


How can I sign up to take these tests for people? Sounds INCREDIBLY lucrative.


Are you saying most candidates commit fraud? Do you have any data on it, or are you just following common-sense nonsense?


> Are you saying most candidates commit fraud?

Nope.


If the majority doesn’t, why create a process that punishes everyone? "People lie on their resumes" is a very common argument made by the creators of insane interview processes. It feels weird to start a relationship with a company under the assumption that I’m a cheater.


Who said anything about punishing anyone? 1) tptacek advocated for a specific kind of interviewing, 2) I asked him if he had ever seen a problem that I thought might be an issue with that method, 3) he explained that he wasn't worried about it because of X, Y, Z but acknowledged that for other companies it could be a problem, 4) I thought his answer was very reasonable. Also wool_gather and swish_bob added some useful ideas.

I'm not sure why you felt the need to come out guns blazing.


I agree with that. Sounds like a good process. Although, I am too cynical to believe any company that tells me this will be the only technical part. Too often do recruiters lie/misrepresent the recruitment process. Some seem to operate on the sunk cost fallacy, where you just see it through because what's one more round after already doing several?


I understand your cynicism but: who's got two thumbs and can offer an existence proof that there are companies that really do hire this way? :)

My best guess is that there aren't going to be many companies that will give you a definitive statement about what their process will be after the challenge and then lie about how they digest work-sample responses. But I don't know, and am prepared to be wrong about that.


I've only had one take-home assignment that really worked like that. I did pretty well on it and in the interview.

I was co-owner of a startup at the time. The guy who interviewed me called me up and said, "You did really well, but I think you really don't want to work here." My wife said, "Wow, that guy was really smart." Considering how some things went later, it might have been better if I'd said, "No, I really, really do."


The extremely obvious problem with this is that there's no way of preventing someone from completely blowing a hole in your interview process by simply paying for or hiring another developer to do the take-home problem for them. At that point they've gotten past the technical requirements and now only need the soft skills to execute on it once the rest of the interview process continues.

This is why take-home problems are almost completely irrelevant, except for filtering out good candidates. Eventually those problems optimize for perfection, which helps out those who 'cheat' at the process, while people who put in earnest efforts are rewarded with denials. This is something I've experienced before in my job search where I would put in an honest effort and get 90% of the problem solved, but get denied because my solution wasn't flawless.

So allow me to call bullshit on your own claims that this is the right way of determining technical qualifications.


> get denied because my solution wasn't flawless.

Not all companies handle it like this. I had a take-home exercise as part of an interview last year. I hit a real snag on a fairly small part, I couldn't figure it out, and I ended up leaving a bug in my submission because I simply ran out of time. Very frustrating.

It was raised at the interview; I admitted that I knew it was there, and that I hadn't been able to figure it out. We discussed possible causes: it actually turned into a pretty interesting, though minor, technical conversation. The interviewer eventually told me that he had figured it out after a little investigation (and I expressed my gratitude for the explanation!)

I ended up getting an enthusiastic offer from them.


I'd couple the take-home with a substantive discussion following submission. Harder for a fraudster to talk about how they came up with or tested their solution and how they'd improve it in a real production version.


> I would put in an honest effort and get 90% of the problem solved, but get denied because my solution wasn't flawless

It's possible that the company stated the take-home work in terms of non-negotiable deliverables, and their baseline for allowing a candidate to move forward is 100% of those, and they prioritize candidates who take initiative and do more work beyond the base requirements.

... not that I would agree with such a process (it biases towards people with more free time, like people with fewer dependents and people who are currently out of a job / underemployed), but it's very possible that this is what you faced.


I’m really mixed about this.

As a new immigrant, my wife had trouble finding work as a designer until she was given a take home work sample test. After that, the company that gave her the test hired her quickly and she has been doing well with them ever since.

I’m currently doing the gauntlet thing myself, but none of the interviews I’ve done have asked any technical questions pertaining to the role that they actually want to hire me for. It’s all generic stuff that frankly I would have done better on 20 years ago.


The best interview I ever had was a 2-hour onsite work sample, followed by half an hour discussing what I'd come up with. I was offered the job the next day.

Surely most people would prefer this to whiteboard tasks?


The best interview I had was a few hours of friendly conversation about the details of my resume.

Then I was hired under probation, as everyone there was, and the understanding was that I could be easily dismissed if it was clear that I wasn't working out.


There is no way in hell I'd ever sign up for that. To be expected to quit your current job on the hope that the probationary period works out is insane.


Much of the States is at will employment, and probation is standard practice where I am.

It's rare that people fail probation; so long as you apply for work you are actually capable of doing...


Also, in the UK the first 2 years are effectively at will.


Yes, I've learned that the hard way. Provided they give you payment in lieu of notice, they don't need much of a reason to let you go.

A co-worker and I were once let go for "performance reasons" at 10 months - just after (successful) project completion. It was beyond my probationary period, and no issues were raised in the 2 performance reviews.

Their notice period was just 1 month. We were effectively cheap contractors.

My advice now is to treat offers with a low notice period (of them telling you) as a red flag. The norm is 3 months, after probation.


Odd that they didn't just make you redundant on statutory terms - doing you on performance grounds risks an industrial tribunal.



The norm where I am, in BC, is two weeks for notice. It increases slightly the longer you're employed.


That's common in Switzerland. But it goes both ways, i.e. you can leave the company any time if you don't like it without breaking the contract.


This would make me damn uncomfortable and stressed out once I started the job, and I'd likely take any other offer over this.


You've had this condition at every job thus far. The real interview is always the work you do.


This is only true in the most extreme case. Many companies have formal procedures around firing people for performance issues. Short of HR violations or literally refusing to do anything, I can't imagine someone being fired before 6 months.


I am confused by the world you live in.

Yes, above a certain size, companies typically have some formal procedures. But typically those are a fig leaf.

In many labour markets, there's a legal 90-day probation or equivalent. You bet your boots some people get dismissed at 80 days. Or the job was contract-to-hire, and the contract doesn't "get renewed".

But on top of that, literally every company I've worked at or any of my friends have worked at (including lotsa startups, two of FAANG, and some in-betweens) will terminate when they want to terminate. In most non-European labour markets that I'm aware of, there's a penalty for doing so, and the company just pays that penalty and gets on with it.

Sometimes there's more security than that, I've heard (but not experienced). And sometimes the company puts in large effort to cultivate the underperforming employee first (had that happen to me once; they tried and I tried but it didn't work out). But the overwhelming majority of cases of my first-hand and second-hand experience, dleslie's summary is about the whole story:

> The real interview is always the work you do


Most places I've worked at in the US have a 90-day probationary period. There's still red tape, but not as much. More common now, however, is to hire people on as contractors for 3 to 6 months. If you work out well, they'll fast-track you to becoming full-time without a second thought. Otherwise, your contract is up and they choose not to renew it. Which makes things less dramatic if there are issues.


Probationary periods are industry standard here in British Columbia.


Fair enough. It's to some extent a cultural thing (there's no need for explicit probation in the US since most employment is at will), but I too wouldn't necessarily like a probationary period, even though I don't foresee it actually being an issue.


In the US there often is a formal probationary period at larger companies which mainly accomplishes one thing: reduce the HR red tape if a new hire isn't working out. During the probationary period it's generally easier to make a case (i.e. little or no documentation needed) that 'they're not working out' and HR will be OK with it vs. after the probationary period, you typically have to 'document' them out of the company.


I'm in the USA, and this (probationary period) has been the case with every job I've had in the past 30 years. I've never heard of a company not doing this in fact.


Same here. Though many mid-size / smaller companies might not advertise this fact (their HR policies are often a bit more ad hoc than larger companies if they haven't been involved in as many labor lawsuits)... but pretty much if there's an HR department, the probationary period exists.


When done well it's great for everyone. A healthy employer wants you to succeed, after all your success is their success.

Thus, probationary periods can be a time of training and growth for the new employee.


I'm not sure how much this is considered, but in my state, which is an at-will employment state, being unable to perform job tasks due to lacking the knowledge or technical skill is explicitly defined as NOT a demonstration of cause for termination that would absolve the employer of their financial responsibility toward unemployment compensation.


This is how all hiring works by law where I'm from. It's fine.


Sounds like you may also be from a country with a decent social net along with it?


I guess you could say that.

I suppose you do sort of feel a little stressed during the trial period but I've never seen anyone fail it and it applies at every company, so there's no escaping it anyway. When it was introduced some people got quite upset but I can't really say I think it's had a bad effect.

I guess from the company's perspective, if they realize they made a grave mistake they can back out of the hire, but they are still very careful and rigorous in the hiring process, just like always. It also allows the candidate to bail if they realize the company wasn't what it said it was. It goes both ways. Again, in practice it seems mostly harmless.

Perhaps the US wouldn't do so well with a similar policy, maybe even just due to the crazy healthcare situation going on over there. I couldn't say.


> I suppose you do sort of feel a little stressed during the trial period but I've never seen anyone fail it and it applies at every company, so there's no escaping it anyway.

I'm not sure what you mean by "it applies at every company". Getting hired and then fired a week later is virtually unheard of. This is not a fear I have, at all.

But if you told me it's probationary, that is totally a fear I'd have, I'd get paranoid, so I'd rather work somewhere else. You're basically telling me it's not a real offer in my eyes, and I should not expect stability.

> Again, in practice it seems mostly harmless.

It's extremely harmful in a place with labor protections as poor as the US's, for reasons that I don't feel like expanding on and that you can educate yourself on if you wish.


By "it applies at every company" I mean it's part of the contract no matter where you work, because by law the employer is allowed a 90-day trial period, so they all put it in the contract. One offer is just as real as the next, and it's just something you have to go through; yeah, I guess it's probationary in nature.

Not everyone likes it or agrees with it, and I can only comment on the software industry here, not other industries, but it's not the end of the world and the sky doesn't at all fall. When they introduced it, a lot of people argued that it would be abused, etc., but as far as I can tell there hasn't really been any drama. YMMV depending on country.


The best I had was a guy who asked me to bring in my laptop and show him some code. We talked about it, and he asked me to add a simple feature. I think it worked well for both sides. Not much of my time was wasted, and there were no gotchas from some configuration issue of setting up a project for the first time.


Yeah, I think 1-3 month probation is really the only way to do it.


I had perhaps the most amazing interview experience recently, where it was an open discussion. It wasn't a technical drill-down but more a Q&A where you discuss topics relevant to the area you claim to have knowledge of and how it relates to the job you're interviewing for. It was the complete opposite of "code this technical problem, and a missing semicolon will get you a flat-out rejection."

What clicked was I was finishing their sentences and knew precisely what they were asking. It was an incredibly rewarding experience which led to a same day offer.


Sounds great for a first job out of college; not so great if you already have a good job.


As a junior I’d much prefer a whiteboard, where I just need strong CS fundamentals and reasoning, to a work sample, where there are 10+ different dimensions I could be judged on: style, maintainability, whether I used the latest language features, how I solved the problem (use libraries or write from scratch?). There are just way too many variables and too much potential bias in the judges, versus "did you correctly turn the problem into a DP algo?"


When I was a junior, I didn't have strong CS fundamentals! But I had about 5 years more practical coding experience than most juniors.

Ideally you would be given guidance on what they will be judging you on.


Hah, I had a sample test that I had to do after a first on-site meeting, which I thought was great.

Then they said my work sample was amazing, and they’d like to do an on-site Q&A about it, but when I arrived the engineer hadn’t even seen my work, and proceeded to just quiz me on obscure JS trivia.

Which I promptly failed.

They then rejected me even though they were happy about my work :/


In addition to covering all interview expenses such as travel, food, and lodging, candidates need to be paid for their time interviewing by the companies interviewing them.

A six-hour take-home test is fine, but I want $1350 for it in advance as a consultation fee, and if I hit 6 hours and it's not done yet, they can keep paying until I'm done or we can just end it. No refunds.

The idea that I should spend six hours doing free programming for random companies that say they are desperate to find anyone qualified is absolutely absurd and insulting. No one should put up with that, particularly anyone with an established and verifiable career.


It's even worse when you're a consultant. At the moment I'm faced with scheduling my fourth round with a company, and of course it's 5 hours onsite. I have the option of skipping the free lunch, they said. It's nice of them, but my monetary loss for that time (including commute) is enough to pay lunch for a number of people. All that, and they can still can you a few weeks or months into the job if you're a dud.

It's employment, not marriage, guys; lighten up with your strenuous and time-draining processes. We senior employees aren't as, what's the word, excitable as the entry-levels about joining your workforce.


That does sound excessive. I've never had more than 2 rounds of interview, and even at that the first round was typically over the phone and the second on-site.


This is pretty silly. Candidates routinely spend integer multiples of six hours running interview gauntlets for tech companies that are notorious for negging candidates. None of them expect to get paid for interviewing. An at-home work sample challenge is strictly less onerous than an interview gauntlet, but because it has the appearance of something people have heard other people get paid for, it's commonly suggested that they should be paid, too.


Why are you even interviewing for these "random companies desperate to find anyone qualified" if you are so disgusted by them?

The elitism of some engineers is mind-boggling. Instead of being grateful for working in an industry with so much demand that you can easily find a job at any time, you complain about the process being insulting to your oh-so-important persona.

If I really want to work for a certain company because what they are doing excites me, then yes, I'd do a 6-hour take-home test where I can probably also learn a thing or two, and I am willing to bet a lot of other developers would too. As an interviewer, someone charging >$1000 for a take-home test would be an immediate red flag, and our mindsets probably don't match up.

If this works fine for you, kudos.


It's good to see a full gamut of opinions on here. I'll say that mentalities like yours are why I'm a contractor in the first place -- you should be grateful for the opportunity, etc and so forth. No. I exchange my time for your money. If that's entitlement to you, we come from opposite worlds.

I'm not the guy who says he should be compensated for take-home tests above. I am however a senior engineer. I'm spurning any long interview processes because they cost money. I don't think it's entitlement, but if that's what you want to call it, so be it. It's called hours worked, hours paid, and in the West it's been a concept since at least the 18th century.


They cost money for both parties though so in that sense it's fair.


I agree with the statement "they cost money for both parties" but I disagree with "in that sense it's fair."

Individual consultant: loses 5 hours of interview time (and commute time) or take-home exam. Let's call it $800 for the sake of argument.

Company: loses 5 hours of interview time, plus the time it takes to "quiz" the exam.

Individual loses money that he / she uses to pay their mortgage.

Company loses profit because the time spent interviewing the candidate could have been spent working on feature Y of the application. So shareholders / VC's lose.

So you're saying the individual contributor's loss of half a day's pay in your interview process is equivalent (you DID use the word "fair", so that's an equivalence argument you made) to a company's loss of a few hours out of the many thousands of man-hours it relies upon? That's 0.0001% of their profit, assuming the employees don't work a few hours more to make up for the lost time - which they will (they're salaried!).


Consultants also do not get paid vacation, or paid sick leave. They also don't get any of the benefits that a lot of regular employees get. But that's part of the game, they have to account for all of that which is why they earn much more per hour. A lot of those "customer acquisition" tasks cost time and may not necessarily yield a return, but that's part of the extra risk consultants have to assume and why not everyone is willing to do it.


Maybe all true with regard to cost considerations, but it doesn't support the notion above that a candidate's wasted time and money (vacation time and sick time cost money for an FTE) and a company's wasted time are somehow 'equal'.


That sounds like disorganization and poor internal communication, two traits of the organization probably not just evident in their evaluation of you. You may have dodged a bullet.


I don’t have an opinion on whether or not it is done synchronously or asynchronously and I hope I didn’t imply work-sample = take home test.


Gotcha, yes, that makes more sense. I find a lot of unicorns are already doing this sort of interview anyway. While I haven't seen studies on it, it seems to provide a better signal than whiteboarding.


Well, I'm actually willing to spend time on a problem if it is interesting and I can do it in a fun way and learn something new.

But I have a zero-tolerance policy towards puzzle whiteboarding tests. They are a total waste of time.


>It does always amaze me that literally what you learn in HR 101 in an undergraduate management curriculum is controversial. The r^2 of a ton of different methods for predicting job performance is something large companies are highly incentivized to get studied by academics (and they do).

Large companies are not "highly incentivized" to optimize their hiring process. Once a certain throughput is achieved, there's very little reason to revisit it, because if they revisit it, "they're lowering the bar", or not "optimizing for recall", or whatever.

Mindlessly copying large companies isn't all that useful. Microsoft famously loved brain teasers: can't get the fox, chicken, and grain to the other side of the river? You must not be able to code either. Google loved asking for your SAT scores, because "obviously" some arbitrary score on a standardized test, taken a minimum of 6 years ago, certainly means something today.

Large companies aren't immune from bullshit. In fact, they have the habit of metastasizing bullshit, because of "Well, X does it, so it must be good." X only does it, because they had a stupid idea, became successful because of completely unrelated means, and then fooled themselves into thinking their process was good, "Well, I've been hitting candidates with ball-pein hammer for years, and I'm successful, so screw you."

Interestingly enough, both Microsoft and Google eventually abandoned these interview questions, after they realized that one had nothing to do with the other - but only after years of doing it, and of others copying them.


Microsoft and Google do not collectively hire that many people and are not who I am talking about. “Brain-teaser” type questions are explicitly not work-sample tests and, yes, have no correlation with job performance.


The number of people that they collectively hire is irrelevant. They have an outsized influence on interview practices industry wide.


IQ does correlate with work performance, and SAT tests are IQ tests. Brain teasers are an attempt at seeing how well/quickly your brain works to solve problems; they are a rudimentary IQ test.

The problem is that requiring IQ tests for employment introduces liability that employers do not want.


No. None of this is true.

Neither IQ nor SAT scores correlate with job performance. SAT scores were requested at Google for years. Explicit aptitude tests have been used in the past, and continue to be used. They are quite legal, as long as they are used for their intended purpose.

https://www.forbes.com/sites/theemploymentbeat/2014/03/07/em...

https://www.shrm.org/resourcesandtools/tools-and-samples/too...



Allow me to provide a link related to the NY Times article you shared.

The TL;DNR: Nuh-uh!

https://www.nytimes.com/roomfordebate/2011/12/04/why-should-...

In the future, you really shouldn't link to dueling articles in an opinion section. It makes this all too easy.


One of those opinions is from a professor of psychology, the other sells test prep services for a living.


SATs are IQ tests? A large portion is math, and the other is written language - both taught disciplines. You obviously haven't taken a real IQ test, which is more abstract and has a large portion of spatial-geometry-type questions.



And isn't one large chunk of the SAT a written language test? That's going to suck for dyslexic / neurodiverse candidates.


At one point, MENSA would accept a high SAT score as a reasonable proxy for a high IQ.


> highly incentivized to get studied by academics (and they do)

Why do you think they are incentivized to get studied? If anything, power dynamics in large hierarchical organizations keep away any study that could threaten the power. Sometimes studies do happen, but large organizations can never apply the results to anything at their scale. They are more worried about where to even get such a massive stream of professionals to hire.


You’re getting too philosophical here for something that isn’t that complicated. This isn’t related to power dynamics. Asking someone if they can flip burgers is obviously not as effective as asking them to flip a burger and seeing if they can do it correctly. Getting employees who are unquestioning robots with no ambitions for power is a different interviewing issue.


"Everything in the world is about sex except sex. Sex is about power." -Oscar Wilde


HR isn't exactly known for rigorous scientific research, and I don't recall it being HR that introduced this trend of hazing candidates.

Also, have you got any examples of this research that actually proves this correlation? I haven't heard of any, and I would have, as I have spent decades semi-involved in IR (industrial relations) in the UK.


Hazing candidates is mostly negative reinforcement introduced to keep people from wanting to go to interviews and change jobs. It also selects for those willing to do a lot and put up with a lot for a corporation.

