Further, I think it needs to be said that at smaller companies (read: startups), what's really happening is that applicants are being evaluated for whether or not they're cool enough to join the club. We call it "culture fit" but we're really just trying to vet their personalities.
EDIT: To be clear, I do think people's personalities need to be evaluated, especially on smaller teams. I just think we should drop the pretense that "implementing" a depth-first tree search on a whiteboard tells us anything other than that they could summon that algorithm at that moment in time, with that amount of coffee, and whether or not they had a pleasant morning. First of all, I want engineers that are good at working with others, and second of all, it takes months to truly evaluate an engineer's ability to do their job. We gain nothing by pretending this isn't the case.
Whiteboard problems absolutely do work.
The vast majority of applicants cannot code at all. And I mean that literally: they're at a loss at how to write a function that adds two numbers or counts the number of elements in a list.
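To be concrete about the bar I mean, the two questions above are roughly this trivial (a sketch in Python; the function names are just mine):

```python
def add(a, b):
    """Return the sum of two numbers."""
    return a + b

def count_elements(items):
    """Count the elements in a list without reaching for len()."""
    count = 0
    for _ in items:
        count += 1
    return count
```

That's the whole question. No tricks, no data structures, no asymptotic analysis.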
Worse is that these guys can be employed as developers (even 'senior' ones!) for years and years in 'serious' enterprises.
How, you ask? By using copy-paste and cleverly navigating their enterprise processes and dodging responsibility.
Maybe this is what you mean by 'being good at working with others', but it's definitely not what I want in a software developer.
Source: I've interviewed a great deal of people for lots of positions over the years.
Of course once I did land a job it took about a week to shake off the rustiness, and the company that hired me is thrilled.
The point is that companies like Google and Facebook can afford to miss out on those devs. But smaller companies should be looking for diamonds in the rough, not trying to mimic the FAANGs and getting their leftovers.
I speak from personal experience. I failed my first FAANG-style interview both because I had neither prepared for nor understood how whiteboard interviews really work, and because a huge subset of my skills had gotten rusty over the years. When I first failed I was really upset and very quickly wrote off the entire process as a ridiculous test. Looking back, I was a true negative and needed to brush up on a range of skills.
When I was a junior dev I spent nearly all my time studying programming, CS and software. But as I got more senior I definitely relaxed a bit on all of that and coasted more on the inertia of past successes than I should have. Yes I was good at my current job, and the ones before it, but those only represent a small subset of the skills a senior engineer should have. What made me a great engineer in one specific company allowed me to let other skills that I wasn't using decline a bit.
By being a bit more honest with myself, I spent a long time getting back into the things that I used to love and also learned how to practice whiteboarding. All my whiteboard interviews after that were a success.
I think a huge part of the pushback by senior devs against these interviews is that they don't want to admit that, while they have gained a ton of valuable experience, they might not be as strong a software engineer as they once were.
However, and I think this is the crux of the problem, you're not paying senior developers for that. I've never had to actually do any algorithm slinging on this job. The fanciest it usually gets is chaining some maps and filters.
On the other hand, I have had to do "rocket surgery" on critical path legacy code, write business logic in a maximally predictable and readable way, figure out how to land a non-backwards compatible change with no downtime, convince other teams to help with an initiative my team is leading, design an internal API, etc.
Doing that stuff requires experience, rigor, resourcefulness, and I'm sure you can come up with more "senior" traits. My personal complaint about whiteboard interviews, even systems design interviews, is that they only indirectly measure those traits.
From this perspective, a technical whiteboard interview is one of many tools. Interviews I give usually start with “so your boss asks you to solve problem X ... where do you start?”. Then I throw more and more problems at them (technical, organisational, etc) and see how they respond. “It’s in production and people start complaining that it’s slow. Where do you look first?”. “What problems do you foresee with this design down the line?”. “If you had $1m/yr budget to hire a team to scale this system, what would your ideal team look like? How would you spend the money?”. “An inexperienced team implements this and it’s buggy. What mistakes are you worried they might have made?”
Ultimately we get the traits we hire for. Being able to code (and debug!) is important. But I also want employees who I can delegate to, and trust that they’ll figure things out. I’ve been able to pass whiteboard interviews since second year uni. But I have not stopped learning, and the non technical skills I’ve gained since then are at least as important. Test for them.
However I can confidently say that it only took a few weeks of being thrown back in the mix to shake the rust off and get going again.
If I was an employer and could extend contracts at fairly low risk, I'd give devs with a strong resume and a demonstrable open-source library a chance - despite them being a bit shaky on the whiteboard.
Good on you for having the introspective skills and awareness to identify the problem and do something about it.
I mean, sure go ahead and prepare for interviewing, brush up on whatever you think will help. But if a company has a policy of consistently rejecting candidates based on testing of skills that are never used on the job, it sounds like there's a lot of room to improve that interview process.
If someone is effectively testing for these more difficult subject matters then it's quite possible that they themselves and other co-workers are competent in them (as they passed the same test).
As a senior devops engineer, I write a lot of trivial Groovy code for Jenkins pipelines. But the interesting part isn't the code, which for the most part a monkey could do. It's redesigning the release process. The rest is just implementation details.
Thinking coding is important is a failure mode.
The thing that I find when conducting interviews is that people who have trouble writing a concrete solution to a problem often have trouble formalizing any solution. They can handwave stuff that maybe makes sense, and given enough good faith is "correct", or at least not obviously incorrect, but at the same time it depends on a whole suite of libraries that don't exist, or a domain specific language that someone would need to come up with, or something.
And if you need to invent a DSL to parse a string, I'm worried about how complicated your actual solution would be when redesigning the release process. Because sure, any monkey can write some groovy code that does something. But I'm more worried about if that code will be well designed. Note, not the system, but the code itself. Because in reality the code defines the system, and a beautiful architecture implemented terribly is still terrible to work with.
To see the second thing, I need to see concrete code.
This is far more real-world than a coding test, imho. Coding happens on the micro level, but understanding happens on the macro level.
This guy has the right approach imo: https://www.karllhughes.com/posts/rethinking-hiring
I'm honestly nervous that next time I have to go out and interview, I'll be in the same shoes as OP. Despite many years of managing software for small companies, I have no desire to go back and re-learn Leetcode just to get a job.
But being able to map the macro to the micro is a vital part of being a competent SWE. This includes being an architect. If your plan only considers macro-issues, but is difficult to actually implement on a small scale, it's not a good plan.
I want to gauge both, and a coding test is a good way to measure the micro.
(And by the way I realize in a lot of companies, 'architect' is a completely bogus term for someone who's more of a flim-flam man than actual doer. So just substitute "staff engineer" or whatever you call it.)
But the main parts of my job I have to get right are picking the right approaches technology-wise, and setting up frameworks and patterns to make devs' lives easier in building out the actual features. You can't test that stuff on a whiteboard imo. You have to just talk it through and try to get a sense of how the potential architect/lead thinks about problems.
It also takes a good architect to interview an architect imo. There's plenty of great devs who just haven't acquired that level of scope yet - thinking not just about how easy it is for you to get something done, but how easy it will be to maintain as a team, within the greater ecosystem, over the life of the product.
And that's the underlying problem behind this coding-test nonsense. You don't ask an architect candidate to implement a binary tree in an interview because it's relevant - you ask that question because you don't know how to ask questions that are relevant. For anything but actual low/mid level coders, these coding tests are just evidence of a failure to interview effectively.
As an aside, I don't find most architects to be "flim-flam men". They are usually quite hardworking and competent, although their job is fraught with risk. They're often asked to do the impossible, and they have to do the best they can with it.
Code is a liability as much as it is an asset.
Companies don't get excited about a dev who just passes. Even though that dev might be by far the best candidate - they just need a few days to chew on various architectures - or they take the test literally and don't add bells and whistles. Etc.
Companies get excited about a dev who aces it with flying colors.
Which explains the paradox of too many developers chasing too few jobs versus all these companies complaining that they cannot find enough good developers.
I'd say it's the opposite. Big companies can afford to take a shot on someone and miss without materially impacting the business.
If I'm hiring developer #2 at my 5 person startup, I want someone confident and cool under pressure who has done something similar to what I'm building so many times in real life that the coding test is a cake walk.
A dev hire on a small engineering team (< 5 people) can make or break the business. I'm trying to de-risk that hire as much as possible. I want to design a test that 90% of people will fail so I can find that top 10% developer.
Once I get to 15-20+ devs, I'm much more likely to relax my criteria and look for a diamond in the rough.
Seriously, where are you finding these candidates?
I've worked at a number of mid-sized companies, and interviewed dozens of candidates, and I have never, ever, ever come across a candidate that couldn't write code on this level: "write a function that adds two numbers or counts the number of elements in a list".
Considering the rate of false positives in any software engineering interview process there is every incentive for the underqualified and unqualified to "fake it 'til they make it". It's also difficult to tell the difference between someone failing upwards and someone aggressively managing their career by switching positions with lateral raises in this hot job market, hence the need to distrust the skill of even senior developers and force this rigmarole on every level of engineering talent.
These included such brilliant things as cut-and-pasting an answer from a forum that was followed by a dozen comments of people explaining how wrong that answer was, and someone who answered what should have needed a short sentence with two pages from an Oracle manual giving an answer that did not apply to the question.
It's not that we expected everyone to be honest about not using Google - it didn't matter, it was an initial screening question. But we did expect them to at least bother to restate the answer in their own words if they looked it up. And get it right.
This has a surprisingly high failure rate even in cases where we just email it to the candidate and discuss in a phone interview the next day. I don't think it has anything to do with a person's ability to program; it more likely derives from a person's ability to understand a work request.
Innumeracy is the norm: I'd guess > 85% of people don't understand the concept of a function.
And they probably can code if they were working independently. Or they've done some classes, watched some videos, and think they understand it.
But when you add the pressure of an interview, your unpracticed skills fall apart. Also, you have to think on your feet to fill in the blanks in a question.
That's how it should be, because we're not hiring hobbyists; candidates need to be able to demonstrate that they're pros, and able to do so under the pressure of an interview.
I've done my share of phone screens with candidates who were flatly unqualified as developers. (Thankfully we've never had someone completely clueless land an in-person interview. That's also a disservice to the candidate, as we should provide better guidance through the phone screen.)
Some of them are junior, possibly they lie on their resume and simply keep applying to job after job. That's the 99% that Joel wrote about.
You also occasionally get guys who were in management or similar roles and are looking to transition to being engineers. And I think these may have a similar problem to the senior engineers: they have lost the skill (or never had it) and are finding out the hard way.
It's not an analog. It's the actual skill up close. They should be explaining their thoughts as they go, and you're asking why they do A instead of B.
> I spend a minuscule fraction of my time writing algorithms in my daily practice
A good problem isn't simply an algorithm, but also tests how they break a problem down, how they compose a solution, how they think through engineering tradeoffs, and how they communicate all this to you.
Consider the difference between an artist and an amateur painter. That the artist has practiced brush strokes is not surprising; anyone can practice painting a lot. What really matters is that the artist can take the image in their mind, compose it into a complete scene, and then express it all through their medium of choice.
> but problem solving and troubleshooting are way more valued skills in my group
Is that a good thing? If your group wrote better code, wouldn't they have less troubleshooting to do?
Yes, that's a tautology, but I've worked on code that was kludges on top of kludges. And while kludges can be inevitable, if they persist, it indicates the person doesn't have the mastery to see a better way to express a problem. That's a skill deficit.
When I'm analyzing someone's ability to code, I'm presenting it as a problem to solve. We solve problems by restating them in such a way that the solution falls naturally from the question.
The candidates who can do this well will put together well structured, coherent code, and my team will spend more time delivering features and less time troubleshooting.
Without giving away the question I ask, I can tell you the solution is a for loop and an if statement. If I told them exactly what they needed to solve the problem, I'm sure most could write the code (though honestly some would have still failed). It's a question I would think could fit as one of 5 on an intro to programming class final, yet I've had candidates with 10+ years of experience fail it. I even had one such candidate argue with me over asking a coding question when his resume shows so much experience at different roles.
Talented people frustrated at the process just don’t get how bad bad coders are. I would never have believed it myself until I experienced it.
People who would ace the entire interview would look at me funny when I asked the first question, and I just said, "I mean, look: about 25% of the candidates fail this first question." Lots of others got partway through it.
It is very true that you need to qualify someone's ability to write code at all.
I think there's usually a lot less utility in some of the "clever" coding challenges that require you to remember some difficult-to-derive-from-first-principles data-structure or algorithm. But on the other hand, if we literally just give fizzbuzz to everyone, we'll eventually see people who have memorized fizzbuzz but can not create any other program.
There's a real challenge to creating a coding problem that hits the sweet spot between "doesn't just test that you had a particular intuition," "does actually test real coding skills" and "isn't so common that people have memorized the solution."
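For reference, fizzbuzz itself is small enough to memorize, which is exactly the concern - a typical Python version:

```python
def fizzbuzz(n):
    """Return the fizzbuzz sequence for 1..n as a list of strings:
    multiples of 3 become "Fizz", of 5 "Buzz", of both "FizzBuzz"."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

Four branches and a loop - easy to commit to memory verbatim, which is why a screen that only ever asks this exact problem eventually stops measuring anything.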
At the time, their marketing department did all of their web development in-house. I don't recall all the specifics; there was a round table meeting between the manager of the unit, the team lead, and one of the senior developers. At the end of it, they sat me in the cubicle area with their other developers, gave me an MBP with MAMP on it, and a piece of paper outlining what they wanted me to code - a simple CRUD app. It didn't have to have any fancy styling, but it had to look ok, and it had to work. It was "open book" otherwise; use google or whatever other resource as you needed it. Also, all this happened while the other devs were in the area; it was basically a time slot from 2:30pm to closing time...
I'm thinking - really? Something this basic...
But given what you had to do - essentially from a blank slate, including the database, set up the tables, build the SQL, code the PHP, integrate the form to talk to the PHP "backend" and update things, refresh and show the updates, etc...
...well, isn't that basically what most software dev work is, at the core? And if you can't do any of that...
Of course I got the position, and worked there for a couple of years; not the easiest environment I've ever been in, but certainly very interesting.
During it, though, I got to experience, from the "other side" what I went through - and I was amazed and dismayed to see how many people were interviewed who couldn't do it. Who had what seemed like great resumes who couldn't even start. Who'd sit there for 2+ hours, and not type a thing. Who didn't even google up something, or ask a question, or...
We had one guy sit for a while, then just got up and walked out without a word.
As I read comments like yours, and others elsewhere, I can see that this is more common than not. You are right to believe that there will be those that will "memorize" fizzbuzz, which is why I think a challenge similar to what Fender asked for is a better test. I know that some developers would balk at it, but I think the time invested may be worth it, to show you are able to do the job, and can come up with your own solution to a problem, and not just some regurgitated answer.
A colleague of mine I had worked with prior, unknown to me, applied for the same position at Fender and was given the same laptop I had used. But they had forgotten to wipe it! He saw my code, and didn't know if he was supposed to expand on it or what; he told them "hey, this looks like my friend's code...?" - and they realized what they forgot. They thanked him for his honesty, wiped it, and continued on with the process. He also ended up getting the position.
How have you not grokked how useless those questions actually are when it comes to knowledge about writing software? Those are both trivia in the same category as "implement the TCP acking mechanism".
Someone who knows the 'trivia' may remember how to implement quicksort without actually being any good.
But someone with even a little bit of understanding and some prodding to not worry about efficiency will be able to come up with some sort of solution, even if it's bubblesort. If people appear truly petrified, it's easy to give them a chance by breaking down the problem and seeing if they can reason about it. E.g. what if you start with a two-element array? Then how about three? How do you generalise from that?
Someone with both the trivia and the smarts will give you a good solution and be able to muse about tradeoffs of different implementation methods, pivot selection and the like.
It's usually fairly simple to find out if people understand the solution they offer up and can reason about it, and that's often a lot more important than whether they come up with a great solution.
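To put the bar in concrete terms, "some sort of solution, even if bubblesort" means something like this sketch in Python - not code I'd ship, but exactly the kind of thing a candidate can reason their way to from the two-element case:

```python
def bubble_sort(arr):
    """Sort a list in place by repeatedly swapping adjacent
    out-of-order pairs; each pass bubbles the largest remaining
    element to the end."""
    n = len(arr)
    for i in range(n):
        swapped = False
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break  # no swaps means the list is already sorted
    return arr
```

A candidate who can produce this and then talk about why it's O(n^2), or when the early exit helps, has cleared the bar even without remembering quicksort.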
Depth first search I would now be a little less eager to do (I was asking these questions ten years ago), but there were a few things that I felt came out well from it: if someone wrote something like:
if (DFS(node.left) || DFS(node.right)) return true else return false
That seems to me like it demonstrates at the very least some immaturity in how to write professional code. If the person doesn't know how to do recursion, that stands out. If they fundamentally don't know how to deal with a stack, that stands out.
If someone has never encountered DFS and just gets fundamentally stuck on what the algorithm is at all, then that's, I agree, not wildly meaningful. But that was not, in general, a reason why people didn't get the DFS question.
EDIT: I will also note that I've had a couple of people on HN react with horror at the notion that someone might be asked to implement DFS or BFS. While I agree that these aren't perfect questions, I think they're pretty radically different from some of the puzzle-y or impossible-to-re-derive questions that you sometimes hear about. The algorithm for DFS is:
1. Check to see if the input is null. If it is, return false.
2. Check to see if the input's value is the searched operand. If it is, return true.
3. Return DFS(left) || DFS(right)
Breadth-first is a tiny bit harder, but it's still a while loop on a queue and just test equality and push the children onto the queue. It's about ten lines of code and it's far from rocket science. If nobody ever taught you about binary trees at all, you might still be a great programmer. But if you're a good programmer, and you ever got taught about binary trees (which most people who have traditional backgrounds have), then you should be able to recreate those algorithms from first principles in, I don't know, 15-30 minutes.
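Spelled out in Python, both fit comfortably on a whiteboard (the `Node` class here is just an assumed minimal binary-tree node):

```python
from collections import deque

class Node:
    """Minimal binary-tree node for illustration."""
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def dfs(node, target):
    """Depth-first search: null check, value check, then recurse."""
    if node is None:
        return False
    if node.value == target:
        return True
    return dfs(node.left, target) or dfs(node.right, target)

def bfs(root, target):
    """Breadth-first search: a while loop over a queue,
    pushing children as we go."""
    queue = deque([root] if root is not None else [])
    while queue:
        node = queue.popleft()
        if node.value == target:
            return True
        if node.left is not None:
            queue.append(node.left)
        if node.right is not None:
            queue.append(node.right)
    return False
```

DFS is the three numbered steps verbatim; BFS is about ten lines, as claimed.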
One place I worked at, the company hired a developer who claimed to have a CompSci masters.
He was completely unable to code anything. I thought it strange.
I started to ask him some basic questions that any actual CompSci degree holder should be able to answer (and I don't have a degree in CompSci at all - everything I know I've learned on my own, from other sources, for the most part); I didn't make it like a grilling session, just polite conversation about a shared interest - but he either had difficulty, or couldn't answer at all.
He only stuck around a couple of weeks.
I've often joked that an interview question should be asked akin to "What basic logic function is needed to implement a computer? Show its truth table, then design one in 2-dimensions on a whiteboard as a virtual 'rope-and-pulley' system."
Couple that with a random-style fizzbuzz-like challenge, and maybe a more difficult open-ended programming challenge (ie "build a simple CRUD app") - that would give you a good idea on their real skills.
Note: That first question I wouldn't expect many to be able to pass the last part; even the first two parts many perfectly capable developers would have difficulty with. But I would be disappointed if they claimed to have a CS degree and weren't able to at least tell me what it was and the truth table for it.
I should note that my statements below may be FOC; I do not myself have a CS or EE degree, so take what I say with a modicum of salt...
But first, note that I wrote "function", not "circuit".
It could be argued that CS, on the whole, is a subset of mathematics, particularly that of boolean algebra and logic. As such, the functional equivalency between the abstract of boolean logic/algebra, and its implementation on a physical substrate, could be considered among the most important of CS concepts.
One could also argue (maybe?) that Turing's "equivalency theorem" might be related to such as well. Consider the case of an emulation of hardware done in software; one could consider that - at a base level, it is boolean logic expressed physically, being expressed equivalently as boolean software functions.
The opposite is also true, of course - that it is possible to express software boolean functionality in the equivalent physical form.
What form it physically takes does not matter (other than speed of course), which is why I also didn't ask for an implementation/representation in electrical terms or schematic form, but rather a diagram of something that could be expressed as a physical and mechanical object. If the person were so inclined, they could express it as a series of levers and marbles, or in LEGO, or Meccano, or any other similar option.
EE knowledge is not needed here, I don't believe (Martin Gardner might agree).
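To make that concrete, here's the kind of answer I'd hope for, sketched in Python rather than ropes and pulleys - assuming the "basic logic function" in question is a universal gate like NAND (NOR works equally well):

```python
def nand(a, b):
    """NAND truth table: nand(1,1)=0; all other inputs give 1."""
    return 0 if (a and b) else 1

# Every other gate can be built from NAND alone,
# which is what makes it sufficient to implement a computer:
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))
```

Whether the candidate then renders `nand` as levers and marbles or LEGO is beside the point; the truth table and the universality argument are the substance.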
did you talk like this to the CS masters, no wonder they left
Sometimes people move between senior eng. and management positions and back depending on organization sizes.
I certainly have had engineering manager positions where I went several years without needing to code for my job. If I'd stayed longer and not coded for fun, and then gone on to the type of positions I did next, which were much more hands-on and used different languages, I can imagine I might have found it hard.
Thankfully I've always enjoyed programming on side projects too, so staying up to date has never felt like a chore.
As another poster said above, best guess is some version of copypasta and navigation of bureaucracy.
I'm genuinely curious how you manage to find all these folks. I've been on the interview team at my company for several years now (mostly in-house, some first-pass phone screens) and I've never encountered a single person who was literally unable to code a trivial problem. The last time I met a "programmer" who couldn't code was first semester university, and I thought most of them quickly flunked out or changed majors. I wonder if there is something about your company/recruiting process that is particularly attractive to them, or if our prescreen (which I'm admittedly no expert on) is just particularly good at filtering them out, or if there's some other explanation.
I saw him do SQL joins on the wrong column, cause accidents in source control, lose changes because he wasn't looking at the file and folder names on screen, and so on. Hard to realize for him, and hard to guess in an interview.
No, you mean that hyperbolically.
Not only does it simply not happen that "the vast majority of applicants cannot code at all" -- this literally has never happened at all, in my experience.
What does happen is that you get a range of people on a spectrum. And yeah, a fair number of them can't code very well. They're slow, they don't see smart solutions, whatever - or are just plain sloppy. But that's quite different from "not being able to code at all."
As to those people who (supposedly) can't "write a function that adds two numbers or counts the number of elements in a list" -- most likely they're simply freezing up from the anxiety of being whiteboarded by a perfect stranger for the first time in a great while - or perhaps ever. (In fact that's exactly what happened to me, on my very first on-site interview after college).
Or that is to say: they haven't internalized -- and produced defenses for -- the (intentionally) awkward and humiliating ritual of the modern tech interview process.
And again, you should only be actually seeing these people once in a blue moon. Unless the people running your incoming "pipeline" are utterly incompetent, and are constantly feeding you a stream of unqualified candidates. In which case your company's much bigger problem is a lack of engineers who are able to "ace" HackerRank problems in 59 minutes or less.
Which, let's be honest now -- basically can only happen after extensive time spent on practicing these problems in advance. Or that is, by blatantly gaming your hiring "filter".
And one more thing:
> How, you ask? By ... dodging responsibility.
No - their jobs just have different metrics for "responsibility" than yours. That's just the way many businesses are run, whether you like it or not.
I've screened people like you describe, but the only time I've interviewed them face-to-face was when they didn't have a technical phone screen for whatever reason.
FWIW, one of the ways I screen companies when I'm looking is whiteboard problems. I refuse them and move on. In my experience, only HugeCos and places with problems use them. I'm sure that's not true, but I have a necessarily small sample-size, and skipping over firms that do it has worked well for me so far, and there are plenty of fish in the sea. (I do in fact suck at writing on whiteboards, I just don't consider it a skill worth developing to pursue jobs I probably don't want.)
I agree with you 100% if "whiteboard problem" means, sit with them while they type up a function in an IDE that does something common (e.g. validates a string, implements some error handling, do a failure backoff, etc).
I disagree if it means, ask them to implement an algorithm on a whiteboard to steer a robot through a maze in a time with optimal algorithmic complexity. This is completely useless and the people that can do this have little overlap with people that can implement easy to read/debug code worthy of production and maintenance.
From an interviewing perspective, asking someone to "solve" this kind of problem on a whiteboard would be interesting to see.
One thing I'd tell them is to not worry about the code; that is, if they just want to write the process in pseudocode or something like that - as long as the logic can be followed, that would be ok. In other words, give them the leeway to not worry about proper coding, knowledge of functions, etc - but instead let them concentrate on the problem.
I wouldn't expect anyone to solve such a question - but it would give a good insight into how they go about solving a problem. Do they ask questions? What happens when they get stuck? Can they explain their reasoning? And so forth.
Let them do what they can, give them 30 minutes or so; if they look lost, ask them some questions, see how they respond, etc.
I think such a question could be very valuable - if presented in the right way.
I've been interviewing devs for years, and this is not my experience at all. The vast majority of applicants that I've interviewed can code, although they tend to be minimally competent at it.
>The vast majority of applicants cannot code at all.
You kinda set up a strawman here. If the purpose of the whiteboard problem was just to establish some very low baseline of coding ability then I doubt many people would argue about their effectiveness. But companies don't use whiteboard problems for that purpose. In my experience (on both sides of the table) they are given with increasing levels of difficulty to see how far the candidate can go. They do not simply ask a few basic questions like "how would you write a function that returned the sum of two numbers" or "count the number of elements in a given array."
I'm not saying there is a really good answer to this. The best I've seen is that some people just seem to be good at hiring and others are not. I am one who is not. I am also a terrible interviewee. The whole process wigs me out.
We have 'hard' questions in our pool we can ask (where optimization actually comes into play) but I've found that the easy questions weed out so many candidates it's not worth it. There's no room for debate if someone tries to write 15+ if statements rather than creating a loop and one if statement.
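To illustrate (with a made-up stand-in problem, not the actual question from our pool): here's the loop-and-one-if shape we expect, next to the unrolled anti-pattern some candidates produce:

```python
def count_negatives(nums):
    """One loop and one if statement -- the shape we look for."""
    count = 0
    for n in nums:
        if n < 0:
            count += 1
    return count

def count_negatives_unrolled(nums):
    """The anti-pattern: one if per element, shown here for a
    3-element list. Scale this to 15+ elements and it's 15+ ifs."""
    count = 0
    if nums[0] < 0:
        count += 1
    if nums[1] < 0:
        count += 1
    if nums[2] < 0:
        count += 1
    return count
```

Both produce the same answer on a fixed-size input, which is exactly why the second version slips past a candidate who isn't thinking in terms of iteration at all.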
Absolutely untrue in my experience, I can't speak for other people. To imply that this is absolutely untrue in the global space would require that I have interviewed everyone.
Whiteboard problems absolutely do work in my interviews. Again, use of the word absolute indicates that I've never interviewed without a whiteboard. Given the high number of candidates I've interviewed, this might indicate a flaw in the interview process.
The vast majority of applicants I select for interviews cannot code at all. And I mean that literally: they're at a loss at how to write a function that adds two numbers or counts the number of elements in a list. I should consider the possibility that I'm selecting the wrong people for interviews.
Worse is that these guys can be employed as developers (even 'senior' ones!) for years and years in 'serious' enterprises. Clearly other companies are making the same mistakes I am making in their candidate selection process.
Source: I've interviewed a great deal of poorly selected people for lots of positions over the years.
I'd just note that you can smuggle nasty behavior into any formal mechanism. Witness the legal system (which you'd engage here) - whining about bad-faith arguments in court is basically a national sport in the US, until the whiner is the plaintiff.
Fortunately, the management chain caught this and ... corrected the issue.
My wife showed a sign of mild disapproval, and that was it. She was rejected for "not a culture fit".
Perhaps check references to determine if either candidate was able to work well in the type of environment you provide.
For example - the nice person may be lazy, or the abrasive person may be productive and helpful.
It's been my experience that contacting the candidate's references will flush that information out. Even in this age of litigation, I've always been able to get an answer to "Please rate on a scale of 1 to 10".
We've avoided some people that interviewed well and had great skills but were horrible human beings - and we've picked up some people that are hard workers that are willing to learn that didn't interview well.
If your hiring criteria excludes groups like that by a "fast friend" proxy test, then yeah, you should be taking the better programmer.
Sure, it'll be more productive for everyone if I tell someone that we think they're just not smart enough to learn shit as fast as we need them to. But that hurts. So we say "You're great, but we're looking for someone with just a few more years of C++ experience". Similarly, when we think you've been acting like an arrogant dick, we say "lack of culture fit".
I once rejected someone whose English was so bad that I couldn't understand them on the phone. That's a hard thing to tell a foreigner who has the courage to call you up, by phone, in a language they probably know they're not great at, for an internship position! So I lied and I told them they had insufficient React experience. Am I proud of that? No. I'm still not sure what I should've done.
I see this trope a lot on HN that "lack of culture fit" means "wrong brand of sneakers" and it's usually nonsense. "No culture fit" means "there's something specific that we dislike about the way you behave or communicate but we don't want to hurt your feelings more than necessary".
Candidate one gives off slightly anti-social vibes but is brilliant, candidate two jokes around with me during the interview and is smart as well, but less so than candidate one. Am I justified in going with candidate two on a hunch?
Which is also why there are usually multiple stages in an interview process, or at least, you pass a candidate to 3 or more team members to interview the candidate.
In my experience, it's hard to find good candidates, but not that hard to figure out which of the candidates will be a good fit, and a strong contributor, to a team. I've also hired mostly for team sizes of 10-20 developers, not 200.
This doesn't even touch even more difficult situations like when your friend messes up and needs to be dealt with in a professional but non-personal manner.
So yeah, hire the better coder.
Culture is preference between two otherwise value-neutral positions.
For example: encouraging collaboration vs. encouraging independent work.
Or supporting self-organizing teams vs. all work having a WBS/charge code.
Culture is not choosing between being respectful and not; or choosing between being openminded and not; or choosing between being honest or not.
I look younger than I am, but my patience for over-working too many hours is at an end. Years of stress have led to health issues that have to be managed.
But this all rolls into "not a cultural fit."
I don't blame them for not wanting to hire someone pushing 50. They just don't know if they're going to get the cool 50-year-old or the grumpy old troll who won't listen to anyone.
Of course they can never come out and say that for obvious reasons.
I play video games. I have all of the last two generations of consoles, a smattering of older consoles I've managed to hold onto over the years (lost most of them for one reason or another) and a half-decent gaming PC where I prefer to play games if possible. On top of that I've done game development on the side, and have run multiple gaming communities for approximately a decade now.
If I went to go interview with a company and they talked about how they were all "gamers", I'd be running for the hills.
You know what I want to do at work? Work.
You know what I want to talk about at work? Not video games.
You know what I want for non-monetary compensation? Not a weekly autochess tournament or PUBG squad night.
You know what kind of people I want to be surrounded with? Not a bunch of clones who all have the same beliefs and values and (lack of) experiences.
You should be a good fit for the company, and the company should be a good fit for you. It goes both ways.
The worst engineers I have had to work with were sometimes pretty skilled technically, but their ego or shitty personality was preventing them from being somebody the team could benefit from.
I would have thought that it is why we have cultural fit interviews though.
Personally I would sooner drop whiteboards than cultural fit interviews, but the later probably need way more training for the interviewer than what I got.
Sometimes the people conducting the interviews are on their first or second job, and those younger coders have a tendency to emphasize things like obscure syntax for a given programming language and the latest programming paradigms. It is overly language-centric rather than dealing with how to solve problems. That is a terrible metric for the issue at hand: "will this person be effective at their job?"
I've seen this end up in teams that lack diversity several times. My last company was big on interviewing for fit. Then a year goes by and you realize most of the people you've hired are clones of everyone else. I had a co-worker who once argued that you can't judge people's personalities in an interview, and I tend to agree with that now.
I don't ask algorithm questions, and I wouldn't even dream of asking brain teasers. I ask questions like "Here's a pile of text that claims to be CSV, let's explore how you'd generate an HTML report out of it", and branch out from there depending on how the interview goes. I've got all sorts of directions I can go from there, from screwing up the input in various real-world ways, discussing HTML security and injection attacks, algorithmic complexity of the report, how to build a service out of this, how we're going to handle reporting errors, there's just an endless number of ways you can take the interview from there.
CS 201 doesn't usually come up.
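A minimal sketch of where that CSV-to-HTML exercise might start (the sample data and function name are invented, and a real interview would branch into the malformed-input and injection directions mentioned above):

```python
import csv
import html
import io

def csv_to_html_table(text):
    """Render CSV text as an HTML table, escaping every cell so that
    hostile input can't inject markup into the report."""
    rows = list(csv.reader(io.StringIO(text)))
    parts = ["<table>"]
    for row in rows:
        cells = "".join(f"<td>{html.escape(cell)}</td>" for cell in row)
        parts.append(f"<tr>{cells}</tr>")
    parts.append("</table>")
    return "\n".join(parts)

# A malicious cell stays inert once escaped:
report = csv_to_html_table('name,note\nalice,"<script>alert(1)</script>"')
```

From this starting point you can screw up the input (quoting, encodings, ragged rows), ask about complexity on a huge file, or grow it into a service, exactly as described.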
The ideal would be pair programming for an afternoon.
Even if their "solution" is incomplete I look at if they get the gist right.
- Whiteboarding: algorithmic knowledge not relevant to daily tasks, we have google, obtuse coding environment
- Take home exams: companies have less incentive to respect time of candidate, favors candidates with lots of time to spend interviewing
- Small consulting gigs: not practical for programmers with existing jobs, draws out job search
- Informal conversations, reading resumes: not stringent, susceptible to talented bullshitters
To be 100% clear, none of the above is my opinion; I am simply restating what is regularly posted at this online watering hole.
A discussion of the failings of white-boarding without the context of alternatives is meaningless because interviewing techniques are search functions that all have precision/recall tradeoffs. That there are negatives to white-boarding is a given.
For example, consider that white boarding interviews are short (3 hours). This naturally limits the company's ability to evaluate candidates (less precision), BUT it saves time for the candidate and company (more recall).
So what happens here at HN every week is you get five people all bad mouthing a different interviewing technique, but we never get any closer to a consensus on a technique that would please even a simple majority of programmers (let alone everyone).
TLDR; you don't like white boarding, so what about making a compelling case for something else?
Whiteboards are for writing down algorithms and quick charts and other design-phase stuff, not writing compilable code.
Even when given as a "take home" challenge, they can't even copy-pasta an example off the internet...
make part of the test a reverse question where the interviewers have to work out what language your solution is in
v _@>v >v>:#<.>1+:2^
We need to admit that interviewing is hard, and we aren't good at it. In fact, we don't need to admit it, we know it already.
Interviews used to take about 6-8 hours of my week, and I was usually completely destroyed by the context switch. I remember many days when a deadline was very close, and I absolutely did not want to do the interview, but had to.
In general some days I was in good mood, some in bad mood and even though I tried to make the interview as objective as possible, I am absolutely sure I failed. Keep in mind, one interview has 5 interviewers and guess what they had bad days too.
From those 500 interviews, I was certain for about ~5 cases, for the rest I could not say if it was my fault, or the candidate's fault that the interview went south, in that case I pretty much voted as the majority was voting.
I still feel very bad about people's livelihoods being decided by such a shitty process; there is absolutely no way that in one hour I can find out if someone is good. I knew that, my managers knew that, their managers knew that, and yet we kept going..
I feel that adding more time/people to the process only increases the variance, adding more structure to the process only increases the bias (like we get people who can pass interviews, but not do their job).
A one hour interview required a minimum of 30 minutes prep to read the resume, etc and generate questions. Then at least 30 minutes to write up my findings in a way that would allow me to participate in the roundup, which could be up to a month away if I was the phone screen. The roundup itself accounted for at least another 30 minutes. So you were 2 hours and 30 minutes into it assuming everything goes perfectly. It's a huge time suck.
But beyond the time that I lost, it's a very difficult and crushing experience to have a candidate fail and to be the one that says "no" (and that happens way, way more often than "yes", especially if you are interviewing early in the pipeline). But it's equally disheartening to see a spark of something in an interview, tease it out, imagine that the person could be raised up to become a good developer, only to have the candidate get shot down by all your colleagues because they didn't have a similar experience.
I disliked losing time during the day, but I really disliked the emotional burden of interviewing and saying "no." I empathize with the person on the other side as I've been there.
In some cases when there is "no" it comes with clear instructions like: hey you need to know v8 more or understand hotspot better and try again in 6 months, but then there were those cases where the committee was incapable of giving feedback.
Making sure that developers have the "needle in a haystack" solution that you're looking for almost guarantees that you'll weed out the most talented folks -- the ones who are programming generalists that can reason about and solve any kind of programming problem.
I believe these run into legal grey areas because they have to be obviously relevant to the job in question, which likely leads to the tests getting progressively more domain-specific as time goes on.
Programming does seem to be a field in which general intelligence is a big predictor of success, whereas in a field like management, personality and general intelligence are probably more equally weighted.
Well yes. And that is what the current crop of "programming puzzle" interviews essentially are; the next iteration of attempting to measure people on a cardinal scale instead of an ordinal scale.
The key to remember is that evaluating on a cardinal scale is much more efficient than evaluating on an ordinal scale, so companies will do everything they can to transform the "secretary problem" into one where they measure absolute metrics instead of relative metrics.
Now, whether those metrics align with business value is a separate issue. All evidence suggests that they are largely independent of business value.
Basically a scientific experiment, have someone else ask and grade the questions, but those results were not considered for hire/nohire. Then 6 months after a bunch of people were hired we pull the results of the test and compare to their performance reviews. If the questions were a good predictor or future success we could use them on future interviews.
The overhead of this was more than we were willing to pay so we went back to HR's allowed questions.
That's ludicrous nonsense. The primary breakthrough of the theory of relativity was taking seriously what the math had been telling physicists about the universe for years. In hindsight, everything necessary was there before Einstein; it was just very deeply assumed not to be reasonable. Relativity builds on numerous pre-existing foundations and is very much in the "standing on the shoulders of giants" tradition.
https://en.wikipedia.org/wiki/Theory_of_relativity#Developme... , first paragraph
Either you've misinterpreted Adam Frank's work, or it isn't excellent.
I'll counter-cite Reflections on Relativity, which contains a lot of deep analysis of the mathematical history of relativity both before and after Einstein: https://mathpages.com/rr/rrtoc.htm which makes it quite clear both what he did, and did not do.
That's the sign of an expert, not a generalist.
But he wasn't ignorant of them. He knew the existing work. He was extremely well read in physics and maths. He wasn't a generalist!
If you get a group of random generalists to build your compiler, or your graphics engine, or your cryptography stack, they aren't going to know what they don't know, and it'd take them decades to come up to speed.
The amount of unique stuff in each subfield is not that great, and if you have a wide and solid base in different subfields, picking up the unique ideas and techniques isn't that hard or time consuming.
But after a generalist has it, he is a lot more productive and creative because he often applies stuff from other subfields to come up with better/novel solution to the problem at hand - something a specialist cannot do.
Most opportunities look for some high degree of specificity. For contract work or a pre-defined task, that makes sense but for companies that need a rich talent pool, I don't know why there appears to be little opportunity for generalists.
I don’t think anyone considers it a difficult barrier compared to getting into FAANG etc.
In short, it's costing Canada billions of dollars and has affected countless people's livelihoods.
That doesn't mesh so well with the modern era of employers hating spending a dime on training, even if it would save them a dollar, and employees wanting to jump ship every other year for that sweet sweet 10% compensation bump
Though in the US you will run into problems due to Griggs v. Duke Power Co.
One is purely statistical -- that if you offer any metric of success, and try to correlate it with any measure of intelligence, even if the two are actually uncorrelated, then having dead people in the sample (that is, people who score 0 on both) will increase the quality of the correlation. Applicable because when you eliminate people who score low on IQ tests and low on performance tests from the sample, the correlations tend to disappear in these studies.
The second is more intuitive, that intelligence is multifaceted and attempts to project it into a low-dimensional space will mask the inherent complexity, but will be capable of identifying individuals that score low across many dimensions. That leads to the condition in the first issue.
The third is probably the most controversial; that IQ tests in particular do not measure generalized intelligence, but measure how well you can do "drone work", so for places where performance can be objectively measured, it will tend to be in tasks that do not require creativity or initiative in unpredictable directions because those are not easily measured, so thus reflective of the same sorts of tasks that IQ tests measure.
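The first effect is easy to demonstrate with a toy simulation (all numbers here are made up): draw two genuinely independent scores, then pad the sample with "dead people" who score zero on both, and watch the correlation appear.

```python
import random

def pearson(xs, ys):
    """Plain Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(42)

# Two scores drawn independently: genuinely uncorrelated.
xs = [rng.gauss(50, 10) for _ in range(300)]
ys = [rng.gauss(50, 10) for _ in range(300)]
r_alone = pearson(xs, ys)  # near zero

# Pad the sample with people who score 0 on both measures.
xs_pad = xs + [0.0] * 150
ys_pad = ys + [0.0] * 150
r_padded = pearson(xs_pad, ys_pad)  # strongly positive, purely an artifact
```

The zero-zero cluster drags both means down and makes every live data point look like evidence of a positive relationship, which is the statistical trap described above.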
Instead of throwing tricky algorithm questions at candidates, I scour their detailed employment records for the most relevant experience for the first project. In other words, I'm looking for relevant experience rather than top-of-the-head algorithmic brilliance. In the interview, I pose our project problem, and the candidate who gives me the most impressive proposal to get it done gets a chance to solve it.
I hire from an international pool, often on Upwork, so I can start developers on a project basis.
If the developer does a great job, we hire him/her for another project, and so on. At some point, this becomes a full-time relationship, with stock options and other perks.
Using this approach, we value experience over "raw intelligence," per se, and we end up with a team of self-directed developers who are fabulous at delivering great finished products.
It's amazing how well this has worked out for us. I think there's an arbitrage opportunity to avoid coding tests and hire on this basis.
Also, just because someone can't come up with the coolest solution for your project doesn't mean they aren't a good developer. This is even more biased than algorithmic interviews, because algorithmic interviews have structure. I actually CAN solve a puzzle in 20 minutes. You definitely cannot solve a project in 20 minutes, or even an hour; it is impossible. The best candidates will be those who sit back, analyze the problem for days or even weeks, and come up with a good solution for your project. This approach is so infeasible for general interviewing that I don't even know where to begin. You are basically filtering out EVERYONE and taking the one person who knew enough upfront to accidentally solve your problem best...
To me, "impressive proposal" means "articulation of a workable plan", the components of which are relatively small, actionable, and take requirements into account, and which demonstrates a good understanding of risk. In fact, I'd argue that the ability to do this kind of top-level breakdown well is one of the best indicators of seniority. The downside is that doing it well requires knowledge of a very specific process, architecture, and technology stack. That is, if you're good at breaking down problems using stateful Erlang and OpenBSD running on bare metal with web clients, you might have an issue breaking them down using stateless C# on Azure with Android native clients. Some combinations are more compatible than others, of course, but any senior dev in one stack is going to have to recapitulate the learning curve of another stack before they can regain this superpower! (We may like to think that only architecture matters, but it is relatively rare that an architecture rendered in one stack is actually isomorphic to the same architecture rendered in another!)
I wish all jobs could just hire devs for short contracts then convert the keepers to full time. I'd be more than happy in that scenario because I know they'll want to convert me and now it's up to me.
For this reason, experienced developers from countries with universal healthcare coverage and a low cost of living, such as many Eastern European countries, are very attractive to many startups.
I wish I could always hire developers to start on a project basis, but that's just not possible for many (most?) of the best local candidates. Someone who is great who has a full time job at company A is very, very rarely interested in leaving for company B on a project basis.
I think the reason people do this is a mix of 1) not knowing what traits to look for 2) not comfortable with unstructured conversation 3) frankly that it takes work to evaluate each resume and research projects enough to have a useful conversation.
For me though, how they achieve success at past work is the best indicator of future success.
You'll get the talented, flexible people that can get things done.
1. These "dev gate" programming challenges are filtering out senior devs, talented devs, creative devs, etc.: people who would be great at the role.
2. There are people applying for these roles who can't knock out a decent Fizzbuzz solution (in any amount of time).
3. For many roles, there's a flood of applicants
Any solution to this must address all three of these things simultaneously, which seems astoundingly difficult.
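For anyone unfamiliar, Fizzbuzz (point 2) is about as low as the bar goes; a sketch in Python:

```python
def fizzbuzz(n):
    """The classic screening question: multiples of 3 -> "Fizz",
    multiples of 5 -> "Buzz", multiples of both -> "FizzBuzz",
    anything else -> the number itself."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# The traditional 1..100 run.
lines = [fizzbuzz(i) for i in range(1, 101)]
```

That a nontrivial share of applicants can't produce something like this, while the same gate also frustrates strong seniors, is exactly the tension in points 1-3.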
Next is a five minute phone screen. We're a Java shop, so I ask them something dumb, like "what's the difference between public, private, and protected?" Something any Java dev would know; I'm just trying to find out if they have ever actually used the language.
People that pass the phone screen get a Skype interview, where they write code in an IDE. The first half hour or so is chat and trivial problems like "sort this array" or "return true if this String starts with a letter between A-Z, inclusive." They're allowed to Google and use the standard libraries.
Finally, we have a "close to real world" problem for them to work on. It's a standalone, mostly-toy CRUD application, and we'll ask them to add a feature that represents the kind of work they'd be doing. Again, they have their IDE of choice, Stack Overflow, etc.
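The "starts with A-Z" style of screener is a few lines in any language; in Python (as a stand-in for the Java this shop actually uses, with an invented function name) it might be:

```python
def starts_with_capital(s):
    """Screening question: does the string start with a letter
    between A and Z, inclusive? Empty strings return False."""
    return bool(s) and "A" <= s[0] <= "Z"
```

The point isn't the cleverness of the answer; it's whether the candidate can turn a one-sentence spec into working code at all.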
I don't think we've had a bad hire since we started using this process. Have we turned aside some all-stars that interview poorly? Maybe, but the team we've built is really good at what they do, so I'm pretty happy with the results.
We have hired senior devs that don't know Java. One of our team leads was a C# guy for a decade or so, but he was smart and available, so we scooped him up. Java is easy to learn. Programming well isn't.
My first job using C# was in 2010, and I had been programming in Java since 2002 at that point. I was productive pretty much from the first day, and not because I'm a genius; C# is that similar to Java...
I suspect that if you'd hired a PHP or Python expert, they would have taken more time to get used to Java (not that that would have meant you shouldn't hire them!)
But really, most of the C-family of languages are similar enough that you can kind of hack your way through pretty quickly, and become mostly-productive in a few weeks.
> I have a degree in physics and 20 years of programming
Depends, do you mention your 20 years of programming on your resume?
I do have Java, Perl and C++ from SF City College - so I guess that counts.
Similarly, a candidate could have graduated top of their class from the most prestigious university in the world, but if they've never been able to hold a job for more than six months, that tells me something.
I still don't know Java that well, though ;-)
Instead put folks on trials and keep the good performers.
A few years ago I instituted an "interviewing code of conduct" for my teams with a few tenets that we've refined over the years, but the first one was "We will treat every candidate with respect and empathy in our interview process". The team has adopted some attitudes and techniques to do so while also not compromising on the talent or skills we are looking for. We've gotten very positive feedback on our interviews, so we think it's working.
1. Grab a sub-set of the competencies that you NEED to have https://sijinjoseph.com/programmer-competency-matrix/
2. From that subset, setup questions for the different levels
For example, for the VCS (source code version control) competency we have:
Lvl 1 questions:
- What is the difference between Git and GitHub?
- What is a branch?
- What is a pull or merge request?
- What is a commit?
Lvl 2 questions:
- Mention other VCSs besides Git
- Mention good practices for a commit (short title, descriptive body, etc.)
- Describe one branching model (e.g. gitflow)
- What is a cherry-pick?
- Describe good practices for a pull/merge request
Lvl 3 questions:
- What is the difference between a merge and a rebase? When do you use each?
- What is git bisect? How do you use it?
- Git vs. Mercurial or another DVCS
- What is a partial clone?
- What is a Git submodule?
- What are Git hooks?
3. Split your 1-hour interview into two parts: coding and Q&A. For the coding part, ask them to build something, on their machine, in the language of their choice, sharing their screen. The problem should be sized for about an hour and split into steps (I usually do 6). With this problem you will score coding ability, cleanliness, defensive coding, and similar things. I usually score between 1 and 3 (1 - below average for what I have seen, 2 - above average, 3 - amazing/impressive).
The second part (about 30 minutes) is Q&A: use the questions to check whether they are a 1, 2, or 3 (or maybe 0 if they cannot answer a Lvl 1 question).
Then combine those scores any way you want (average, or a weighted average with extra weight for the code or for some question) and see what percentage of the max score the candidate gets.
The key during the Q&A is that you start with one or two Lvl 1 questions; if they answer, you go to Lvl 2 questions, and if they get those, you go to Lvl 3 questions. The idea is to never make the candidate feel like they don't know. With experience you can get very good at predicting what level of questions a specific candidate will answer.
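The score-combination step might be sketched like this (the double weight on coding and the competency list are invented for illustration, not part of the scheme above):

```python
def candidate_score(coding, qa_levels, coding_weight=2.0):
    """Combine a 1-3 coding score with per-competency Q&A levels (0-3)
    into a percentage of the maximum possible score. The extra weight
    on coding is one arbitrary choice among the options mentioned."""
    total = coding * coding_weight + sum(qa_levels)
    maximum = 3 * coding_weight + 3 * len(qa_levels)
    return 100.0 * total / maximum

# e.g. coding 2/3, and Q&A levels for three competencies
# (say VCS = 3, testing = 1, design = 2):
score = candidate_score(2, [3, 1, 2])
```

A perfect candidate (3 on coding, 3 on every competency) lands at 100%, and everyone else falls somewhere below, which gives you the comparable percentage described.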
I don't have a silver bullet solution to this, but I think a few things go in the right direction:
1) Candidates should have to write code in interviews. But they shouldn't have to solve puzzle problems with "gotcha" solutions. If there's a specific trick that requires an "aha" moment, you are really testing how well a candidate solves puzzles under pressure, not how they code.
2) Test candidates on what they are best at. If someone has been working in C# for the last 5 years, don't ask them to whiteboard in Python, which they used in college. Picking up new languages/frameworks is quick for someone who knows what they are doing.
3) Offer candidates a choice between an in-person interview or a take-home coding test, the latter of which would take more time. Some candidates don't want to deal with doing a 6-hour take-home coding problem. Other candidates suck at whiteboarding under pressure. So more options seems better.
There are exceptions to this. You might have a unique problem and the budget/resources to hire a rockstar for a specific role. Desirable companies willing to dole out big salaries do this all the time. But much more often, I see companies offering average salaries for very, very specific roles. One company near me told me that I was one of the best candidates they'd seen, but they were looking specifically for someone with 1+ years of Java experience. I could have picked up the basics of Java in a month and been fairly proficient in 2-4 months. Meanwhile, they are still looking to fill that role and it's been over 2 years.
I've had a few companies that insist on this, but I haven't had a period of unemployment where I have the time for it. Good developers tend to be/stay employed, so if you are looking to hire senior devs, you probably need to consider their schedules. Unless I'm desperate to leave a job, I can only make so much time for interviews.
I’ve never come across this myself, but I always figured that sort of interviewing would correct itself over time - if you ask questions that nobody is going to know the answer to, eventually when three years have gone by and you still haven’t hired anybody, you’re going to have to adjust your tactics.
Type I error == false positive == hiring someone who isn't qualified
Type II error == false negative == failing to hire someone who is qualified
I think a lot of companies are obsessed with minimizing Type I error. They really don't want to hire bad developers. As a hiring manager, your ass is on the line if you make too many of these mistakes (when your own manager asks why you're paying 100k/year for someone who you're telling them isn't very good). And perhaps you'll have to fire someone, which is painful for most people to do.
On the other hand, the costs of Type II error fly under the radar. Your manager comes to you and asks why you haven't hired anyone yet. "Well, I haven't found someone qualified yet" is the only answer you need to give. So it's easy to avoid culpability, and it's harder to measure the costs associated with the work not getting done. (Generally, it's easy to measure a developer's cost: their salary, plus benefits, plus the time they consume from others multiplied by those employees' compensation. It's much harder to measure the value of their output in most cases, unless they are working alone on a revenue-generating project.)
I think there's a problem (at many companies) of people being held accountable for Type I but not Type II error. And so naturally, people worry more about Type I.
On a tangential note, I once had a boss who made a good point to me. He encouraged me to take risks in hiring, but he said that the worst person to hire is someone who is mediocre. If someone is really bad, it's easier to fire them. If someone is really good, then everyone is happy. But if someone is bad, yet not bad enough to fire, then they stick around and cause the most damage.
I've seen many places sit on job reqs for a year or two after I applied and would have done quite well. Sure, I may have needed to study a week or two. Then they'd have a solution ~11 months earlier.
There's a better way now that a lot of people have open source contributions: look at their open source work, especially if they're involved in public discussions as well as in code. Then you can follow up and ask them about it.
Open source behavior is not necessarily quite the same as workplace behavior, especially if their participation was unpaid, and they had limited time, and the team dynamics were different, but there can be a lot of overlap.
(Simple example, using someone famous: if you did not know anything about Linus Torvalds, and didn't trust his resume and references, you could learn from looking at his open source participation that he knows how to code, has managed ambitious projects with cat-herding, is knowledgeable and conscientious, historically has had a very frank manner that some might find discouraging, and has recently reflected on that manner and is modifying it. If that isn't enough, start discussing a technical topic with him that doesn't seem to involve proprietary IP.)
One engineer taking a quick glance at open source participation, and then asking questions about that, is arguably more useful than the engineer spending the same amount of time asking some contrived question and sitting in a stuffy room while the candidate does a theatre performance under conditions that aren't representative of real work.
Also, before considering dumping many hours of take-home makework programming on someone, it's respectful to first take a look at their open source. (Especially with a person who does open source on the side. There's an extra frustration with take-home work: they probably have a backlog of unpaid open source things they'd like to spend time on, and the take-home is hours of similar work in their free time, but it gets thrown away.)
I hate wasting time on candidates that can't answer rudimentary programming questions and I hate being in interviews where I'm asked questions which I feel are silly or irrelevant. I can't even trust my results sometimes because there's always that feeling that perhaps a candidate missed a question because I was a poor communicator.
Listening in on interviews with my teammates has really opened my eyes to the random nature of hiring. I'm certain I would not have passed the bar if I had been interviewed by Team Member B instead of Team Member A.
This sentence shows a level of awareness sadly uncommon, that I don't believe I've encountered in... thousands of interviews.
I have been in charge of smallish engineering teams for around 5 years (as head of engineering at different startups). In the past, I used a timed 3-question HackerRank screen as a first automated filter.
The problem is that, by testing for "fundamentals" (really algorithms and data structures), I skewed my hiring pool toward the people who were best at those.
Who are the people best at solving those kinds of puzzle problems (like the ones on HackerRank, Codility, CodeFights, Codewars, etc.)? Most likely junior developers who are in university, or recently graduated, and spent their free time at university on coding competitions.
The problem is that these are a very particular type of programmer: they are super-effective at writing tiny one-off code to solve a specific "closed" problem. They usually don't care about testing, readability, interactions, maintainability, etc., given that they optimize for time and for "passing the test cases".
Because of that, I suddenly had about 10 devs who were very good at algorithms but very junior with regard to software engineering, architecture, maintainability, business understanding, etc.
Nowadays I have developed an automated challenge that 1. requires coding, 2. requires HTTP request interaction, 3. requires thinking, and it allows me to filter out people who really don't know what they are doing ( https://paystand.ml/challenge/ ).
WRT the 1-hour interview, I have always used a modified version of the programmer competency matrix ( https://sijinjoseph.com/programmer-competency-matrix/ ) to be as objective as possible, and to be able to score developers across a wide range of skills, not only "they don't know how to solve the problem of returning a correct sub-tree from a BST within certain ranges". Sure, algorithms and data structures are part of the requirements, but even knowing little of them should not disqualify you.
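For context, the BST range problem mentioned above is usually some variant of trimming a tree down to the nodes within a range [lo, hi]. A minimal Python sketch of that idea (names here are my own, not from any particular interview):

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def trim_bst(node, lo, hi):
    # Keep only nodes with lo <= val <= hi. Because this is a BST,
    # a too-small root means its whole left branch is too small as well,
    # so entire branches can be discarded without visiting them.
    if node is None:
        return None
    if node.val < lo:
        return trim_bst(node.right, lo, hi)
    if node.val > hi:
        return trim_bst(node.left, lo, hi)
    node.left = trim_bst(node.left, lo, hi)
    node.right = trim_bst(node.right, lo, hi)
    return node

def inorder(n):
    # In-order traversal, to check the result.
    return inorder(n.left) + [n.val] + inorder(n.right) if n else []

root = Node(3, Node(1, Node(0), Node(2)), Node(5, Node(4), Node(6)))
trimmed = trim_bst(root, 1, 4)
print(inorder(trimmed))  # [1, 2, 3, 4]
```

The point stands either way: whether or not someone can produce this on a whiteboard in 45 minutes says little about the rest of the matrix.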
In fact, I’ve been thinking how I would approach interviewing candidates. It seems to me a much better way to interview would be to have candidates do a bit of a show and tell with a project they found challenging. Come in, bring your computer, show me what you’ve built. I’ll do my best to understand it, then I’ll ask questions about it: which aspect was the most technically challenging and why, what was your favourite part to work on, least favourite and why, what would be your scaling strategy and have them go into detail about that, etc. I’d much rather hear how a person answers questions about a project they’re very familiar with than have them do a bunch of arbitrary code problems. What’s wrong with this approach?
Further, there are _tons_ of very good engineers that cannot do 'a bit of show and tell' easily. They basically shut down. Doing the process you outline will optimize for talkers, and the industry is full of people that can talk but can't do.
Don't take those jobs if you want to be hireable I guess? You'd have to be joking to think a HackerRank quiz is going to help you glean any information about their expertise. Such expertise, by the way, is something I'd like to know about. Figure out a way to talk about it. Tell me the situation (you can't disclose real details) and then tell me about a hypothetical project with similar parameters in a way that doesn't break your contract. Or make up an imaginary project -- if you're actually a good programmer you can come up with something that sheds light on your technical ability and understanding.
> Further, there are _tons_ of very good engineers that cannot do 'a bit of show and tell' easily.
I'm sorry, soft skills are essential at my imaginary company. You need to be able to explain your thinking at an abstract level. I don't care if you can't do whiteboard problems, I get that - I can't either - but you do have to be able to walk through problems with other people, especially problems that you know and understand well. If it's the interview environment that's the problem, I'd like to make that process less formal too, so people don't feel so nervous about the whole thing. But if you can't explain a problem to somebody one-on-one, then you'll probably be a very annoying person to work with.
> Doing the process you outline will optimize for talkers
It would optimize for "explainers", not "talkers". Talkers are usually people who can't explain something, and they're fairly easy to weed out from the people who actually know their stuff.
They do have a problem with doing it in an adversarial and stressful situation like an interview. I've seen it happen to engineers I _know_ are good.
Also, over years of actually engineering hiring pipelines for software engineers, I've seen no evidence that it's easy to weed out the good talkers. Quite the opposite: I think it's one of the hardest things about designing these pipelines.
> Don't take those jobs if you want to be hireable I guess?
My point about this is not on behalf of the person being hired; it's about the person doing the hiring. Your process creates false negatives, and false negatives in hiring pipelines are nearly as expensive as false positives.
Unfortunately that was the company that was all 25-year-olds and had a terrible commute, and when they heard my previous salary all interest dried up. But all interviews should be like that imo.
Most egregious example: I interviewed at a household name company, whose entire web stack was built upon a technology I created, and they still gave me a whiteboard test.
I don't disagree that many companies do interviews wrong. But there's also a huge number of "senior" engineers who simply shouldn't be senior in the first place. Perhaps it's a result of companies handing out promotions to keep folks around rather than to recognize talent.
But I can't tell you what the sleep command is in any of these languages. I look it up every single time :P
I'm good with the sleep command in my many different languages. It's dealing with dates, in any language, that gets me...
I suspect 10% of Stack Overflow hits are developers remembering how to parse a date in some form or another.
If the question is from somebody with extensive experience in JS, that's a clear red flag (I don't have any extensive experience in JS). If it's from somebody with passing experience, it's not that bad. That said, I have no idea how to call sleep in any of the ~4 languages I'm currently using either.
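For what it's worth, both of these are one-liners once you look them up; the point is that nobody keeps them memorized across languages. A quick Python sketch (Python chosen arbitrarily, since no specific language is at issue in the thread):

```python
import time
from datetime import datetime

# The perennially forgotten sleep call: block the current thread ~100 ms.
time.sleep(0.1)

# ...and the date parsing everyone re-googles; the format codes are
# exactly the part that differs from language to language.
parsed = datetime.strptime("2019-03-04", "%Y-%m-%d")
print(parsed.year, parsed.month, parsed.day)  # 2019 3 4
```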
We run about as practical a process as I can imagine: a short take-home exercise where you use your own environment to build a very small project. You're even free to have a starter template ready ahead of time, so you can focus on the question asked.
In house, we have a small project you work on in our code base. It might end up requiring only 20-30 lines of code for the day, and most people who come in are familiar with our stack.
The number of people who just fall on their faces is astonishing. Many people are fine with greenfield development but have zero ability to deal with an existing code base with places stubbed out with //TODOs, or to pair with people on the team.
I'm convinced people are getting turned down not because of the interview process, but because they just aren't that good at the end of the day.
Yes. I do not need a bunch of code wizards; we aren't solving whether P = NP daily. But that doesn't mean I should accept people who can't code in hopes of them growing into fine developers. I can do that with one person, but not when bringing on 8.
Do your engineers work out of a cave with no internet and code on cave walls?
You're not hiring me for my encyclopedic knowledge; you're hiring me because I get the job done quickly and effectively, using a proper design and clean, tested code.
If you're working in a small company or a startup, then you don't need BigCo senior engineers, you can get away with people who haven't developed these skill sets. However if you have intentions of growing, either in your system size or in your headcount, you shouldn't discount these engineers over minor details like this. Ask them higher-level design questions if you actually want to test their abilities, otherwise you're looking at the wrong metrics.
Then they should apply for a job where their skills are relevant. If you're applying for a job where you're writing code (like all of the ones I described), you'd better be able to code.
Edit: and we let you Google whatever you need in the interview. If you can't figure out how to clone an array (an implementation detail of the interview question) on your own, I don't know what role you think you're applying for.
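To illustrate the kind of implementation detail meant by "clone an array" (a generic Python sketch; the interview question's actual language isn't stated):

```python
nums = [1, 2, 3]
alias = nums        # not a clone: just a second name for the same list
clone = list(nums)  # a shallow copy: a new list with the same elements

nums.append(4)
print(alias)  # the alias sees the mutation: [1, 2, 3, 4]
print(clone)  # the clone does not: [1, 2, 3]
```

A minute of Googling surfaces this distinction in any mainstream language, which is exactly the skill being tested.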
That's fair. No sense in bringing in architecture astronauts who can't translate their dreams to reality.
> we let you Google whatever you need in the interview
Right, rejecting someone who can't Google things properly is the correct choice. That's a skill that's useful at all levels.
I had one guy get very angry with me when I asked him a programming question; he insisted that that just wasn't what a senior dev did.
Moreover, keep in mind that for some orgs a Senior dev is actually someone that does more systems design, i.e. an architect. Make sure you ask them those sorts of questions before discounting their abilities. You don't want to be having an architect refactoring your frontend, that's a silly way to spend your money.
But I couldn’t find all matching subtrees within a binary tree at the whiteboard in an hour (45 minutes, really). I haven’t done a formal survey, but my extensive experience and discussions with a lot of devs tell me that this really isn’t about fizzbuzz or other simple things.
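For a sense of scale, one plausible reading of that problem (find every node whose subtree structurally matches a target tree) takes roughly this much Python; a hedged sketch, since the exact problem statement from that interview isn't given:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def same_tree(a, b):
    # Structural and value equality of two binary trees.
    if a is None or b is None:
        return a is b
    return (a.val == b.val
            and same_tree(a.left, b.left)
            and same_tree(a.right, b.right))

def matching_subtrees(root, target):
    # Collect every node in `root` whose subtree equals `target`.
    if root is None:
        return []
    matches = (matching_subtrees(root.left, target)
               + matching_subtrees(root.right, target))
    if same_tree(root, target):
        matches.append(root)
    return matches

# The shape Node(2, Node(3)) appears twice in this tree:
tree = Node(1, Node(2, Node(3)), Node(4, Node(2, Node(3))))
target = Node(2, Node(3))
print(len(matching_subtrees(tree, target)))  # 2
```

Straightforward on a laptop with tests; quite another thing cold, on a whiteboard, with 45 minutes and an audience.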
It’s time to stop pretending this is about weeding out people who can’t code. Companies by their own admission set a very high false negative rate.
Then they claim there is a shortage and successfully lobby the government to create a shadow immigration system putting tech companies in charge of who gets to work in the US and the conditions under which they are allowed to remain.
I think Software Development has a problem in that there is no space for a middle-ground career path. I typically compare myself to a studio musician. I'm never going to be a Rock Star, but you can plug me into any band and we'll make a good recording. That's why I like contract work... nobody cares about my age or history, I can just hit my marks and do what I love. I'm not Senior, but I don't know if I really ever want to be either.
If these highly competent people can't answer your questions, you really need to take a second look at your questions.
Have you worked in the industry for longer than 2 years?
> We don't really use arrays in Scala
These all seem like genuine things. In more complex companies, people may use more exotic data structures and/or have abstractions around data access.
Silly comment overall.
I don't care what kind of abstractions you have at your current company. We don't (/didn't) have those abstractions. If you can't show me that you can do CS101 basics (appending to an array?) in the language that you claim to write day to day, how can I trust you to be able to write the code that we describe in the job description?
Probably because most professionals are not fresh out of CS101 & most data structures aren't arrays or exposed as arrays.
Lists, collections, maps, trees, home-rolled data structures better suited to the domain: how do you not get this?
Your opinion sounds like that of someone just out of a bootcamp. You're putting all the weight on irrelevant syntax issues; a lot of devs use multiple languages day to day, among other things.