
It is, regrettably, not the case that all people who work as software engineers can e.g. code a for loop which counts the number of lower-case 'a's in a string. This is true even if you spot them the syntax for a for loop and finding the Nth character in a string. It is equally true if you allow them to complete the task in isolation, at their own computer, given an arbitrary amount of time to complete it.
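
For concreteness, the entire task amounts to something on the order of the following (a minimal C sketch; the language is immaterial):

    #include <stdio.h>

    /* Count the lower-case 'a' characters in a string. */
    int count_lowercase_a(const char *s) {
        int count = 0;
        for (int i = 0; s[i] != '\0'; i++) {
            if (s[i] == 'a')
                count++;
        }
        return count;
    }

    int main(void) {
        printf("%d\n", count_lowercase_a("banana bandana"));  /* prints 6 */
        return 0;
    }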

I am, naturally, constrained from saying "Here's a list of three of them." It would not be difficult.

It is also, regrettably, not the case that all applications one receives for an advertised position of Senior Ruby on Rails Programmer are from people who have ever opened a command line.

Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.

I remember, rather vividly, the first time I figured out that an engineer couldn't program. I was attempting to tell him where a value was being assigned in a program. After failing to do so over email, I went over to his desk and asked him to navigate to the file at issue. He was unable to do so. I told him I would do it, assuming that he was unfamiliar with the directory structure in that part of the program, and opened the file. I then said "So you see, the assignment is made in the fooBar subroutine." He couldn't find the fooBar subroutine. I said "It is the third method on your screen." He couldn't find it. I said "It is this one, here, which I am pointing to with my finger." He said "OK, that one. Where is the assignment?"

The fooBar subroutine was one statement long.

A coworker, having overheard the conversation, stopped at my desk later, and explained to me that, if I had pressing engineering issues, coworkers X and Y would be excellent senior systems engineers to address them to, but that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."

The industry has many destructively untrue beliefs about hiring practices. That there exist at least some people who are not presently capable of doing productive engineering work is not one of these beliefs.




> I am, naturally, constrained from saying "Here's a list of three of them." It would not be difficult.

I can't. Over my career I have worked with hundreds of engineers and scientists who wrote code. I know of zero people in positions where they have to write code at all, let alone as software engineers, who cannot do that.

> Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.

I spent my first 9.5 years at Raytheon, a company that is very light on the detailed technical aspects of interviewing and very heavy on the behavioral aspects. There are not committees, just the hiring managers and the interviewers. You know, the system that is supposed to have companies awash in these people who supposedly can't construct a basic for loop but can bullshit a good game. Never met one. Not at Raytheon, or Northrop, or Lockheed, or Boeing, or any of the other companies I worked with during that period.

Sure, most of them were what this community would disparagingly label as "9-5 engineers", but they all produced code that basically worked and provided some value to the program they were on. Normally if there was a problem it was a political or attitude problem, not one of technical ability.


The Matasano hiring process (to which this article refers) was designed in part because of this bad experience you've never had. I've seen it happen repeatedly over my career: people who'd pass interviews and then, a couple of weeks into the job, over 15-20 minutes of face-to-face discussion, prove unable to comprehend the idea that a C pointer needs to point to valid memory.

That's what's so mind-blowing to me about our terrible process today. It aggressively tries to find and dispatch unqualified candidates, so much so that it forcefully repels talented people. And it fails miserably at screening out people who can't code! It is the worst case scenario of processes.

It's also really important to remember that the problem is People Who Can't Code, not "People Not Smart Enough To Code". Time and again I've "worked" with people who could very effectively critique my own code, who could participate in design meetings down to the bits-and-bytes layer, and, having taken responsibility for actually implementing some portion of those designs, could not commit usable code to save their life. I don't know what the deal with these people is, but they are out there: plenty smart, totally fluent, and completely ineffective.

I was talking to Erin this weekend about the phenomenon and realized that these people not only exist, but are actually favored by the way the market works. A smart person can last many months in a dev role before anyone realizes they're ineffective. Then, it takes many, many more months to manage them out of the team. By the time the story plays itself out, the bad team member has more than a year on the job, and when they parlay that resume line item to another prominent job elsewhere, their previous teammates aren't outraged but instead relieved. They end up with gold-plated resumes and tons of credibility.


The software industry is broken for older engineers. Look at the typical algo/CS heavy interview today. It favors and selects for these characteristics:

1) Young, fresh graduates who still remember the CS material well, or those who have a lot of free time and can study for these interviews. (For example, look at this Quora link: http://www.quora.com/How-much-time-did-you-spend-preparing-f... The first reply mentions spending 28.5 hours per week for 5 weeks, which is 142.5 hours total.) From a senior software engineer's perspective, that's a ridiculous commitment for something where an arrogant asshole interviewer can shut down a whole onsite interview process because they're having a bad day or have some favorite question they think is the One True Question for determining whether someone is a software engineer.

2) The software industry tends to overvalue youth. It has a very high starting salary, but also a very quick salary cap. Most companies have no technical ladder, and even at those that do, it's a joke compared to the management ladder. None of this is surprising: since the industry is so heavily biased towards hiring and rewarding youth, older experienced engineers are not valued.

3) Senior software engineers have heavy pressure to leave the software field. I'm feeling this now. I'm a senior at my current place and my salary is basically capped. There would be little to no benefit going to another company. And even if I wanted to, I have a family now and don't have the free time to study for software interviews. My options are basically go into management, consulting, or start a business. I have no desire to start a business or go into consulting because I simply don't like doing those business related things. I want to be an engineer not a business person. But if I just stay an engineer, I will probably get laid off someday due to ageism. And if I'm an old engineer who is laid off, job mobility is basically gone due to ageism and the CS heavy interviews that I don't have the time or motivation to study for. So the only safe path is management. A lot of senior engineers face this decision. Which again reduces the number of mentors and experienced engineers to teach the new younger engineers.

4) And doesn't anyone notice just how wrong the whole concept of studying for an interview is? There's a whole book-publishing industry dedicated to technical interviews, such as Cracking the Coding Interview. The technical interview has become like the SAT: many could just study to pass it, yet be horrible engineers.


Interestingly, the author of that book also wrote the top comment on the TechCrunch story, disagreeing pretty strongly with me about hiring processes. I got to talk to her a bit about it on Twitter:

http://www.conweets.com/tqbf/gayle/

(I'm having a pretty bad day, which I hope doesn't show through too much in that summary).


It seems to me that Google's perceived hiring issue is "we can't tell the candidates apart, because they all studied our interview questions they found online." (and not "our process is fundamentally broken to the core" like it probably should be).

In that context, "give take-home tests" and "ask better questions" sound like the exact opposite of good answers.

I think this is a problem that's unique to very large corporations. Not only do thousands of people interview there, meaning questions and answers are likely to be traded verbatim online, but there's also actually an incentive to cheat your way in: you can hide out for 1-2 years and acquire a hoard of cash and resume fodder.

Google has actually made this problem worse for itself, too. By overvaluing its interview process, it has made cheating even more lucrative. Cheat your way into even an offer there and you have something valuable to put on your resume!

I suspect you haven't seen a lot of cheaters on take-home tests before because at a company like Matasano, there's little incentive to cheat your way in. You certainly can't hide out in a corner doing no work there.


Bummer. Her career seems to be predicated on a correlation between algorithm tests and performance. Can't expect her to seriously entertain disagreement as long as the current model is profitable for her.

Google is no more a proof of the value of algorithm tests than Github proves the value of manager-free organizations.

I was a bit horrified at the acquisition coaching comment. I did not realize that happened but of course it makes sense. Can't manage an acquihire if your devs can't pass the algorithm tests at the big acquirers. Shame.


Your assumptions are false.

It is actually more profitable for me if I say that these algorithm tests are BS and completely study-able, and even the worst engineer can pass them. That easily translates into "here! buy my book and even YOU can get a job at Google!"

It's actually a much harder sell when, really, it's more like studying helps someone perform better, but only to an extent.

Oh, and the acquirers encourage studying and prep, just as most of the top tech companies do for their normal dev (and non-dev) candidates.

To be clear: my experience is that the algorithm interviews, when done properly, work reasonably well to identify intelligence/problem solving and coding skills. Those are more important at some places than others. They can work well for certain places -- but not for all. I do not think that all companies should use this process. It's not right for all places, but it's reasonably good for some.

As for Google being proof of the value of algorithm test: I think you're pulling that from a twitter conversation very much out of context.

Context: Some people were arguing that the algorithm interviews offer zero value - that you'd be at least as well off, if not better off, hiring at random.

If they truly believe that (and they appeared to be sticking to this argument), then it's absolutely relevant that not just Google, but basically all the top tech companies, use this hiring process. Could a randomly selected group of engineers build the technology at companies like Google, Facebook, Amazon, etc? I suppose that's what they must believe, and I find that pretty absurd.

If they had argued that these interviews sort of work, but aren't nearly as good as some other interview processes, then you're right that the comment about Google and other companies wouldn't be relevant. And I also wouldn't have brought it up.


"To be clear: my experience is that the algorithm interviews, when done properly, work reasonably well to identify intelligence/problem solving and coding skills."

Given that there is no correlation between job performance and interview score, are you saying this based on your gut instinct, or do you have objective data somewhere?

My best estimation is that the Google interview process is just a glorified fizzbuzz presentation, and that the hiring decisions are more or less made by arbitrary factors, including the nebulous "culture fit".

While it may be amusing to work through the details of doing a merge sort on a linked list in 30 minutes, it's certainly not germane to the quotidian experience of an SWE at Google.


>> Given that there is no correlation between job performance and interview score, are you saying this based on your gut instinct, or do you have objective data somewhere?

That's not a correct assumption. I suspect that you are referring to the (infamous) Google studies on this that were alleged to show this. The articles about this got many details wrong, and the study was fundamentally flawed. Remember: they only studied the people who did very well. That's like finding no link between exercise and health when you only study people who run regular marathons.

Merge sort on a linked list is a bad interview question, so I agree with you there. Good questions are problems that actually make you design a new algorithm.


It may be that the studies were flawed, and that the reporting was bad, but that flavor of study has been replicated in other venues besides Google. Given that, it essentially comes down to culture being the deciding factor in these interviews. Given the stark cultural skew at Google, it's possible that the pseudo-analytic SWE interview is actually masking systematic bias in the hiring process.


The question for Google is whether Page, Brin, Bak, Buchheit, and Dean passed an algorithm interview before building Google's core technologies. Hiring doctrine after the fact does not ensure a billion dollar company.

I disagree with your assertion that lambasting Google's hiring practices is an easy sell in the market. The last time I tried I was dismissed out of hand by a talent consultant.


> I disagree with your assertion that lambasting Google's hiring practices is an easy sell in the market. The last time I tried I was dismissed out of hand by a talent consultant.

That... wasn't my assertion at all.


>And doesn't anyone notice just how wrong whole concept of studying for an interview is?

This has been going on for ages. A lot of coding bootcamps and similar things are just interview prep programs. A lot, I mean a lot, of people I have worked with can nail interviews but can't do real work for shit.

Unfortunately, to land a job you have to go through this process. Developing interview skills is a parallel track to being able to do actual work.


>A smart person can last many months in a dev role before anyone realizes they're ineffective.

How? If the person is not delivering stories it's pretty obvious. I've seen a few people get the boot after three weeks.


If your engineering org is 50-weeks-a-year structured around delivering stories and fire-fast and doesn't build in time for ramp-up, it has a good chance of spitting out ineffective hires. But most organizations --- and we worked with lots of them as a high-end engineering consultancy --- aren't structured like this. More importantly: it's disproportionately the larger organizations with bigger brand equity that tend not to be run entirely out of Pivotal Tracker, which feeds the resume effect.


At many large organizations, there are even political pressures that keep employees from being let go early in their employment. If a hiring manager says yes to a candidate and they go through training/onboarding, there's a certain amount of clout to be lost if you jettison that employee early on. It is, instead, better for the manager to make the employee appear to be productive, to make it seem like the manager did a good job in hiring.

Team accomplishments can be embellished, stronger team members can cover for the weaknesses of weaker team members, but admitting that a decision you made was incorrect gets punished. It's obviously better to hire someone competent in the first place, but the next best alternative, for the manager, is to live with their bad hiring decision.


I used to work at a company with thousands of staff, and the HR policy was pretty clear:

As engineering managers, we were expected to "manage out" about 10% of the lowest performing staff on a yearly basis. The way that worked was basically that the 10% under-performers were to be put on strict personal improvement plans, and were under no circumstances given raises at all, and it should be made clear that no raises would ever come their way unless they improved.

That was it. Sounded harsh if you were used to 10% a year raises, as many of the good performers got. Not so harsh for those who were used to living in fear of getting "exposed" and fired.

And as a manager, if I were to designate a lot of my team as under-performing come review time, it'd reflect badly on me, so managers also had an added reason to push for good reviews for their reports: you'd end up getting paid more yourself if your reports were helping create a good impression through rankings that you as a manager could trivially fudge. (We did "360 degree reviews", where the manager and report could each designate some co-workers to review the person, which was of course trivial to game for those who wanted to.)

There was also an implicit expectation of a bell curve. If you had an under-performing team, that'd be awesome news for at least some of the weaker members on your team. (Conversely, I several times had reports who got short-changed: I was told too many of them had been designated "above expectations", even though my manager and other people who had worked with them agreed with those ratings, and so several of them were ranked lower to get the "correct" spread. It was absolute lunacy.)

Unsurprisingly, come hiring time, I felt like we were under siege by an army of ineffective developers hoping to find refuge somewhere safe and hard-to-get-fired.

There are scarily many places like this for ineffective people to hide. Sometimes for many years.


"comprehend the idea that a C pointer needs to point to valid memory"

I ran into a C programmer who did not get the relation between pointers and parameter passing. It was some scary, awful code. The damn prep for that code review almost got me in a car wreck.
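
For anyone who hasn't bumped into this: the relation in question is just that C passes arguments by value, so a function that needs to modify the caller's variable has to take a pointer to it. A minimal sketch (my own illustration, not the code from that review):

    #include <stdio.h>

    /* Pass-by-value: the function gets a copy; the caller's variable is untouched. */
    void set_to_42_broken(int x) {
        x = 42;
    }

    /* Pass a pointer so the function can write through it to the caller's variable. */
    void set_to_42(int *x) {
        *x = 42;
    }

    int main(void) {
        int n = 0;
        set_to_42_broken(n);
        printf("%d\n", n);   /* still 0 */
        set_to_42(&n);
        printf("%d\n", n);   /* now 42 */
        return 0;
    }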


Is there a process in place to handle such people? Do they get asked to leave (shortly after being hired), or are they moved to a role which doesn't require programming?


I've been coding professionally for 15 years and only have one such story. I once worked with a guy who had supposedly been coding professionally for over 10 years, the most recent 5 of those in Java. He, in all seriousness, asked me how to sort a List. I can forgive not knowing the Collections interface in Java, but not being able to google it and figure it out as a senior developer and team lead is troubling.

Conversely, there have been plenty of people who I would call anti-workers. They technically accomplished "something" but the quality of work was so bad that the team would have been more productive without them. They were a net negative productivity for the team. At most companies I've worked at there was one of those.


The problem is not that these people "cannot code"; the problem is that they cannot implement even the simplest things when it cannot be done with existing libraries and frameworks they already know of, with designing any kind of problem-specific data structure or algorithm being completely out of the picture.

For example, I've been told that iterating over a list of objects and aggregating (min/max/sum) some of their properties would involve introducing a dependency on a local installation of SQL Server.
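
For the record, that kind of aggregation needs nothing more than a loop. A minimal C sketch with a made-up struct (no SQL server in sight):

    #include <stdio.h>

    struct item { const char *name; double price; };

    int main(void) {
        struct item items[] = { {"a", 3.0}, {"b", 1.5}, {"c", 7.25} };
        int n = (int)(sizeof items / sizeof items[0]);

        double min = items[0].price, max = items[0].price, sum = 0.0;
        for (int i = 0; i < n; i++) {
            if (items[i].price < min) min = items[i].price;
            if (items[i].price > max) max = items[i].price;
            sum += items[i].price;
        }
        printf("min=%.2f max=%.2f sum=%.2f\n", min, max, sum);   /* min=1.50 max=7.25 sum=11.75 */
        return 0;
    }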


"the problem is that they cannot implement even the simplest things when it cannot be done by using existing libraries and frameworks they already know"

If that's the bar, then I'd estimate that roughly 80% of programmers in the valley right now can't step over it.

This debate is filled with so much heat -- and so little light -- because everyone carries around their own subjective definition of competence. I've personally never met anyone working in a company who can't code at all, but I've met plenty of supposedly elite programmers who are adrift without a framework. Are these people useless? I don't know, but I know they're interviewing other programmers.

If I think about it long enough, I go in circles and I end up back at the thesis of the article: this industry is filled with some spectacularly judgmental people.


Can you be more specific about the debate to which you're referring? The one with the light/heat problem?


The subject of the whole thread: bad engineers, and the effectiveness of our various tests for detecting them.


What's the debate as you see it? Can you boil it down to two sides? I'm asking because, maybe we actually agree about stuff, or maybe we totally disagree.


From what I've read from you in the past (as well as the way you're mentioned in the article), I think we probably agree.

If I had to reduce the debate to two clear groups, it's that one group believes that there's a bright-line distinction between "good" programmers and "bad" programmers, and that the questions-at-a-whiteboard model is an effective way of discriminating the groups. The other group doesn't believe in partitioning the world into two clear groups.

It isn't debatable that there are a lot of people who can't write code. I believe that trivial phone-screen coding problems are pretty effective at weeding those people out. Beyond that, it's much more about candidate/position fit, and that's a more subjective question. The problem is that programmers tend to be uncomfortable with subjectivity, preferring instead to posterize the shades of gray -- that's how we end up with the apocryphal Google engineer who can't code.

In your stated example (person hired; can't fathom basic realities of C pointers), I doubt that it reflects a complete inability to be functional in some programming role -- just not the one you were needing. Most "Rails programmers", for example, would probably be hopeless in that situation, but perfectly capable of cranking out websites with Rails. I've seen a fair number of real-world programmers whose (lack of) knowledge of algorithms and data structures makes me cringe -- and I certainly wouldn't hire them for the kind of work I tend to do -- but I have to accept that they're effective within their niche.


We agree more than we disagree. Most importantly: the big problem I see with our industry is our inability to detect aptitude in candidates who don't "show" well in resumes and interviews; in other words, the opposite of the problem of being "too selective". It sounds like we agree on that.

On the other hand: I think the counterintuitive thing about ineffective programmers is that many of them are perfectly capable of reasoning through a CS problem in Java or C++ or Python on a whiteboard. But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.

My thoughts about why this happens aren't yet well-formed. I just want to contribute the perspective that the "bad programmer" problem isn't just about people who can't buzz a fizz.


> On the other hand: I think the counterintuitive thing about ineffective programmers is that many of them are perfectly capable of reasoning through a CS problem in Java or C++ or Python on a whiteboard. But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.

Software engineering is a performative art. Sure, there is a heavy intellectual component, but you have to actually do software engineering to get good at it. You can't just study the theoretical aspects. It's like how you can't learn Haskell just by reading a book and some websites. And yet, someone could go through school for CS and slide by without actually doing any real engineering / programming. I had a CS graduate TA (for a CS discrete math class), at a very well regarded school, who literally did not know how to compile a C program, full stop. He was very into the pure-math side of CS...but still.

As an analogy - you can imagine that someone who has studied music theory for years, has perfect pitch, knows how all the various branches of music relate to each other, which artists influenced whom, etc...could still be shit at playing musical instruments or creating new music.


That points to another problem: Computer Science is not Software Engineering. Yet a lot of the time we insist on hiring CS people for our Software Engineering jobs. It doesn't help that there's not really a clear definition of what good quality software engineering is.


It's an interesting problem. Before I started interviewing I expected to get a ton of really good candidates and have trouble being selective for lack of being able to find anything wrong. Boy was I off the mark. The number of people who don't pass the fizzbuzz test is quite high. I think if we tried to give the "sleeper" candidates a break we'd let in so many bad ones we'd quickly get overwhelmed.


We thought so too. But the (nearly) resume-blind process we settled on quickly converged on a nearly all sleeper candidate pipeline; in several years, we hired I think just 2 people that had our field on their resume. We didn't just retain all those candidates: we were knocked on our asses by how well they performed.

Recruiting is two problems: outreach and qualification. We did novel things on both fronts. For this thread, I just want to be pointing out: the changes we made to qualification were critical, instrumental, fundamental to our success. Most of our best hires could not have happened had we qualified candidates the way we did in 2009, and the way most firms do today.


I have a close friend that joined matasano about 2 years ago. While a good chunk of the readership here is familiar with some of those changes, I think your knowledge is an example of where the 'lessons learned' should be shared as widely as possible. I'm no marketing expert but if someone can find a way to make some of your ideas go viral that would be great for the industry.


Can any of this be attributed to salary negotiation or "better" options being available to those already established in the field, such that a large commitment seemed unnecessary to them? For those who were established, where did they drop out of the pipeline and why?


FizzBuzz is easy iff you are aware of the % operator. I only know how to test for divisibility (cleanly) because I looked it up in the course of doing Project Euler. I could easily see there being developers who worked exclusively on non-mathy business logic and never had to test divisibility, which would explain floundering on FizzBuzz. (Then they should be able to reset counters every 3 and 5 iterations or something, but that's a tad more involved than the canonical solution and might be regarded by some interviewers as incorrect.)
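
For reference, the canonical modulus-based solution is roughly this (a minimal C sketch):

    #include <stdio.h>

    int main(void) {
        for (int i = 1; i <= 100; i++) {
            if (i % 15 == 0)        /* divisible by both 3 and 5 */
                printf("FizzBuzz\n");
            else if (i % 3 == 0)
                printf("Fizz\n");
            else if (i % 5 == 0)
                printf("Buzz\n");
            else
                printf("%d\n", i);
        }
        return 0;
    }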


Fizzbuzz is useful because people fail it even if you give them info about loop structures and info about mod operators.

It's only sometimes about people not being aware of a language feature. It's usually about people not being able to program.


fizzbuzz can be modified to remove knowledge of the modulus operator. "write a function that loops through an array and prints 'fizz' if it equals the parameter 'a', 'buzz' if it is -a, otherwise print the value of the element". Or something like that. This tests writing a function declaration, passing values in as parameters, iterating over an array, writing an if statement, and printing. That seems pretty basic and fair to me.
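
Roughly, that variant might look like this (a C sketch of the description above; the names are made up):

    #include <stdio.h>

    /* Print "fizz" when an element equals a, "buzz" when it equals -a,
       otherwise print the element itself. */
    void fizzbuzz_variant(const int *arr, int len, int a) {
        for (int i = 0; i < len; i++) {
            if (arr[i] == a)
                printf("fizz\n");
            else if (arr[i] == -a)
                printf("buzz\n");
            else
                printf("%d\n", arr[i]);
        }
    }

    int main(void) {
        int values[] = { 3, -3, 7, 3 };
        fizzbuzz_variant(values, 4, 3);   /* fizz, buzz, 7, fizz */
        return 0;
    }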


Are you a js engineer? It's hard to imagine someone could write c, c++, java, python < 3, or probably c# without cleanly understanding the modulus operator, because

   System.out.println("" + 3/4);
(or the moral equivalent) prints 0 in all the above languages. Every developer gets bit by integers performing integer division.


Uhhh, the mod operator is not the solution to that problem though.

In reality you'd only use a mod operator if you were doing things like sorting into 4 columns or doing an operation on every 3rd thing. And it's not a concept that is introduced at school. Now that I try to think of it, I think I only picked up mod when I was learning rounding in my first language and the man page happened to mention modulus at the same time.


The mod operator isn't the solution to that problem, but that problem shoves integer division in your face. If a dev sees that and isn't curious enough to understand there is a division operator and a remainder operator, just like when you studied fractions in 3rd grade, I don't know that I want to work with that person.

And every dev should have hit the remainder operator at bare minimum when they had a long running loop and wanted to print a status every kth operation, eg

   for(int i=0; i < 100000000; i++){
     // some operation
     if(i % 100000 == 0)
       printf("** operating on count %d\n", i);
   }
or when processing a big file, printing every k lines; or when running a slow operation, printing every k seconds; or ...


No, because for most of those problems there's an easy alternative: declare another counter variable and you can just go:

    z++;
    if (z >= 100000) {   /* every 100,000th iteration */
        printf("%d\n", x);
        z = 0;
    }
When I was taught maths there was no emphasis on remainders and it certainly wasn't denoted with a % sign. I vaguely remember writing something like 12r3 in primary school. The r meaning remainder.

A brief search on SO and lo and behold:

http://stackoverflow.com/questions/1504420/c-what-does-the-p...

Viewed 38,000 times. Every language probably has a similar question.


While I learned about the modulus operator at a young age, I grew up programming in environments where "nobody" used floats because the CPU's in question didn't have FPUs, and floating point operations resulted in costly library calls.

While only "older" (I'm 39) developers will be likely to have been in that situation, it took another decade after I moved onto hardware with FPU's before I worked on anything where we actually used floating point math.

Instead we'd be working with fixed point stored in integer. E.g. for financial systems, floating point is a nightmare. Working with fixed point to whatever number of decimal points our accounting department wanted (5-6 typically) for tax calculations and the like was preferred.

So there are large number of areas where people can have worked successfully for many years without ever using floating point.


I mean, I know about the mod operator because I did Project Euler problems in middle school, but if I hadn't, nothing I've done since would have made me learn it. The extent of the math I've had to do was tracking send buffers in C, which was just addition and less than/equal to.


A naive implementation of % is pretty trivial, so even if they weren't aware of the operator, I would at least expect them to write an equivalent function.
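
For instance, something along these lines (a naive sketch in C, ignoring negative operands):

    /* Naive remainder by repeated subtraction; assumes a >= 0 and b > 0.
       naive_mod(17, 5) == 2, just like 17 % 5. */
    int naive_mod(int a, int b) {
        while (a >= b)
            a -= b;
        return a;
    }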


> But they're --- for lack of a better term --- useless when it comes to actually getting real-world systems built.

Part of the issue here is that each of us has a different understanding of what it takes to get a "real-world system" built.

That's somewhat obvious because each of us has different "real worlds", so even within the same language/ecosystem there is a huge disparity in what skills are required to be genuinely successful in different teams/organisations/applications.

To put it in concrete terms, when I worked in banking, the main impediment to building successful "real-world" systems was getting clear, unambiguous requirements that could be translated into something implementable. Some of the best developers in that organisation were excellent at their job because they could take the incomplete and inconsistent desires of a banker, and turn that into a coherent and complete "world view" about the system they were required to build. We were primarily a Java shop, and they could produce an adequate application within the frameworks available to them, but I shudder to think what they would have said if you asked about "coupling and cohesion" or the java memory model, or how dynamic proxies work, or even how they should choose between making a field a short, int or long.

So, for my current (startup) organisation, I wouldn't hire those people - despite them being instrumental in producing some of the most satisfying and successful (ROI) systems I've been involved in. This team needs people who can do hard engineering, and if my former colleagues applied for a role here they'd end up looking like the proverbial "inept programmer" that we're debating.

This is a reflection on their skills, but it's also a sign that similarly named roles ("software developer") in different teams require and cultivate different skill sets.


Although I read the answer from the OP, reading the original message made me think of something slightly different. The debate is almost impossible to characterize because nobody is arguing apples to apples. When we talk about incompetence, almost nobody will agree to its meaning.

Thanks to the Dunning Kruger effect, people who are incompetent do not realize they are incompetent. I remember reading a follow-up paper to this study (I wish I could find it again!) which concluded that when people have their mistaken beliefs pointed out to them, it actually reinforces that mistaken belief! This means that when people actually engage in debate, the two sides entrench and become more confident in their positions.

I have actually argued that this may be what leads large corporations to have a kind of "talent inversion". The people with the least talent have the most confidence and volunteer for the craziest projects. The projects fail, but due to politics the failure is hidden and spun into a success. The original developers begin to believe their failure was a real success and any debate to the contrary only strengthens their belief -- which fuels their confidence. This confidence allows them to be promoted where the cycle continues and multiplies.

This leaves you with a strange situation. Everyone will agree that incompetence is a big problem in the organization. Unfortunately, everyone will point to completely different people when asked who is incompetent.

Let's say that you want to improve the level of talent on the team. In other words, you want to hire only people who are more talented than the average person already on your team. I think this is not an unusual situation to be in. If talent is normally distributed on the team then half of the team is below average (I hope I got that right because I suck at statistics ;-) ). But thanks to "talent inversion", you may have a higher proportion of people in senior ranks that are in the "below average" section.

Now you can have a debate about the what kind of people to hire: If person A thinks that they are at the top of the talent scale, they can't expect candidates to be as talented as they are. Probably they think it is completely reasonable to hire people who are considerably less talented than they are. Person B may actually be at the top of the talent scale and may be arguing strenuously that candidates must be considerably more talented than Person A.

Clearly such a debate will have a lot of heat without a lot of light.


Raytheon must have been doing something right, and I would be curious to learn what that is.


I think the difference is that at medium-large companies there are plenty of technical career paths that don't require a person to actually code - business analyst, QA, documentation, compliance, project manager, etc.

So people at a place like Raytheon who simply can't code (or don't like to code) end up doing something else that they are better suited for.

Early stage tech startups just don't have a bunch of ancillary positions like that. Founders and early hires had either better be great at code or great at biz dev.

Also it's worth mentioning that it's not just a tech/engineering phenomenon. It's very common for people in other professional fields (law, medicine, etc) to get a little ways into their career and realize that the job is not at all what they expected or wanted or are suited for, and there are alternate career paths there as well.


This is a great point. Again, we're getting a weird look at things because we're posting on Hacker News, not Defense Contractor News or Accountant News. Software companies are definitely a significant group, but there are lots of companies whose programming division is just a small part of what they do. They're big enough that they need their own programmers and can't outsource it to someone, but they have all sorts of other stuff they're working on.

If you can't code, you're probably still useful for other stuff. Businesses love to take people they already know and find better roles for them. It's way better than firing badly utilized talent and having to interview for some other candidate with that skillset.

I wouldn't go so far as to say, "There are no incompetent employees," as there are definitely worthless people in the workforce. But it's not as clear as a lot of people might think.


> Not at Raytheon, or Northrop, or Lockheed, or Boeing, or any of the other companies I worked with during that period.

All these companies require degrees, check GPA, and typically have managers that came from technical backgrounds. While not perfect, this combination reduces the chances that a total dunce slips through. Also, all of the above companies tend to be looking at engineers/physics folk over cs/IT, although in the past 15-20 years a lot more CS.

I work at one of the above and for a long time only EEs were hired for any software positions. In regards to math ability that's a fairly high bar. And math competency is pretty well correlated with the ability to program at all.

Not saying that someone couldn't slip through. But all of the mentioned factors tend to reduce the pool of potential flops vs some of the now-standard hiring practices (from the article).


EE/CE here. I took the intro to data structures class and stopped at that. Been kicking myself for years because that's seemingly the only thing people care about; not actual competency (at least in the context of interviews).

Thing is, I'm a decent programmer, maybe even above average. My career so far has consisted largely of being the only technical person, or one of a few, and doing whatever needs to be done to make X happen. I built a poor-man's version of the mail sorting machines the USPS (and other postal organizations) use to photograph, OCR and sort mail. I remember everyone's minds being blown when Outbox (R.I.P.) showed people pictures of their letters, and I built a system like that by myself for a previous employer in about two years while also completely rewriting the website. 65kLOC of VBScript/ASP to 20kLOC of python/django plus a whole bunch of new features.

If you believe sloccount (I don't, really), in two years I did about 4 person-years of work that it says is worth a substantial fraction of a million dollars and a substantial multiple of what I got paid.

But I don't remember what a red/black tree is.

Knowing data structures and algorithms isn't the only way to be a competent programmer. To get my EE degree I had to take calc 1,2,3, diffeq, discrete math, linear algebra, senior level statistics, and these were all just pre-reqs to my engineering classes.

I think a lot of people get hung up on computer science == programming. There's a lot of overlap, sure, but huge amounts of programming doesn't have anything to do with computer science.


Yea. I and many of my coworkers are similar to you. BS&MS in applied math. Did some of the little programming courses but not the full CS path (though I did last summer work through all of this just to catch up some background http://ocw.mit.edu/courses/electrical-engineering-and-comput...).

The reality is different industries have different standards. I'd probably never pass a SV interview because I'm not ever going to memorize many of the CS fundamentals. I work with EEs, Aeros, MEs that code (robotics types), math, physics, and chem PhDs, etc. Sure, there are a few CS guys here and there, usually ones that double majored in phys or math.

On a given day, what's more important to our engineering problems is my understanding of advanced linear algebra topics, not a sorting alg (a lot of work in CV recently, for example). We really don't bring anyone into an interview that hasn't studied the full gamut of math required in a math/phys/engineering degree. In such an environment, the interview questions are more about how you solved problems and never the memorization of some sorting alg. Oddly enough, when you weed out everyone that doesn't have a full math curriculum first, you don't seem to find many people that 'can't code'. Go figure.


> the interview questions are more about how you solved problems and never the memorization of some sorting alg

How you solved a problem is probably the only meaningful way to screen candidates. Sadly there are a lot of people who think that a whiteboard coding session is "how you solved a problem" when few -- if any -- meaningful problems get solved in an hour or two.

When I get really stumped on something the first thing I do is get my head out of the problem. Go for a walk, work with my hands in the shop, get lunch, anything but try and solve the problem directly anymore. This strategy used to make me really nervous in college with short deadlines but now that's less of a concern.

I have yet to step back from a problem and not figure out the solution (at least where one exists).


I believe part of this is your own frustration, and I do not blame you for venting.

Another part is the inherent error in using education as a proxy for ability to contribute.

My biggest concern, however, is that we assume there is a hierarchy of programmers. That the person who surpasses John Carmack is automatically more valuable than someone else, because it's a vertical pecking order.

In reality, it's likely that given the constraints of a position, Carmack++ is not a good fit.

We see someone build a faster sqrt algorithm as ability. What about the programmer who does his best to navigate internal obstacles adroitly -- why is that not praised just as much? It's certainly a difficult accomplishment -- why reserve praise for raw ability alone?


That tends to bug me, as someone with no formal education beyond high school who has worked a lot to gain knowledge related to software development, both conceptual and practical. There's no "in" at such companies when you don't have the minimum requirements...

I have a friend who is a mid-level manager at one of the listed companies, who came in without a degree and was promoted because of proven competence to where he is now. He is spending several hours a week working on his degree because he's been promoted to the point that none of his superiors will let him be promoted further without a degree, despite handling the job better than his peers.

On the flip side, I've worked with people with a degree in CS (or similar) who couldn't develop a good software system if their life depended on it. The other side of this is that many people don't continue to learn and simply stagnate once they become familiar with one system. Software development tools don't stop coming out... there will always be new languages and platforms, but I have always seen a lot of resistance... I'm one of the few 40+ y.o. developers I know who keeps an eye on what's coming up in the community.


Yea, hopefully my post didn't indicate that I agreed with the process. You certainly listed plenty of counter examples why it can miss the mark.

But my point was about statistics. The HR warlords at all those defense/aero companies that have set these limits have done so because it segments the pool of candidates into one with a lower chance of a bust (or so they've decided).

I'd like to see data; I'm sure some HR or consulting group somewhere has it. Or maybe it doesn't exist. A simple logistic regression problem with a high number of variables (degree or not, age, major, self-study, continuing ed, etc.).

edit: also - all my comments are speaking from the engineering side of these companies, not IT. So the actual work content is a bit different from what the original article may be concerned with.


I didn't mean to imply that you agree with the process... I think my own comment is more along the lines of: this is one of those rules people should be able to break now and then... I absolutely agree that the risk of hiring someone without the necessary skill/ability to do a given job is far lower among those with a degree than among those without. The challenge is then not to lock out those who don't fit that mold, whether by experience and/or referral.


The OP's point is, unless I misread it, not that there are no incompetent engineers who can't code, but that it is ridiculous to ask "fancy" algorithm questions, brain teasers, etc. that are mostly not related to the candidate's working experience at all.

For candidates who are fresh graduates, asking about algorithms and data structures looks like the most effective approach. But for a senior engineer with at least 10 years of experience, can't the interviewers find better questions to ask to verify whether the candidate is a good coder?

I was interviewed once and asked a question I knew could be solved with a "fancy" algorithm, since I had read the original paper in graduate school. It was elegant and beautiful, very creative. After so many years, I couldn't remember any of the details, and I had had no opportunity to use it in my work. I am not a genius, so I couldn't figure it out in 20 minutes during the interview. Does that mean I am a fake?


> In all my years immersed in the tech industry, I have never once heard a firm talk about the idiots lurking in their own offices. They always seem to be elsewhere. For everyone.

It certainly sounds like the author doesn't think there are any actual incompetent engineers out there.

Personally, I know that to be false. I have worked with incompetent engineers, who literally couldn't code FizzBuzz.


I think the author is saying we should do interviews like nearly every other industry.

Look at past experience, and ask the candidate to talk about that experience to verify that they aren't lying.

If someone has 5 years experience working at Google, and they can talk coherently about technical aspects of the projects they worked on, there is no reason to ask them to code on a whiteboard.

The way it currently works is this: I see you have a CS degree and 10 years of experience at a well known company. Excellent. Now you have 2 eggs and a 100 story skyscraper. Tell me the minimum number of egg drops you'll have to perform to discover the lowest floor that will break the egg.

Every engineer, no matter how experienced, has to answer questions as if they just left school yesterday; no other industry does this.

>Personally, I know that to be false. I have worked with incompetent engineers, who literally couldn't code FizzBuzz.

99 times out of 100 you could catch that engineer by talking to him about technical aspects of past projects he worked on.


"Look at past experience, and ask the candidate to talk about that experience to verify that they aren't lying. If someone has 5 years experience working at Google, and they can talk coherently about technical aspects of the projects they worked on, there is no reason to ask them to code on a whiteboard."

I always argue for this, and about 98% of the time people look at me like I'm some sort of crazy man and our company is going to go down in flames. Everyone has an anecdote, "well, there was this one time ....", yet I'm convinced that there's roughly zero real value added by all of the hoops and hurdles most interviews provide in terms of accurately identifying strong candidates.

I always try to make the argument that if someone can talk intelligently about a wide range of programming knowledge in an actual conversation - for instance, knowing that data type X would be a logical segue from the current point - and do that throughout the interview, then personally I think the likelihood of them being an idiot is slim. However, most people I argue this to are always afraid, afraid that some interloper will sneak through unless we put them through a gauntlet of questions regarding things they'll never do in their day-to-day life.


This is why I like to give laptops to engineers and ask to make something simple, that might require logic, but doesn't require any sort of specialized knowledge. They can even look up documentation.

Like make a table view that has word counts for a string. Or make this simple puzzle game. Or make a simple address book app.


Just an FYI -- working on someone else's laptop can often be an unpleasant experience (weird keyboard configuration, different software installed, etc.)

My current opinion on this would be to offer a laptop so you aren't disadvantaging candidates who don't have one too badly, but let people use their own if they have one (and let them know ahead of time that it will be an option and what they should be set up for.)


We tell candidates up front that they'll be doing some live coding and welcome them to bring their own laptop if they would like. We have a backup around with just about every reasonable IDE and editor installed for our languages but even then I agree, it's definitely not the same using someone else's computer.


I program on my desktop with two monitors, I wouldn't be comfortable on a laptop without at least two extra large monitors hooked up to it already. And a normal keyboard, don't give me that laptop keyboard crap.


Wow... I understand that you'd be much more productive with a large monitor and good keyboard, but to the extent that you couldn't perform in a coding interview? Give me a break...

In the past, I've used CollabEdit for "phone screens" and the whiteboard for in-person interviews. I try to ask some "coding" questions (identify a better data structure / algo for a particular scenario and then implement it in code) and some problem-solving conversational questions. Seems to work pretty well, but I'm always on the lookout for better approaches. Part of the problem is that the software field is so broad that it's hard to get a sense of a person's abilities with such a limited amount of time to ask questions.


Sure, we'd all want an ideal setup but ideal isn't possible in an interview situation - both sides need to compromise a bit for practical reasons. And really, while we say "laptop" if someone wanted to lug in their desktop with two monitors and keyboard we wouldn't stop them. In fact, it'd probably earn them points for being extra nerdy ;)

Would you rather a laptop w/ standard editors and IDEs or would you prefer a google doc, collabedit or a whiteboard like many places do?


That's a bit like saying, "My daily driver is a Mercedes Benz SL55 AMG, so I can't drive a Hyundai Elantra, not even for a block or two." Or like saying, "Oh, I normally wear Air Jordans, so I can't walk with flip-flops."

Moreover, if you're that useless without your dual monitors and a full-size keyboard, would a whiteboard-based interview be any better?


One company I know of says bring your own laptop as a result, and another says they do a day of pair programming as the interview.


Hmm... This just popped into my mind as I read your comment.

Could you use this specific scenario to see how the candidate would react? If he starts to complain that the right tools are not on the laptop, or that he is unfamiliar with the environment, does that signify that he is focusing too much on the tools rather than on solving the problem?

Maybe his time management skills need some work, i.e. limited time to solve the problem and too much time worrying about the environment.


Well, we don't do the puzzle questions, and I agree that those are not useful to determine the level of software engineering experience a candidate has.

However...

> 99 times out of 100 you could catch that engineer by talking to him about technical aspects of past projects he worked on.

That does not work for us.

We must put each and every candidate in front of a keyboard and have them bang out at least a little code.

We had one candidate, recently graduated with an MS in CS. Nice personality, good talker in general. The candidate seemed to understand CS and could explain how to solve problems.

However, even with repeated prompting, I could not get this person to even type in a loop to start to solve the problem presented.

We've interviewed multiple candidates with a BS in CS or similar who also spectacularly failed at doing even basic coding tasks.

Some people are really great talkers. They really do sound like they know what they are doing. But that does not mean they can actually do anything.


As someone else said, I'm not talking about someone fresh out of school with a few months internship experience.

>That does not work for us.

How do you know, did you try it? I said 99/100, sure there are some rare people who may slip through, but there are people who slip through a coding challenge as well.

How do you know that your coding test is better at detecting secretly terrible engineers than talking through past projects (for experienced engineers)?

>Some people are really great talkers. They really do sound like they know what they are doing. But that does not mean they can actually do anything.

That sounds like a problem with the questions you're asking. Some people can throw out buzzwords and talk at a high level, but if you really dig down into the implementation details of a project you can weed those people out.


How is this relevant to the parent? You're talking about someone with a CS degree, which means they have no experience. How can you ask them about the technical aspects of what they did on a project? "What parts of the system did you write? What problems did you hit? What solution or bug were you proud of solving?"

A degree is not experience, CS != programming.


Nearly all the the candidates that we've interviewed have at least a little job experience, such as an internship. And even for the ones that don't, they all have worked on team-based projects as part of their coursework.

So we end up asking them all those sorts of questions.


Working on team-based projects as part of coursework does not mean they've ever written a line of code in their life.

Going through a CS degree without learning to code is easy. Depending on your CS course you may hardly have touched any code beyond extremely entry level courses where you're being hand-held the entire way, if you pick the "right" subject.

CS is not software engineering.

Similar with internships.

For both those situations, I can understand that you want to verify ability to actually write code.


You're just agreeing with ansible - no one here is arguing that a CS degree means you can code. Rather ansible is relating his experiences with people graduating from those courses and not being able to code.

What you seem to be implying, is that this is obviously true for graduates, but couldn't possibly be true for people with real experience. (But your implication has no evidence to support it)

Exactly why do you believe that it is completely plausible for a recent CS graduate to be able to explain the technical details of their final year project when in reality they contributed zero code to that project, yet it is entirely implausible for someone with 3 years of experience to be able to explain the technical details of their most recent development projects when in reality they contributed zero code to those projects?


> yet it is entirely implausible for someone with 3 years of experience to be able to explain the technical details of their most recent development projects when in reality they contributed zero code to those projects?

Here is the problem. It's not entirely implausible, just very unlikely.

The type of interview I'm talking about isn't going to catch every single bad candidate, just the vast majority. But whiteboard interviews don't catch every bad candidate either, and they have an extreme number of false negatives.

To your point about students, they have much less work experience to talk about. The point of the interview is to look at their experience and talk about it to verify they aren't lying. In most cases the student doesn't have enough experience to tell me anything.

That being said, if a student can talk me through technical implementation details of a final project, then I think that's a pretty good indication that the student can code.

I would say the false positive rate for that test would be very close to the false positive rate for a white board interview with drastically lower false negatives.

If you're Google and you have a limitless number of people who want to work for you, you can get away with taking a vastly higher false negative rate for a slightly lower false positive rate. However, if you're almost every other company out there, you might want to reconsider.


> Going through a CS degree without learning to code is easy.

Let's name some schools here. In no school I am reasonably familiar with (either first hand or through friends who attended) would this be possible.


I am not sure if your comment is supposed to reply to mine.

I do think there are incompetent engineers, and I have interviewed about a dozen of them. It is very easy to filter them out.

The challenging part is to evaluate how good the candidates are, and if they would be good for the team and for the applied positions.

In start-ups with <20 engineers, you would like the candidates to be able to wear many hats. In a big engineering team, the requirements could be different. For example, besides challenging and creative work, there could be some tedious, even boring, work. The latter is also important, both for the whole system running smoothly and for the long-term health of the system. So if the candidate is sufficiently competent, very detail-oriented, disciplined, and team-oriented, she/he would be an excellent fit for the job, even if she/he is not technically very strong and couldn't figure out challenging technical issues. An excellent engineer, on the other hand, might get bored, grumpy, and leave in 6 months. It is expensive to restart the hiring process.


I worked with a guy who, after having been at the company for 6 months, called me over to his desk to help with a "problem." He wondered why he was getting a syntax error. Okay, fair enough. I looked at his screen, and he was putting the arguments to the function OUTSIDE the parentheses. The guy had clearly just been copying and pasting code until "it worked" the entire time he had been there. I almost palmed my face right off of my head.


To be fair, I have done that before when just messing around on my own late at night. After working as long as I had been, reading my own code felt like wading through sludge, and my eyes just didn't catch the error. I was still, however, well aware that putting the arguments outside the parentheses is not how this works, not how any of this works.


That reminds me of how I deal with C's pointer syntax. "Well, it's either * or & or a combination of a low number of the two, let's try a couple of combinations that seem vaguely reasonable... and some that don't, just in case." I don't know who's more at fault: me or C.


In case it helps, I've noticed what I'd consider a wart in C that seems to be a common stumbling block with this - including when I first learned C a couple decades ago: a star, when used on a type, has the "opposite" effect from when it's used on a value/expression.

  int* foo; // The addition of a star here, on a type, means we're converting the type /into/ a pointer-to-int /from/ a plain int.

  int i = *foo; // The addition of a star here, on a value, means we're converting the type /from/ a pointer-to-int /into/ a plain int (a dereference).
I found it much easier to keep the two straight once I segregated their use cases like this. There's a similar duality with & in C++ using references, although not quite as cleanly:

  int& foo = ...; // The addition of & here, on a type, means we're converting the type /into/ a reference /from/ a plain int.

  p = &foo; // The addition of & here, on a value, means we're converting the type /from/ a reference /into/ a pointer-to-int.
Regardless of "fault", I hope this helps :)

(edit: C++ style comments used even in 'C' code to try and prevent HN from treating stars as formatting... replaced a few instances with 'a star'... changed emphasis to use slashes because it's still eating asterisk characters...)


rule of thumb: "*" means "look up the area of memory this value points to". "&" means the reverse - "give me a reference to this area of memory".
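
A minimal, compilable illustration of that rule of thumb (the variable names are just made up for the example):

  #include <stdio.h>

  int main(void) {
      int x = 42;
      int *p = &x;  // & on a value: "give me the address of x"
      int y = *p;   // * on a value: "look up what lives at that address"
      printf("%d %d\n", x, y);  // prints "42 42"
      return 0;
  }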

The problem with randomly trying things with pointers is that sometimes you can accidentally fix the issue but introduce a subtle bug that's dependent on the data being stored.


I'm with you.. I'd have to read a refresher on getting started with C if I were to even do something trivial with it... I've been in higher level languages so long (C#, JavaScript, etc)... I find it helpful to understand the concepts sometimes, but would be unlikely to reach for something more low-level than go these days.


I'm sure there are several variations, but I have a sort of mnemonic to remember this - that * looks like an arrowhead seen head-on and so it "points" to a value. & is, then, a reference.


I've been fortunate enough to work on small teams my entire career where someone like you describe simply would not last. I don't think I've come across a person this bad, though. Even the 'bad' programmers I've worked with were able to make working code, but it was not designed well.


>I have worked with incompetent engineers, who literally couldn't code FizzBuzz.

The operative word is "worked". This is exactly what the article says: "They always seem to be elsewhere. For everyone."


Uh, no? For the time when I was employed there, they weren't "elsewhere." They were sitting right next to me.

(The presence of incompetent engineers was certainly a factor in why I left that job.)


You should work on your reading comprehension. What you're saying is precisely what the author is talking about: everyone has a story about terrible engineers, but they're almost always "elsewhere". Co-workers tend to become "stupid" and "incompetent" after you leave. And yet they maintain employment.

It's an idea, not an iron-clad rule, that perhaps "incompetence" isn't as rampant as so many people suggest.


I read a really good comment by, I think, Nolan Bushnell about engineers. He said he got asked all the time about how to avoid hiring unproductive engineers. And he said he found that baffling. Because an unproductive engineer just costs the company their salary. More important is ferreting out and eliminating negatively productive managers, engineers, and technicians, in that order.

Algorithms? Why would you care if a programmer knows the details unless that's the job. More important is how they approach and attack a problem. Sometimes a half-broken shell script is fine. Or leveraging a database, because even though it's slow and inefficient it will do it right. Half the secret to success is... Um... seriously, just watch this video and you'll know it's true.

https://www.youtube.com/watch?v=q5LiQcY0bqk


An anecdotal case of the worst candidate I interviewed. He was a friend of our VP. He had an engineering degree from UC Berkeley. Over 20+ years, he had been a consultant for firmware development at 28 positions. The non-technical portion of the interview went great, and I said I wanted to ask a few coding questions. He was really resistant and used all kinds of excuses. I told him to humor me and code something simple, like drawing how a linked list works and writing the insert function in C. At that point I was totally caught off guard. He knew not a single line of C syntax and just made things up. He didn't understand the concept of pointers at all. Later when I talked with my co-workers they had the same feedback. He didn't know how to program at all. We wondered how he had consulted at 28 companies over the last two decades!


Probably explains why he went through 28 of them!


I think his average stay was maybe 9 months which is not unreasonable for consultants. My guess is they were all charmed by the CV and didn't interview him thoroughly enough.


Or your analysis is wrong and he didn't need to know a single keyword in C to be fantastic at his job. I.e., if that's the truth, then if we re-ran history but you spontaneously accepted an excuse and let him skip your technical interview, you might have been extremely satisfied with his work and input over the next 9 months.

We have a couple of things to analyze here: 1) could this have happened? 2) in this case would your satisfaction be false, i.e. he wouldn't have been doing good work for you over the next few months?

Or is there a chance that he had some skills that didn't require knowing any coding, while being able to contribute valuably to you...


Or maybe he's old enough that he's used to thinking in assembly. I learned firmware on 8051 assembly; I've used C mostly since then, but always with a reference, and only in the last few months am I finally getting into C++.

That doesn't mean I don't understand how firmware works. I know what's going on deeply under the hood, and I understand systems architecture. I know how to debug the stupid SPI interface when you're talking to an Analog Devices ADC that just won't respond for no clear reason. I know how to read the datasheets for parts and pick reliable ones that can be sourced readily and have sane interfaces.

I have no idea what a linked list is. I do know pointers, but not super well, I just know that they're a spot in memory that gives you the address of another spot in memory, like the program counter. I know my limitations, I'd never try to interject in dealing with a system that has a full OS on-board, and I'd refer questions about encrypted bootloader based firmware upgrades to someone who knows what they're doing. But I still know what Mode 2 SPI is, and the typical noise levels on different kinds of shielded cable. That's not nothing.

shrug

I think the best way to determine if someone is a decent engineer is to get them talking about previous projects. If all they can do is talk about it at a high level and they don't seem to have any real knowledge of the implementation details, they didn't do the work themselves. I once had a guy who claimed to be an instrumentation engineer, but didn't know (even vaguely) how you would go about measuring temperatures in a process you're dealing with. I could be wrong (often am), but anyone who has done instrumentation but has never encountered a thermocouple... well, that would be a very unusual history.


> Or maybe he's old enough that he's used to thinking in assembly

"I'm sorry, I don't actually know C, can I show you this in assembly instead"?

> I have no idea what a linked list is.

A cool thing about the linked list is that it's probably the simplest data structure in all of computer science. I can explain to you what they are in less than 30 seconds, provided you have even the faintest grasp of how computers actually work.
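
Here's the whole idea as a rough C sketch (the names are just for illustration):

  #include <stdlib.h>

  // A node stores one value plus the address of the next node;
  // NULL marks the end of the list.
  struct node {
      int value;
      struct node *next;
  };

  // Insert a value at the front of the list and return the new head.
  struct node *push(struct node *head, int value) {
      struct node *n = malloc(sizeof *n);
      if (n == NULL) return head;  // allocation failed; leave the list unchanged
      n->value = value;
      n->next = head;
      return n;
  }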

> I could be wrong (often am), but anyone who has done instrumentation but has never encountered a thermocouple... well, that would be a very unusual history.

This is exactly what I'm talking about: A good programmer who can't understand and reason about a linked list, that's very unusual.


Haha, thanks for the schooling. I looked it up and it turns out I studied them in scheme back in the day. I just didn't know the term, so I'd (probably) come off as incompetent in an interview where asked.

> This is exactly what I'm talking about: A good programmer who can't understand and reason about a linked list, that's very unusual.

Well put then, I say rather chagrined ;-)

My intended point though was that sometimes you can't tell how good of an engineer someone is by asking them about things you know. Better to ask them to talk about what they know. But that's not very controversial I'd hope!


  I have no idea what a linked list is.
Really? I knew what they were the moment I read the term, from the name alone, as a teenager.


Your entire comment seems to hinge on the assumption that the GP wasn't hiring for a C programming job. That's pretty disingenuous.

Or do you really find it plausible that people who don't know a single keyword in C can be "fantastic" C programmers? Or that it's a good idea to hire very, very nice people who can program for programming jobs because they might have other skills that might be valuable?


Sorry, I'm still not in industry, but isn't there a review system for past employees, and especially for past consulting work? This would be extremely valuable for future employers, problems with lies and uncertainty notwithstanding.

It looks to me like a few Amazon-style 10-minute reviews could save many, many hours of HR rumblings and expenses.


No, people are too lawsuit-crazed these days. Many companies will give no facts about a past employee other than the dates they were employed. Even "would you hire this person again?" can be dangerous.

At one of my past employers, my manager (who sat next to me) told me that he wasn't allowed to say anything whatsoever about past employees - when anyone called to confirm past employment he had to give them HR's phone number, where specially trained operatives would presumably answer the one and only question allowed.


We do all development in 95% C and 5% assembly. While not with him, I have given people chances at other times and they fail miserably if they don't have basic C programming skills.


We've all worked with them. A senior engineer I worked with previously, claiming 25 years' experience, did not understand why I rejected his code, which declared a pointer, set it to null, and then passed it to another function with a memcpy-style interface that was supposed to put data into a supplied buffer. It compiled and was therefore correct, to his mind. After several incidents like this, and lots of proclaiming "it doesn't work" when encountering a bug, which he would then proceed to make someone else's problem, I learned to think of him as a very poor tester.
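
A compressed sketch of that kind of bug, with the function and names invented for illustration:

  #include <string.h>

  // Hypothetical memcpy-style API: fills a caller-supplied buffer.
  static void read_record(char *out, size_t len) {
      memset(out, 0, len);  // stand-in for the real copy
  }

  void broken(void) {
      char *buf = NULL;
      read_record(buf, 64);  // compiles fine, then writes through a null pointer
  }

  void fixed(void) {
      char buf[64];
      read_record(buf, sizeof buf);  // the caller actually supplies the storage
  }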

I didn't perform that interview but I was left wondering what the hell he'd managed to do for so long. A long list of experience on a cv can be meaningless if every post has ended up being a net drain on the team while others work around the problem.


I've interviewed, and been an interviewer, in interviews with the most obnoxious technical questions, interviews where the hardest technical question was equivalent to "can you code?", and interviews that were in between. In no case was there a discernible correlation between the technical difficulty of the interview and the abilities of the eventual hires.

This industry's biggest hiring/talent problem isn't "poseurs." The biggest problem is the persistent and incorrect belief that applied mathematicians (CS majors) can easily learn the skills needed to engineer complex systems as they go (or, worse, that said skills are trivial or even irrelevant compared to the ability to recite textbook DS/algorithms trivia) and that they are also able to adequately determine the capabilities of other similarly trained persons to do the same, mainly by relying on textbook test problems (albeit sometimes with a twist).

In other words the (undisputed, in my opinion) existence of people who can't write a simple for-loop doesn't justify the absurdities seen in modern software interviews.


Strongly agree.

Having been involved in a large number of interviews as an interviewer, my experience tells me the issue is that many highly-competent engineers are incompetent interviewers.

To illustrate my point, here is an imaginary example in the extreme scenario. I once spent a large amount of spare time in graph theory. So I could easily fail at least 95% of candidates by giving algo questions in graph theory. The questions are easy to describe, so they are perfect for interviews in this sense. But I would have done my team a big disservice by turning away potentially very good candidates, and I would have harmed our company's reputation.

Edit: typos and grammar


This has always mystified me. The key to any interview I do is a pair programming session. Before I hire somebody, I'd like to see what they'll be doing after I hire them. Crazy, I guess.

I have never understood what application Mensa-quiz puzzles and algorithms trivia questions have to the day-to-day work of making things go. It seems like a smart-person pissing contest. Maybe that's what the bulk of their work time consists of? I hope not, but I wouldn't be entirely shocked.


Come on -- you're one of the commenters I make a point to read on HN. You can't possibly have missed that a huge chunk of valley interviews are, exactly, a fucking smart-person pissing contest.

A previous employer loved them some graph problems for interviews. I got in an argument with one of the senior engineers, and demanded I be shown where in our codebase we used graph algorithms. The answer: nowhere. But you had to know your way around them to get hired there.

Now when I interview, I refuse to study them. I think this serves me well in helping me avoid jobs I wouldn't enjoy anyway.

An interviewer once essentially demanded I invent Morris traversal on the whiteboard in front of him. I'm not sure how that related to my ability to be an ML engineer.


Heh. Thanks for the compliment. I have actually never sat for a valley interview, so I didn't want to judge based on second-hand data. But I'd believe it entirely.


I think in some (maybe many) cases it is a "pissing contest." In other, and probably the majority, of cases I'd hope it is a case of simple inexperience coupled with arrogance resulting from the relative immaturity of the field. And I mean "immaturity" in more than one sense. The industry itself is young, especially compared to "traditional" engineering and sciences, and it is also filled with a high proportion of relatively immature people, at least in comparison to the slice of academia I was in.


Been through dozens of interviews in the last 5 years and never encountered this. I think this would work well because it draws out a conversation. An interview should be about discovering how a person solves problems, not about either the problem or the solution.

In fact, a long time ago an accountant used this very technique to decide that I was a smart software guy. He then hired me even though he knew nothing about computers and software.

If an engineer can contribute in a pair programming session then you know they have something to contribute in your development team later, even if you don't pair all the time.


i bet there is a little bit of interviewer insecurity going on. "look how hardcore we are, we act like this is normal and we solve problems like this every day"

when i run into an interview like this, it tells me i don't want to work with these people.


"The biggest problem is the persistent and incorrect belief that applied mathematicians (CS majors) can easily learn the skils needed to engineer complex systems as they go (or, worse, that said skills are trivial or even irrelevant compared to the ability to recite textbook DS/Algorithms trivia) and that they are also able to adequately determine the capabilities of other similarly trained persons to do the same mainly by relying on textbook test problems (albeit sometimes with a twist)"

thank you for this golden nugget. it's like thinking you can build the best basketball team by getting the most athletic-looking players, except much worse. i guess the conventional wisdom is that good software engineers are like mathematicians, but in my experience they are more like craftsmen, like a cabinet maker or ship builder. i suppose it's contextual to the software problem, but analytical, deductive aptitude, while helpful, seems to pale in comparison to what i consider the craft (consistent style, naming, testing, pacing, working together and communicating, effective editor and shell use, knowing how the fuck to use git, etc), which has very little to do with analytical, deductive aptitude.


Actually I think engineering requires the same analytical ability as the science that forms the basis for it. It's just applied in a different context and requires a different path to develop into a good talent.


Terrible $whatever make it into every field imaginable. The tech industry is nothing unique at all. Bad doctors exist, for example, despite the professional frameworks that exist there. I bet almost everyone has dealt with bad customer support people or bad secretaries.

It seems to be something that is clearly inevitable. We've been trying for decades to construct the perfect interview process to weed out people, and while we've had some fair success, people always manage to slip their way through.

I wonder if we spend sufficient time on the remedial end of things? Can we improve our practices for handling bad software engineers? Can they be saved? When do we get to the point where it's appropriate to cut and run?

Mostly what I look for in candidates I interview (for sysadmin/sysengineer roles) is signs of aptitude. I have a baseline knowledge I'd expect from them, but after that I'm more interested in gauging how flexible they are, and how they adapt to circumstances and change. e.g. "I see you introduced Chef to your current place to handle configuration management, why did you pick Chef vs other solutions? What did you compare it to? What hurdles did you come across implementing it?" and diving deeper and deeper in to it. The deeper you dive the more apparent it should be what kind of role they played in the whole thing and what they've learnt.

Me? I suck at whiteboard coding exercises. I'm a sysadmin, and one who does a reasonable amount of coding, but put me in a room under interview conditions and ask me to write something to solve a problem and I'll struggle. Worse, it always seems like I get stuck with a developer asking me to solve some ridiculous thought exercise that bears little resemblance to anything I'd do in the real world, one that I'm assuming ties into their favourite algorithm or something.

> It is also, regretfully, not the case that all applications one receives to an advertised position of Senior Ruby on Rails Programmer would be from people who had ever opened a command line.

It always amazes me how many "Geek Squad" candidates you'll see who never even mention Linux on their resume, yet go and apply for Linux sysadmin roles.


This is a good systematic approach.

It strikes me that in the same day (albeit one filled with hiring articles) we get an article about the success of Greyston Bakery, which will literally hire anyone and train and support them as much as needed to be successful—and this, which is along similar lines but is about software development, which, obviously, requires special aptitude that not just anyone can succeed at, and must be protected from those who would trick us into getting hired. /s

Somehow, it feels warm and fuzzy to support the social justice of the little bakery that could, but when it comes to our own companies, our true colors shine.

Continue to take the systematic approach. It is still the best way. Of course, of course you should hire as effectively as possible, and try to have a baseline of aptitude, but there are always going to be outliers and bad fits. It's far more effective to optimize your system for the success of all employees than to focus on weeding out the 'bad ones.' Sure, there will be people that won't fit your system, but they made it through the hiring process for some reason—figure out their true aptitude and transfer them to something they'll be effective at.

The article may be wrong about the existence of fakes and bad programmers—anyone who has sifted through a stack of resumes knows that—but it's right on the unreasonableness of the fear and the weight we give it.

I say it all the time to our CEO: if hiring is really a huge problem we want to focus on, how did we get all these great people? Hm.

The real problem is organizational. Stop worrying so much about hiring, optimize your process and be done with it, and focus on what happens afterward instead.


It seems to be something that is clearly inevitable. We've been trying for decades to construct the perfect interview process to weed out people, and while we've had some fair success, people always manage to slip their way through.

Much as successful security involves "defense in depth," I suspect that good HR involves much the same. It's not reasonable to build a strategy on some sort of ultimate impenetrable barrier.

Most organizations have two actual levels of membership that often exist outside of the formal rules. The reason it works this way is to sequester the "net negative" members and protect the contribution of the "net positive" members. This must be what Valve is aiming for with its hyper "flat" lack of structure.


I asked someone with a PhD in statistics to "Explain regression to me like I know some math but have never used statistics." For those of you who don't know, regression can be taught to people with either a single course in calculus or a single course in linear algebra. It is also a, maybe the, fundamental statistical tool. I'm generally satisfied with people drawing some points on a 2-variable graph, explaining we're trying to calculate the least-error fit, then telling me that the error is specifically squared error. If the candidate can discuss penalties and outliers and GLMs and exponential-family distributions and IRLS and sampling distributions and Gauss-Markov, then great, but I'm satisfied with a really basic understanding. This guy couldn't get any of it. I simply don't understand how you can have a PhD in statistics and not be able to explain in 60 seconds roughly what regression is, how one interprets the betas, and at least one basic inference method.
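
For anyone rusty, the rough 60-second answer being looked for here is something like this (standard textbook notation, not the candidate's words):

  Given pairs (x_i, y_i), simple linear regression fits
      y_i \approx \beta_0 + \beta_1 x_i
  by choosing the betas that minimize the sum of squared errors
      \sum_i (y_i - \beta_0 - \beta_1 x_i)^2 .
  In matrix form the least-squares solution is
      \hat{\beta} = (X^T X)^{-1} X^T y ,
  and each \beta_j is read as the expected change in y per unit change
  in x_j, with the other predictors held fixed.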


FWIW I've come across many PhDs (I've worked in PhD-rich environments for the last 15 years) who are surprisingly poor at fundamental level stuff in their field. They're so far beyond it and so specialized that they just don't really remember how the basic stuff works.


But maybe he wasn't good at explaining things. Did you try asking if he could apply regression to a simple problem?


Part of my standard phone-screen approach is to open up an Etherpad or other collaborative coding tool and ask the candidate to perform a FizzBuzz variant (variant enough that memorizing FizzBuzz wouldn't save you). This is entirely intended to fail people who can't code at all.

Yes, I've had people fail this. Yes, I gave them an extra day post-phone-screen to complete it. No, they did not complete it.

While I loathe the alpha male dance of software engineering interviews, I fully support the Fizzbuzz test, having encountered those that fail it.
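
For context, the classic (non-variant) FizzBuzz is about this much C; the actual variant used in those screens isn't reproduced here:

  #include <stdio.h>

  int main(void) {
      for (int i = 1; i <= 100; i++) {
          if (i % 15 == 0)        // divisible by both 3 and 5
              puts("FizzBuzz");
          else if (i % 3 == 0)
              puts("Fizz");
          else if (i % 5 == 0)
              puts("Buzz");
          else
              printf("%d\n", i);
      }
      return 0;
  }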


That story could have been taken directly from my own brain. People like this get hired not infrequently and it's always frustrating to get one on your team.


I've never seen such an incompetent person hired anywhere I worked for.

I did see similarly qualified people turn up to interviews but they were all invariably rejected.


I want to work where you have worked, then. Rant from the aircraft industry follows, feel free to ignore.

Engineering incompetence is widespread from my experience (flight test engineering). I wish, as a flight test instrumentation engineer, that I had not had engineers from various disciplines ask me how to interpret the contents of their test monitoring screens. I am responsible for providing the accelerometers, strain gages, data bus monitoring, and so forth that the various teams request and making sure that the data from those sensors is decision quality and properly telemetered and recorded. But it is NOT my responsibility to interpret that data. And yet, some of these other engineers, who are responsible for real-time monitoring of safety of test parameters during flights, clearly have no clue what they are looking at. One engineer even confused hydraulic pressures with engine temperatures during one test I was involved with and was about to raise a fault against my system because she thought the data was faulty. That's just one instance -- sadly, I have encountered many more during this career.

Yes, these people are trained in their disciplines and on the content of their monitoring screens, but for some, it just doesn't seem to stick. How can they, in good conscience, respond with a "GO for test" when asked, if they know they do not understand their monitoring screen?


I've never worked with anyone who flat out couldn't code. I've worked with some who needed more help than others, but I'm fine with that since I've been that person myself with some things. But never with someone who just couldn't do it at all.

On the other hand, I've certainly had interviews with people who are just awful at it. So much so that I think the interview process is by far the greater obstacle to finding competent people than some huge mass of incompetent people out there.

Think about it this way: almost no one has ever had explicit training in interviewing. Everyone learns by trial and error and rumor and superstition. It's one of the most cargo-cult aspects of any given business.


I had a "software engineer" work for me that didn't know if from while. Literally. He was copy pasting stuff until it "worked". One day he calls me over because his program is hanging. He had something like " while (x) print something; ". I asked how that would work if x never changes in the loop. After I bit of blankness, I said " isn't this just supposed to print once, like maybe use an if? "

"Oh yeah, if. That's the one I wanted."

...

Worked with another person of similar skill, who went on to become a Microsoft MVP. Do not underestimate how far drag-n-drop and copy-paste can take someone with modern timing. Especially if they're good at/inclined to tout themselves.


Say what you want about Google (it seems to be common here), but you will never work with anybody like that at Google. Really never ever.

You will work with managers, PMs, and lawyers who have much more coding skill than that.

Not to say there isn't bad code. Actually the problem at Google is people write too much code... But there is a base level of competence you can expect.


> you will never work with anybody like that at Google. Really never ever.

Well... at a previous company I worked at, we hired an ex-google software engineer. He had a good resume, and was one of the best interviewees I've ever had. He totally nailed algo questions and even some arcane functional stuff I threw at him he hadn't seen before (yes, those kinds of questions are pretty dumb, but I was a bit more naive back then).

But, this guy ended up being a terrible programmer. I mean, he seemed pretty darned bright from that interview session. Amazing even. But he was unable to ship code, let alone write good, elegant code. He just wasn't wired for it.

Ironically, he ended up getting booted from his team and moving into the biz side where he made a lot more money than the rest of us (trading).


Yeah, I'm not saying there aren't terrible programmers. I'm just saying that everybody can write a 1 to 3 line "for" loop and navigate a directory structure :) That's the level of incompetence the parent was describing.


Definitely not true. I'm a current Google employee, and one of my teammates (a senior staff engineer, no less) does NOT know what a unit test is, what an API is, what Borg is, or how MapReduce works. After suspecting this for a while, I asked him to try to explain some of these concepts, and he either had no answer or his answers were hilariously wrong.

(He did come from an acquisition, though, and doesn't write any code. He's more of a manager, despite not being on the manager track.)


"Borg?" I had to look that up. It's Google's in-house cluster manager.


I thought he meant the 'Borg' design pattern ;-)


I don't know what Borg is either, but then I realized it is likely an internal Google thing; unit test, API, MapReduce -- these are all ubiquitous concepts to programmers outside of Mountain View, so the inclusion of Borg seemed a little out of place.


I am currently working with Google. We had people who couldn't code on our team. I don't know if they were officially contractors like me or Google proper.


I knew a guy who could go to the board and draw a FSM, construct a context-free grammar, or construct a proof for anything the professor asked for.

And in Algorithms, he could go to the board and walk through any algorithm we happened to be talking about.

In essence he was perfectly suited to white board interviews.

But when I worked with him in Software Engineering his code was unmaintainable, poorly commented, and he was very abrasive and unpleasant to work with.


Yup. The difference between a computer scientist and a software engineer.

It's a sad state of affairs that our industry is currently conflating the two. Or maybe the industry has only now reached a stage where the two need to become separate?

I mean, you wouldn't expect a theoretical physicist to design an experiment or, worse, a bridge.


I absolutely would expect a theoretical physicist to at least be able to competently contribute to the design of an experiment, especially as it pertains to his specialty.

The bridge analogy is spot on, in my opinion. It mirrors my above comment, and I couldn't agree more.


The theoreticians usually figure out what exactly to look at. The experimentalists figure out how to do it, and then carry it out.

A common joke is that you don't let the theorists anywhere near the apparatus, because they'll break things.


That person knows how to code, he just writes bad code. As most people do at the start of their career. It is learnable though.

The bigger problem apparently is that he's an asshole. Interviews can screen for assholes, but not with programming exercises.


This perfectly puts a handle on what is happening: http://sijinjoseph.com/programmer-competency-matrix/


That may have been the case for 2006 Google, not 2015 Google; there's ~18000 engineers now.

I helped a classmate who didn't understand basic programming concepts; he was batting 1/20 in full-time interviews before getting and accepting a Google offer.

That was when I realized how far Google's hiring standards have fallen.


Do you work at Google? I agree with the grandparent - I haven't met a single person like that here. I'm sure false positives exist but they are rare - you need to do well on a bunch of interviews that involve both coding and algo questions. Either your friend learned how to code or he won't last long.


I knew somebody like that who got hired at Google. Bad coder, got fired from two companies in a row, and would have gotten fired from where we worked too, except I went out of my way to stop it from happening.

That turned out to be a mistake, but that's another story. Point is, he went on to work at Google.


Is this always nepotism, or does it have something to do with the reluctance some organizations have in firing someone?


This also happens in places where people are hired 'in bulk', and where technical questions can be answered by rote. Even when your standards are higher, it's still easy to hire people that are just slightly more incompetent, just by not doing a coding assignment.

This is why, at least around here in the Midwest, it's easy to stereotype other programmers by just looking at the last few companies they worked for. Came from XYZ? I won't even suggest an interview, because their architects would barely be junior devs here. Came from ABC? You don't need a screening, because you'd not have lasted two years there if you aren't at least decent.

This is the real reason hiring through networking is so prevalent around here. If someone competent vouches for you enough to bring your resume forward, chances are they'll at least not embarrass themselves in a simple interview that asks for a little bit of coding.

The reluctance of accepting a mistake in hiring is also an issue too, but I see more of that when we add both time and changing standards. For instance, I know of a big company that started with a really low talent level. They somehow managed to hire better people than they had, as time went by, but promotions have a lot to do with seniority. So you have a team of 'architects', that are supposedly in charge of things, but that, really, are worse at their job than the people that they are currently hiring. So what happens there? They have this nice cycle of hiring developers, having the good ones see that they will have to spend their time there arguing with architects that were never that talented, and whose skills are now outdated. But the architects have been there so long, they are part of the scenery. They'll never quit, as they'd never pass another interview for their experience level somewhere else, but they'll never leave their spot open unless they are fired. And management will not fire them, because they are old buddies. That's how organizations decline.


I know in South Africa, where I live, you have a third reason. And fourth.

3. There are regulations in place that disallow you from firing employees easily. Even if they're incompetent, they can only be fired outright if they break their contract in some really big/obvious way. So they're forced to follow the motions, giving warning letters, eventually having a meeting to discuss it, etc, and only after that has all been done, can they dismiss the employee.

4. Affirmative action quotas. Theoretically sound and noble. But in practice it just means that if there is a shortage of said "group", then you are invariably forced to hire sub-standard individuals, and keep them. Sad, but true.

Funny aside from the above. We once had a junior who was hired fresh from undergrad, with the idea that he'd be trained up to speed. He was trained by the company (while getting a salary) for, I think, about a year before being passed off as a real junior developer into the company. Said junior had the nerve to chuckle at/mock me because I had an "IT" degree, instead of his fancy CS degree. In reality, he couldn't code a coherent piece of code to save his life. We spent countless wasted hours (this was after he finished the internal training) trying to at least get him to contribute something, anything, even non-coding related. In the end, once he got his 9-month/1-year experience, he jumped ship. Probably to trick another company into paying him to sit around "learning".


"Never attribute to malice what is sufficiently explained by incompetence". Nepotism is not necessary.

In one company I worked for, the interviews were done by the self-proclaimed CTO (who was one of the founders of the company), who was actually a dentist with no tech knowledge whatsoever. Of course in this case the interviews were just generic chats, with no technical questions.


I agree.

I once was offered a job as director of engineering at a big financial firm in the UK based on a couple of chats over the phone with the CEO and some tech guy (don't remember the title). With the CEO, I just chatted a bit in general about my past and my future; with the tech guy, I mostly explained my past, talked maybe 5 minutes about tech (design patterns or something), and that was it.

No code, no hard technical questions on what to use when, nothing. I know a lot of people who didn't have the skills for that job but whose attitude and bullshitting techniques would probably have gotten them the job. And this was a very successful company handling millions in revenue and working with the major banks.


Agreed totally.

However, the question we should be asking is: do we need a question more complex than FizzBuzz to determine whether or not someone is an STE? If so, why?

And if not, then why do most developer interviews consist of a series of coding questions that are more complex than that? Because as much as we like to say "so we can see how you think", really, virtually all interview coding questions are FizzBuzzes.


I had one job that, days before the interview, sent me the spec for an archival format they used, and asked me to implement an extremely basic archive generator and email them my code. At the interview, they asked me to explain what my code did. I went into detail about specific implementation decisions, things that worked for this restricted domain but would not work for a more general version, and ideas for future development.

Best job I ever had.

I think the answer to your question is that it's impossible to determine whether someone is an STE with 15 minutes in front of a whiteboard, because nobody codes like that. You will get false positives from people who can slap together a simple algorithm but don't understand why slapdash hacks aren't good enough for production, and false negatives from people who are brilliant coders but need a little while to get into the headspace, or have their flow disrupted by having to use a damn marker instead of a keyboard and their favorite text editor. You cannot identify an STE in a one-hour interview. If you want to see if someone can code, have them actually code.

Oh, here's another big problem I don't think anyone has mentioned yet: people who think that the stuff they, personally, have memorized represents the core knowledge of computer science that everyone should know, and everything else isn't really important. This is the guy who is unimpressed by absolute mastery of regular expressions, but sneers in disgust when you don't know all the arguments for pushd off the top of your head.


I was once asked "Can you do a quick fizzbuzz implementation on the white board for me" by an interviewer. I stood up, looked at the whiteboard and said "So it's, what.. multiples of three are fizz, five are buzz and both are fizzbuzz right?" and he said "Yep, nevermind, let's move on."

There was a lot more technical, and a lot more practical stuff to that interview as well, but as someone with both dev and dev management positions on their resume, I appreciated that the interviewer just wanted to know if I'd prepared or at least been paying attention for the last few years with the easy opener. Would you even consider someone in this career field who didn't know what fizzbuzz was?


I'd say after 7 years as a professional software engineer, I had never had to do anything harder in an interview than show up. The companies I applied to had prior coworkers and managers who wanted me on their teams. Then I ended up in a company where the management was so dysfunctional that it had the overtones of an abusive relationship. They then laid me off. At that point, I was a Sr. Software Engineer, during the depression, with a 2-week-old newborn, out of work, stressed, not sleeping, and I had never whiteboarded a class, had no idea what fizzbuzz was, and the last design pattern conversation I'd had was 2 years previous.

I failed the interviews with some pretty decent companies. I got better.

My learning experience with that, is that the majority of interviews I took were not to determine if I was actually good at my job. They were to determine if I was good at interviews.


"Would you even consider someone in this career field who didn't know what fizzbuzz was?"

Yes, of course. Why would not knowing what fizzbuzz is disqualify someone? Simply because they have not heard of it or not read the article on it means absolutely nothing. Not being able to do it, on the other hand, would obviously disqualify someone.


Same here. Maybe when I was reading all the FizzBuzz blog posts, they were busy doing something amazing.


I heard this many times, and yet I've never encountered them in my career. I was even interviewing for a while and none of the candidates were as bad as people say.

Some wrote C or pseudocode on Java interview but pseudocode made sense and they could explain it.

I wonder if people with such experiences interview candidates without a degree in CS (it's free where I live, so it acts as a good filter). Basically, unless you are in the 3rd-5th year of a CS degree, have graduated, or have worked as a programmer, you will have problems even getting to an interview.


I'd say somewhere between 1 in 3 and 1 in 6 people that graduated with me (from a decent university!) could not actually program at the end of their three or four years.

That said, the same goes for at least two of the professors...


Well given how little time is spent on teaching actual programming/coding at a university, this isn't totally surprising.

Writing code is the easy part, after all.


That is pretty scary. But more importantly, it is really sad that the example programmer's manager doesn't know he's incapable of basic functions. I've seen programmers who were in a new place (new language, new environment) struggle, and I have helped them get organized and, if they weren't able to adapt, helped them move on to a company that used a more familiar environment. But that does take paying attention, and sadly some managers don't really pay attention.


I'm a bit of a fan of the mindset a few consulting companies I know have (only in this particular regard):

You are a warm body. We need a programmer. We will teach you to program well enough to make the client happy, regardless of your prior experience. If you don't learn, we'll bench/fire/dismiss you.

It's very cynical, but at least it's better than "oh no they can't program whatever shall we doooo" I see elsewhere.

To your point, a good manager should be helping grow the skills of their team, even if a member who technically shouldn't be there has landed in their midst. The industry is littered with stories of people who, given proper support, turned from a liability into a huge asset.


As long as they're honest about it upfront, and provide honest feedback through the process, I don't really see that as cynical at all.

Having someone take a risk on you and be prepared to invest in teaching you will necessarily come with obligations to deliver, and even if it doesn't work out, chances are you're better off than when you started (hopefully you'll have learned something, even if the worst case is just that you're not cut out for that work).

We could use a lot more of that for people to get past the initial challenge of making the move into industry after graduating. Especially given that so many areas mostly have CS heavy courses rather than software engineering.


Is there any chance the bad programmer was new to whatever language you were working with? Stuff like this:

"The fooBar subroutine was one statement long."

reminds me of when I first started to learn Clojure. I was so used to "=" or ":" for assignment that I had a difficult time seeing where the assignment was happening. Especially the rules about destructuring, which took me at least 6 months before I was comfortable with them. Also, the recursive rebinding in a Clojure "loop".

I don't mean to defend the bad programmer that you are talking about, but I can imagine an experienced programmer having trouble recognizing assignment if they were new to the language and the language had some unusual system of assignment.


My wife ended up with an employee like that one time. He was absolutely incompetent at even basic development tasks.

More troublesome, the place she worked at the time had a very complex technical interview process where candidates have to show at least some level of competency in the core languages and concepts the company was using at the time -- and he had passed it.

But once work started, he was completely incapable of even trivial development tasks...somebody who knew nothing and had access to stackoverflow would have performed better.

Eventually, after about six long months, he just had to be let go. But the mystery of how he made it through the technical gauntlet remains.

I thought I had escaped this for the most part, but recently, while working a contract that had my company in close proximity with another, I uncovered perhaps one of the best bullshit salesmen I've ever encountered. He was a master at moving goalposts, organizing meetings, and producing lots and lots of activity, but no actual technical output of any sort. He knew all the latest technobabble buzzwords and even taught the subject as a professor at a local college. His pedigree seemed great.

Finally, during a forced meeting, he was asked to show something, anything, he had produced, and he pulled up a half-formed, invalid JSON file he was hand-writing for another project.

We've been able to separate our work from his group, but it's unbelievably aggravating that he's still getting paid.


> We've been able to separate our work from his group, but it's unbelievably aggravating that he's still getting paid.

How common is this?

I had a similar situation. I didn't challenge the developer directly, but when asked to justify why our project was falling behind I sent a summary of the VCS log to my manager. The developer hadn't committed anything in five years! Management promised to sort things out, but it's over a year and she's still employed.

Throwaway account, because this is one of the reasons I won't be employed there much longer...


I suspect it's more common than anybody would think. A disturbingly large number of stories I've heard about regarding "consultants" (from numerous points of view, including consultants themselves) seems to indicate there's a huge quality variance in the field.


When I hear a story like this I have to ask - how did no one else notice?

What was this person doing while they were employed? Why did no one else spot that his productivity was effectively zero?


wait, wait, wait. In your first two paragraphs, by "software engineer" do you specifically mean a coder, a programmer, someone who is paid basically to write lines of code? (As opposed to: software tester, architect or manager-type role on technologies they might not know personally, front-end designer who happens to accidentally be able to do some javascript etc.) You mean someone with nobody below them, who is not doing software testing, debugging, front-end design, or the like, but paid to put out code, but cannot write a for loop that does a character count in the specific language they are paid to code in? And who is gainfully employed "coding" in that language, for months, years?


I know, you think I'm insane, right? I doubted my sanity, too. Not a software tester, nor an architect, nor a manager, nor a houseplant, nor a figment of my imagination. A software engineer.

Actually, come to think of it, I think it's almost exactly accurate that "He had the same seniority in the same role that I would have today if I had not quit that company 4 years ago." (This incident would have been from 5 or 6 years ago.)


Okay then. So, you were explicitly told:

>that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."

So what was this work? What did the person saying this have in mind?

(I cannot possibly imagine anyone saying that with a straight face, if they meant that he is totally incompetent but is the boss's son-in-law. If your quote/paraphrase is anything near accurate, the person must have genuinely thought that z was making an excellent, albeit different type of contribution. He took time out of his day to tell you that. What could he have been 'well-suited for'?)


I'm curious if you have any insight into how coworker Z was hired in the first place.


Both of these are very difficult things to accept. They may be even more difficult to accept if one is extraordinarily smart/diligent and one studies/works solely in organizations which apply brutal IQ/diligence filters before one is granted even a scintilla of the admission committee's time.

It's irrational to base a network security plan on the notion that one can concoct an impenetrable barrier. Very few things in the real world are built like this, because it's very hard to get that to work. Most successful plans are built on a notion of "defense in depth." Valve is probably using their structure to create a "free market" within its workforce, allowing natural leaders to create subgroups, elites, and further compartmentalization.

Even if Valve, being comprised of very smart people, has a magic interview plan that can select for real technical talent 100% of the time, all of the other possible workplace pathologies will be impossible to filter out 100%.


The real question is why wasn't this coworker fired on the spot.

Also, how many years of experience with Java he has on his CV...


Could you at least give us some hints about these supposedly utterly incompetent software engineers? Since I've never encountered anything close to that level of incompetence, I'm assuming there is some variable that predicts it and that I have managed to avoid that variable. Is it a certain type of company that employs these engineers? Is it possible that the issue is not even with the employee's competence at performing his or her job, but rather just a bad job title that doesn't reflect their actual role at the company?


I've seen people leave University with all As in their degrees who couldn't parse a string into a number, and were pretty sure that that was in fact impossible.


> A coworker, having overheard the conversation, stopped at my desk later, and explained to me that, if I had pressing engineering issues, coworkers X and Y would be excellent senior systems engineers to address them to, but that Z should be allowed to "continue to devote his full attention to work which he is well-suited for."

What the hell would he be suited for in that case? Sweeping? Mopping?


Reading and commenting on HN? b^)


I used to think that working with harder languages, e.g. C++,Scala,...[0], would avoid such people in the team, but no, some managers always manage to cherry pick them into the projects.

[0] Just one possible example, take your pick.


Now, to place this in context--was this at a company in Japan, with (by you even!) well-documented social norms about how technically incompetent employees are dealt with, or was this with a company with more conventional engineering culture?


This is not a Japanese-only thing. It is a thing so common in America that there is a named test to detect it -- FizzBuzz [+] -- and this test, or one like it, actually catches many people out.

[+] c.f. Any of the following and note that they're from the nation of, ahem, conventional engineering culture:

http://imranontech.com/2007/01/24/using-fizzbuzz-to-find-dev...

http://blog.codinghorror.com/why-cant-programmers-program/

http://www.globalnerdy.com/2012/11/15/fizzbuzz-still-works/

I would note that a common theme in these posts is "I thought it was a myth but then I tried hiring and the boogeyman sprung to horrifying life."


Not that it invalidates your larger point, but I feel I should point out that I invented FizzBuzz while primarily interviewing European candidates in London :)


I've seen it too. I think, if you haven't seen it, it's either due to luck or sample size.



