Hiring Is Broken – My interview experience in the tech industry (medium.com/evnowandforever)
376 points by sahat on April 27, 2016 | 671 comments

Hiring isn't broken, dude. You just don't interview well, and you seem antagonistic even to the idea that your cachet and skills, insofar as they exist, aren't useful to the market. That's a problem -- for you.

Interviews exist because employers need a way to quantify and qualify your ability. Typical computer science problems are a (flawed, but concrete) way of doing that. No reasonable interviewer expects you to recollect breadth-first search flawlessly, on demand, onto a whiteboard. But they do expect you to be able to reason from a problem statement to something approximating a solution. That's fundamentally what programmers do. You should have the chops to think in this way, and you should be eager to try. Emoting "what the fuck?!" and claiming you can't, or won't, because you're not a recent graduate is an excuse and a cop-out.

A few popular open-source projects don't necessarily speak to your talent as a programmer. A 960 day GitHub streak is trivia, not a signal of anything useful. (If anything, it marks you as a target for burnout!) A few Hackathon wins and a couple hundred GH stars are the artifacts of a successful hobbyist, not a proxy for professional ability, or a gateway to employment.

Hiring IS broken; unfortunately, not everyone realizes it at the same time, so half of us end up talking about it to the other half, who are already employed and feel the need to justify their employment. Since these discussions never happen outside CS / IT circles, they very rarely propagate upward into changed company practices unless the CTO/CEO happens to read HN. You're just further contributing to the rigged scenario when you respond like this.

I run a small company and I absolutely refuse to have an HR department. We're still small enough that this is fairly easy to work around.

My recent hires are all just people I met at events. I'm told by people outside the company that they're all very happy workers.

We do not do technical interviews. We talk casually about past work and what people want to achieve. After hiring we usually find out they have a different skillset and then we find the right tasks to match experience vs. career path.

Knowing this works for my team I would never want to go back to full-time employment at a place with HR ever again. HR has become an insular group of "specialists" who do not deliver value to organizations, and often ruin them IMO.

>I run a small company and I absolutely refuse to have an HR department.

You are my hero. If you ever do end up having an HR department, you can staff it with mediocre software developers to do the menial tasks. Most mediocre software developers will outdo your average HR drone. For hiring and interviewing, use the very best people that you have.

I'm confused. I think of HR as human resources, that is, the folks who manage benefits and paychecks and insurance struggles. How is that related to interviewing new engineers?

If you've got a good mix of developers at various levels of skill I'd say you're doing it correctly then. The problem is the corporate mandates that 100% of employees be ninja/guru/cowboys. It's this mindset that keeps everyone else running around interviewing needlessly.

That does sound like a nice place to work. How do I get in touch?

I wish I were hiring! I just doubled the team, though, so I probably won't be bringing anyone new on for the next ~6 months. Sorry!

When the author himself describes both very positive and very negative interviewing experiences, yet had success with neither, I have a hard time blaming the hiring practices.

An interview can be positive and not lead to a job.

I went on a lot of interviews (maybe close to 100) before starting my own business, and I'd say at least 9/10 of them were very positive.

By positive, I mean: I felt we both left the interview a bit happier than we went in. We both had a legitimately good time and enjoyed it. It wasn't a grind; both people got to know each other.

I'd say I had a 10% offer rate. The most common "why didn't you select me?" follow up answer I got was "we need someone who is an expert in XYZ language." Sometimes it was even that they need someone who is an expert in ABC library (when I was already pretty-much an expert in the language.)

The ones that did lead to job offers were positions I didn't want after the interview.

The other 1/10 were negative because they had me take an IQ test, or do some really super long project, or meet with literally all 20 people in the company, or they made me sit in the lobby for 2 hours past the meeting time.

> I'd say I had a 10% offer rate. The most common "why didn't you select me?" follow up answer I got was "we need someone who is an expert in XYZ language."

I would think this is something that should have been made clear from the get-go in the initial technical phone interview.

Anyone who goes to an in-person interview should expect, at minimum, that the interviewers have done their homework.

For example, if they just want Stanford or MIT grads, fine. But to bring someone into an interview - which may involve more than a day of travel, lost vacation time etc - just to tell them they went to the wrong college afterwards, that's plain infuriating and shows complete lack of courtesy.

> The most common "why didn't you select me?" follow up answer I got was "we need someone who is an expert in XYZ language." Sometimes it was even that they need someone who is an expert in ABC library (when I was already pretty-much an expert in the language.)

There's nothing really wrong with that as long as they advertise such and make it clear before even starting the interview process that you won't be hired unless you're a library ABC or language XYZ expert. The problem comes when they don't advertise that, bring you in anyway, and waste everybody's time.

2 hours? wow. and you stayed?

I once showed up to an interview only to discover that the people who were supposed to interview me had traveled to another state. This was only after I had taken time off work and been escorted from the military base security gate to the ramshackle, unmarked building that did not in any way resemble a clean and healthy workplace.

Someone eventually called me to reschedule, and it took every last ounce of my willpower not to perform a verbal auxiliary anus installation over the phone.

To name names and shame the shameful, it was SAIC (before they split into SAIC and Leidos).

I had already taken the day off, and they did update me every 20 minutes or so. They also gave me water and snacks before I even knew I'd be waiting, and I had my laptop.

It wasn't that big a deal, but they didn't offer a good enough reason to keep it from being a deal breaker.

Hiring in CS / IT is massively more objective compared to most other fields.

Not really - these kinds of interviews are about hiring "People Like Us", i.e. recent college CS graduates whom the interviewers like right off the bat (are they really going to strongly recommend someone they don't like?). It discourages diversity, reinforces ageism, and creates stagnation in the company doing the hiring.

I've seen this happen directly - the interviewers were two Asian males and one White male, all of whom had come to this job straight out of college. Looking over the engineering department, I saw, with one exception in forty, young White or Asian males. The one exception was a lone Asian female, who had a doctorate in CS.

Hiring "People Like Us" doesn't help a company.

> Looking over the engineering department, I saw, with one exception in fourty, young White or Asian males.

IME, that's much more a consequence of the limited diversity of candidates than of limited diversity in hiring. At most places I've worked, even if they had hired all the women and blacks (or other non-Asian minorities) who applied, there would still be at most approximately 5% women and 0% blacks (the latter would probably be higher in the US).

How many of those potential candidates were weeded out by phone interviews with the same hiring criteria and scoring mechanism?

I have personally been rejected after a phone interview, not because my technical skills were lacking, but because the interviewer didn't like something he heard in our conversation. I could quite literally hear him checking out of the conversation and going back to surfing the internet.

IT hiring (by IT I mean my industry - I'm an IT administrator, not a programmer) is based heavily on certifications, and those certifications are abundantly easy to scam.

You can find full exams with answers for the CCNA and every Microsoft cert imaginable. I've worked for companies that incentivise certifications, and I watched coworkers use these 'tools' en masse to gain easy pay bumps and pad their resumes.

Only two interviewers have ever asked me technical questions that actually tested my knowledge or capabilities. I look good enough on paper, so usually interviews are with a couple of managers, and if there's a culture fit, I'm in. It doesn't matter if I don't know a damn thing, as long as my resume says MCSE on it.

Microsoft and others further incentivise these certs by giving bonuses and breaks to companies employing certified individuals.

If you think people being hired because they passed a test they cheated on is 'more objective than other fields', then maybe I just have a warped view of how objective other fields are.

None of the employers I would consider working for gives a fk about certifications.

I agree with you that certifications are crap but I haven't encountered a single good company that cares about them.

In my experience I wouldn't say they were complete crap or didn't care at all, but I'd agree with the notion that they're a poor indicator by themselves.

When my team would look at candidates, past accomplishments and projects counted for tons more than a cert. Still, a cert told us what someone was interested in and what they wanted to bone up on. They could expect questions relating to that cert field during the call. If they had trouble talking on the subject, then we knew that the cert was likely there to be a pad or just something to fulfill some billet requirement. In good cases we'd find that the cert was an indicator of something they were passionate about.

So I consider it a pointer of sorts. Never worthy of hiring someone on its own, but I'd recommend certs to people who are /ready/ to demonstrate knowledge and may not necessarily have the project experience otherwise. I'd just recommend that they be ready to speak to it in an interview.

I've let mine lapse completely to weed out companies searching only for major certs; I'm not interested in competing with cheaters.

Vastly fewer responses and recruiters on major sites, but I've got about another 6 months to go on a really interesting couple of projects at my current job, so I'm not too upset about it.

It's not true. Some companies legitimately care about them: if you are an MSP it is very useful to have people with Cisco, MS, and other certs.

Sometimes it is almost required (see MS certs) to be competitive.

So, they only care because it's a business thing (not a qualification thing) but they do care.

I've worked for several MSPs; this is my experience.

They are very keen on hiring employees with existing certs, or on getting current employees to actively pursue more certs.

There are specific incentives from Microsoft based on how many employees have certain levels of Microsoft certification. It comes with discounts and marketing materials/privileges (who doesn't want to say they are a Microsoft Certified Gold support partner?).

As such, it is conveniently ignored that the guy with an MCSE isn't going to get fired, because the gold-level partnership lapses if they don't keep X number of MCSEs on staff.

> *being hired because they passed a test they cheated on*

The certificate shouldn't matter in the interview itself - it is just a way to pare down the choices of who you interview when you have many candidates. If you have knowledgeable people in the interview process (i.e. at least have a dev present for a dev interview, not just managers or HR people), then you should be able to catch those who don't really have the skills.

Heck, even people who genuinely earned a certificate might be rubbish overall but have revised well for the exam days - you'll hopefully weed those out in a good interview process too.

>The certificate shouldn't matter in the interview itself

Have you had many IT interviews? I have, and they are vastly non-technical.

Unless you count "have you ever used #CommonSoftware? how about #SlightlylessCommonSoftware?" as a technical interview.

I'm an IT administrator, not a programmer, so I'm speaking to my experience in that field; I am not trying to make any claims about how programming interviews work.

I've never had a whiteboard in an interview. In only two interviews have I ever been asked to walk through a diagnostic process for problems, or been asked any infrastructure or network design/architecture questions.

It's worth noting I work and live in Detroit, Michigan, so I'm not exactly interviewing with major technology companies regularly. But I have worked for MSPs, which glossed over technical interviews in favor of culture fit as well. (MSPs are especially incentivised to have employees with certs; they get discounts and kickbacks.)

The discounts and kickbacks matter far less than referrals from the major tech company vendor for, say, integration work that's contracted. Most tech work outside major cities comes down to basic IT or contracting for the big tech companies that don't want to deal with enterprise drivel.

Thanks for the downvotes on these comments. I'd love an explanation as to why - am I not contributing to the discussion with my comments?

I guess maybe I insulted some CCNA or MCSE holders.

I downvoted you because your comment was off topic. The discussion is about the interview process for business application developers. That is a very different role than what you do.

I respectfully disagree.

The article starts with:

>This is a story about my interview experience in the tech industry.

I also work in the tech industry, and I was sharing my experience of interviewing in the tech industry.

More specifically, I was responding to a comment that said:

> Hiring in CS / IT is massively more objective compared to most other fields.

Can someone explain why this is downvoted (it appears grey to me)? Is his content wrong/false?

> Hiring in CS / IT is massively more objective compared to most other fields.

That is the illusion many people have convinced themselves of, but literally zero of my coworkers could answer the questions I was asked, simply because they don't have similar responsibilities, despite us applying for literally word-for-word identical job ads.

The reality is, unless you do X regularly, you simply aren't going to be able to impress people with your answer to X.

Unfortunately it's objectively wrong.

It's really not an either/or situation. Hiring could obviously improve, but the current process of generic whiteboard problems is effective, if imperfect.

It's just really hard to decipher someone's ability when everyone puts technical experience on their resume because they took an online class or read a book on something once.

But don't many thousands of lines of code for personal and professional projects, written over many years, count for anything?

1. It gets your foot in the door and gets you an interview.

2. It likely gives you valuable experience to answer white board questions.

I don't think it's a perfect process, but it weeds out fakers/resume padders. Also, a lot of companies aren't looking for specialists but want broad skills to handle a variety of problems, because the problems 5 years from now could be very different for the company.

I don't think many software engineers deal with "computer science problems" in their everyday jobs.

GitHub stars are worth much more, IMO. They show that other developers value your work. That says a lot.

In the end, we have to realize that most developers are just average. Why go through the ridiculous process of finding average developers who by luck (or some homework) happen to solve the problems you throw at them perfectly?

GitHub stars are a popularity contest that has little correlation with the quality of your work. You get them when lots of people happen to look at your GitHub profile, for one reason or another. I recently discovered this when a project of mine [1] that had been on GitHub for six months got starred by a prominent open source developer at my company. Within the next week, 50 random people came through and starred the same project. It's not my best code, it's not complicated, and I doubt most of those people ever tried it. Instead, they starred it because they're the kind of people who happen to star a lot of things, and they saw it from somewhere else.

I'm always happy to get a kudos on GitHub. But those of us who don't play the game of promoting our own code on social media, or who have code we can't post on GitHub, would be at a significant disadvantage in a hiring process based on that metric.

[1]: https://github.com/tdeck/teambot

Most developers are average and most jobs are average. How many people really get to think about breadth-first search or binary trees daily? Most of us spend our days wondering why Chrome doesn't render a div in the expected way, or why some API doesn't do what it should, and finding workarounds.

> Why go through the ridiculous process of finding average developers who by luck (or some homework) happen to solve the problems you throw at them perfectly?

Never underestimate the value of hard work, preparation, and a can-do attitude. It eclipses natural talent every time.

The hardest workers with the most successful can-do attitudes will simply reject your company if you ask them to do bullshit like algorithm trivia or HackerRank tests.

Such trivia is the opposite of useful hard work; preparation for it truly wastes time that could be better spent on other things; a can-do attitude would imply rejecting or ignoring requests to do such things.

As a result, the people who actually do take the time to complete it will not be completing it because of a good attitude or work ethic, yet you will mistakenly believe they are.

Or they already learned it because they worked hard during school, and they work hard to keep themselves current.

I think the concerns people have about the efficacy of this interview style are valid, but extending them to the point where you start to make claims about how people who can pass these interviews aren't as good is ridiculous.

No, overfitting is a real thing. Overfitted learning algorithms are generally worse at generalizing their ability to broader examples and new situations.
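The analogy can be made concrete with a toy sketch (entirely illustrative, not anyone's real hiring model): a "model" that memorizes its training examples is perfect on them and useless off them, while one that captures the underlying rule generalizes:

```python
# Both "models" see the same training examples of doubling a number.
train = {1: 2, 2: 4, 3: 6}

def memorizer(x):
    # Perfect on the training set, fails on anything unseen.
    return train.get(x)

def generalizer(x):
    # Learned the underlying rule instead of the examples.
    return 2 * x

# Both score 100% on the training data...
assert all(memorizer(x) == y for x, y in train.items())
assert all(generalizer(x) == y for x, y in train.items())

# ...but only one handles a new input.
print(memorizer(10))    # None
print(generalizer(10))  # 20
```

The interview analogue: both candidates ace the questions they prepared for; only one of them is worth anything on the problems you didn't ask.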

The types of candidates who spend the time necessary to memorize algorithm trivia for the sake of passing these exams are exactly like overfitted learning algorithms. What they happen to know is unlikely to generalize well. Of course you could get lucky and hire someone like that who can generalize, but that's rare. More often, since hiring is political, you pat yourself on the back for how "good" the candidate is (based on some trivia) and make excuses when their on-the-job performance isn't what you'd hoped, and find ways to deflect attention from that so that you, as an inefficient hirer, won't be called out on it.

Willingness to waste time overfitting yourself to algorithm trivia absolutely predicts worse later-on performance than candidates with demonstrated experience and pragmatism (e.g. I'm not wasting my time memorizing how to solve tricky things that rarely matter. I will look them up / derive them / figure them out when/if I need them).

If given the choice between hiring a math/programming olympiad winner vs. a Macgyver/Edison-like tinkerer who may not be able to explain how to convert between Thevenin and Norton circuit forms, but who took their family radio apart and put it back together, Macgyver/Edison wins every time (unless you're hiring for bullshit on-paper prestige, and of course many places are while proclaiming loudly that they aren't).

Totally disagree with the final sentence. Most math/programming olympiad winners are way more than capable of handling anything the Macgyver/Edison type would be good at. At least in my industry, every olympiad winner has been a consistently spectacular performer, and I have absolutely no qualms heavily biasing myself towards that credential.

Having worked with some such folks in a quant-focused field, I can say that their performance was no better than average and in some cases worse. After a few hires like this, my boss actually pulled me aside and asked me to take over one of the main hiring projects because he and a few other managers were unhappy with the way the emphasis on this type of paper credential had failed to produce good enough analysts, and they wanted a process more focused on probing someone's experience and ingenuity. I didn't enjoy it at the time, but that 9-month project focused on that team's hiring needs (which took me away from some of my technical projects for more time than I liked) ended up teaching me a ton about what candidates in general look like (at least for that type of firm).

But I grant this is reasoning just from the anecdata that I have. I can believe that winners perhaps represent a higher degree of skill, but then we're talking about an extremely small number of people.

Generally you're facing a tradeoff where you have to choose between a sort of rustic self-reliance skill set and a bookworm skill set. People from either group can learn the other over time, but you can't predict how well by testing them solely on trivia from their current main group. My preference is to hire for self-reliance and teach the bookworm stuff later. I used to believe the reverse (e.g. hire someone good at math because they can always learn to be an effective programmer later), but my job experience changed my mind (e.g. actually it's pretty easy to teach people stochastic processes, machine learning, or cryptography, but it's incredibly hard to teach people how to be good at creative software design).

But in the original author's interview recaps, it's pretty evident that he's NOT comfortable "generalizing his ability to new situations". A BFS is not a super exotic, highly specialized algorithm. It's pretty much a bread-and-butter application of general principles (a linear search of children, plus storing away hypotheses for later).

> Overfitted learning algorithms are generally worse at generalizing their ability to broader examples and new situations.

Source? or just trying to justify your own shortcomings?

Here, let me Google that for you.

[0] < https://en.wikipedia.org/wiki/Overfitting >

> Overfitting generally occurs when a model is excessively complex, such as having too many parameters relative to the number of observations. A model that has been overfit will generally have poor predictive performance, as it can exaggerate minor fluctuations in the data.

[1] < https://en.wikipedia.org/wiki/Generalization_error#Relation_... >

> The concepts of generalization error and overfitting are closely related. Overfitting occurs when the learned function f_S becomes sensitive to the noise in the sample. As a result, the function will perform well on the training set but not perform well on other data from the joint probability distribution of x and y. Thus, the more overfitting occurs, the larger the generalization error.

Shall I fetch a ruler?

I apologize.

> The types of candidates who spend the time necessary to memorize algorithm trivia for the sake of passing these exams are exactly like overfitted learning algorithms.

I should have asked for a source for that portion. It's the part that is ridiculous.

Again, my bad.

Sigh. I wish that instead of the sarcasm, you had actually introspected on the toxic and needlessly aggressive way you're defending rote memorization of algorithm trivia. For example, you saying

> Source? or just trying to justify your own shortcomings?

is clearly, unequivocally unprovoked and needlessly antagonistic. (What do my shortcomings have to do with my point? Either you engage with the claim or you don't, but using ad hominem insults is not a valid discussion tactic. Yet you seem to assume no responsibility for your completely unprovoked hostility, and continue on with sarcasm, even sarcasm about your own ridiculousness.)

In terms of data, obviously no study is perfect and we should not simply base everything on a single study, but the 2012 PISA results offer at least some evidence < http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volum... >.

The particular sections of the actual study that cover poor performance of those who focus on memorization is not available in the free sample (Chapter 2), but it was reported on, e.g. here: < http://hechingerreport.org/memorizers-are-the-lowest-achieve... >, including this:

> The U.S. has more memorizers than most other countries in the world. Perhaps not surprisingly as math teachers, driven by narrow state standards and tests, have valued those students over all others, communicating to many other students along the way – often girls – that they do not belong in math class.

> The fact that we have valued one type of learner and given others the idea they cannot do math is part of the reason for the widespread math failure and dislike in the U.S.

There is also the discussion from Google that test scores and brainteasers do not predict later-on job success < http://www.newyorker.com/tech/elements/why-brainteasers-dont... > (and many algorithm interviews are absolutely the same kind of brainteaser nonsense).

From the New Yorker article:

> The major problem with most attempts to predict a specific outcome, such as interviews, is decontextualization: the attempt takes place in a generalized environment, as opposed to the context in which a behavior or trait naturally occurs. Google’s brainteasers measure how good people are at quickly coming up with a clever, plausible-seeming solution to an abstract problem under pressure. But employees don’t experience this particular type of pressure on the job. What the interviewee faces, instead, is the objective of a stressful, artificial interview setting: to make an impression that speaks to her qualifications in a limited time, within the narrow parameters set by the interviewer. What’s more, the candidate is asked to handle an abstracted “gotcha” situation, where thinking quickly is often more important than thinking well. Instead of determining how someone will perform on relevant tasks, the interviewer measures how the candidate will handle a brainteaser during an interview, and not much more.

The other thing we have to fight against is self-selection. It's not necessarily in everyone's interest to publicize that their riddles and algorithm hazing process isn't working. Especially not for start-ups, which need investors to feel like they are crammed to the brim with stereotypical nerds or something. So if you go looking along the lines of "well, my company uses riddles and we have hired well", you're already done. You haven't hired well. Your company (if it's a startup) probably isn't even remotely proven yet, even if it's well-funded, and the jury is still out on whether the people you rejected really should have been.

There's nothing 'current' about BFS. It hasn't changed. You'll never write one to meet a real, work-related need. It's just a memory test that puts recent grads at an advantage.

Yes, people will write them; the fact that you do not means nothing. The software field is large; do not project your experience onto the rest of the industry.

Ok, people will write them, but very few people should write them.

Also, the question is: is the ability to write a BFS on a whiteboard a useful way to screen programming candidates for most positions? No, it is not.

Tree traversal is such a basic algorithm that anyone who's done any amount of programming shouldn't have trouble working out how to do it, depth- or breadth-first, even if they haven't done it before. If a potential candidate can't figure that out, even in simple pseudo-code, and even with guidance from an interviewer, I can't imagine ever hiring that person.

Next to arrays and lists, trees are such a fundamental part of computer programming that I can't find an excuse for not being aware of some basic operations. I am not suggesting that RB or AVL trees and all the tricky stuff about them should be your bedtime reading, but a certain baseline should be established.
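That baseline is small enough to sketch. Here's an illustrative Python version (the `Node` shape is my own assumption, not anything from the thread); breadth-first vs. depth-first differ only in which end of the frontier you pop from:

```python
from collections import deque

class Node:
    def __init__(self, value, children=None):
        self.value = value
        self.children = children or []

def traverse(root, breadth_first=True):
    """Yield values breadth-first (pop the front of a queue)
    or depth-first (pop the back, i.e. use it as a stack)."""
    frontier = deque([root])
    while frontier:
        node = frontier.popleft() if breadth_first else frontier.pop()
        yield node.value
        frontier.extend(node.children)

# A small tree:  1 -> (2 -> 4, 3)
tree = Node(1, [Node(2, [Node(4)]), Node(3)])
print(list(traverse(tree)))                       # [1, 2, 3, 4]
print(list(traverse(tree, breadth_first=False)))  # [1, 3, 2, 4]
```

Swapping a queue for a stack is the entire difference between the two orders, which is why interviewers treat this as baseline material.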

Many highly skilled and experienced programmers would struggle with this kind of thing in a whiteboard interview, but then if hired would be very good at the job.

The interview scenario is just not similar to on-the-job coding. It has different social pressures, time pressures, access to help, access to privacy, etc. etc.

Plus, for someone like me (I studied machine learning and stats, and all data structure knowledge is self-taught post college, yet I have 6 years experience writing scientific code, performant database stuff, etc.) I am always hearing about data structures and algorithms that are supposedly "fundamental" but I never even heard of them before and have yet to need them in a job.

Basically, all of my experience, and all of the experience of hiring software developers at any of the companies I've been at, completely suggests that what you're saying is not true. It's just a story we tell to let us keep our crab-mentality hazing rituals.

Not BFS exactly, but I did a breadth-first traversal of a graph a couple of weeks ago.

Could I have dug up a library to do what I needed? Sure, I guess. But it was easier to take 20 minutes and just write the tool.

> Easier to take 20 minutes and just write the tool.

What? That would be a huge red flag in my book. Where are your unit tests? I'm not trusting your 20 minute off the cuff reproduction of classic algorithms in any business critical piece of the code, not ever.

This would get you booted from a lot of places, or at least earn you a stern talking-to, for doing something that seems slick, cool, and time-saving in the short term (yay, let's roll our own!) when really it's immature and time-wasting in the long run.

Never (!) homebrew that shit unless you have to (like, you're in an embedded environment or your use case requires some bleeding edge research algorithm).

It's like seeing someone write their own argument-parsing code. Holy shit, what a bad idea. Never (!) do that.

The pseudo-code of BFS is all of 20 lines, and that's accounting for a graph possibly containing cycles. A graph without cycles (such as a tree) would be even simpler/shorter.

Correct arg parsing is actually significantly more complicated in comparison because of all the possibilities and edge cases.
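For what it's worth, the 20-line claim holds up. Here's an illustrative Python sketch (the adjacency-list dict representation is my own assumption), with cycles handled by a visited set:

```python
from collections import deque

def bfs(graph, start):
    """Return the nodes reachable from start, in breadth-first order.

    graph maps each node to a list of neighbours; the visited set
    keeps cycles from causing an infinite loop.
    """
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbour in graph.get(node, []):
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    return order

# A graph with a cycle: a -> b -> c -> a, plus b -> d
graph = {"a": ["b"], "b": ["c", "d"], "c": ["a"]}
print(bfs(graph, "a"))  # ['a', 'b', 'c', 'd']
```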

> Never (!) homebrew that shit unless you have to (like, you're in an embedded environment or your use case requires some bleeding edge research algorithm).

I can't help but think this mentality is what led to the recent left-pad debacle.

> I can't help but think this mentality is what led to the recent left-pad debacle

It borders on a category error to compare homebrewing argument parsing with an overreliance on microframeworks extending all the way down to leftpad. Argument parsing is almost always supported directly by the language implementers in a standard library. And even if it weren't, it's such a critical task, with huge overhead for handling corner cases, cross-platform details, etc., that the value of a central implementation is obvious. A left-pad operation ought to be part of the standard language (which is actually what the leftpad debacle was about: JavaScript's incredible failures as a language), but it's too trivial to compare to something like argument parsing.
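To make the contrast concrete, here's a hedged sketch of leaning on the standard library instead of homebrewing; Python's `argparse` covers flags, type conversion, defaults, and generated help text (the specific flag names below are invented for illustration):

```python
import argparse

def build_parser():
    # Illustrative flags only; the point is how much argparse handles:
    # short/long option aliases, type coercion, defaults, and --help.
    parser = argparse.ArgumentParser(description="Process an input file.")
    parser.add_argument("path", help="file to process")
    parser.add_argument("-v", "--verbose", action="store_true",
                        help="enable chatty output")
    parser.add_argument("--retries", type=int, default=3,
                        help="number of retry attempts")
    return parser

args = build_parser().parse_args(["--verbose", "--retries", "5", "input.txt"])
print(args.path, args.verbose, args.retries)  # input.txt True 5
```

(The left-pad analogue is similarly a stdlib one-liner in most languages, e.g. `"5".rjust(3, "0")` in Python.)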

> The pseudo-code of BFS is all of 20 lines, ...

If it's only 20 lines, that's great because it means it was easy for the other 500 library writers to write it, write tests, and observe and fix issues over time. So, whew, that's 20 lines I totally shouldn't waste my time on, plus unit tests and routine maintenance I don't have to commit to.

And also, given the wide acknowledgement that a good programmer should be writing at most a few hundred lines of code per day (otherwise it's probably mostly junk), 20 lines of code is not trivial.

Probably most programmers write a whole lot more than that per day (especially if counting copy/pasted code), but this is in part a sign of a bad programmer, or perhaps more so a sign of the anti-quality constraints placed on them by most employers.

What the hell sort of library would you crack out to do a breadth first search traversal? This is my main problem with the OP, BFS isn't some arcane complex cryptic algorithm, it's a way of moving through a data structure that becomes natural and very obvious when you think about how you can effectively traverse it for some tasks.

If I were ever to want to traverse something in a breadth first search manner, then I would just write it. It would take longer to search for a library and read how it implements this search than it would to write it, test it, and get it reviewed, and we've still avoided adding another potentially crappy dependency to our project.

It's not a category error to compare this to leftpad. leftpad actually makes more sense imo, as it's a common microfunction that may be used more than once. BFS is typically far more coupled with the underlying data and is more naturally written inline.

> we've still avoided adding another potentially crappy dependency to our project.

No, you've just added a crappy dependency anyway: your off-the-cuff implementation (which is even more work, since you have to handle issues and unit testing for it too).

> What the hell sort of library would you crack out to do a breadth first search traversal?

In Python I would use networkx probably, unless there was a good reason I couldn't (such as working in an embedded environment where I couldn't install large libraries). It ought to require an exceptional circumstance to stoop to implementing it myself.
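
For example (a hedged sketch assuming the networkx package is available; `bfs_edges` yields the edges of the breadth-first tree rooted at the source node):

```python
import networkx as nx

# Build a small graph and traverse it breadth-first with networkx,
# rather than hand-rolling the traversal.
G = nx.Graph()
G.add_edges_from([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")])

# Visit order starting from "a": the source, then the target of each
# BFS tree edge in discovery order.
order = ["a"] + [v for _, v in nx.bfs_edges(G, "a")]
print(order)
```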

Tell me, if we switched from talking about breadth first search to, say, inverting a matrix with Gaussian elimination and pivoting, do you feel the same? Are you going to trust your implementation over something you can get from netlib?
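
To make that concrete, the off-the-shelf route in Python is a LAPACK-backed one-liner (NumPy here is my stand-in for netlib, not something the comment prescribes):

```python
import numpy as np

# Solve Ax = b via LAPACK (with pivoting) instead of hand-rolling
# Gaussian elimination, which is easy to get subtly wrong.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])
x = np.linalg.solve(A, b)
print(x)  # the solution vector
```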

In the case where you'd require a bfs to traverse a graph, you're already doing something that is algorithmic, and will otherwise be prohibitively costly if explored in a dfs manner. The most complicated part of this algorithm is highly unlikely to be the bfs itself, with its likely 10 lines of code in total.

In this case, someone who didn't have the time to visualise exactly how the bfs is operating shouldn't be writing the algorithm. Pretending you don't need to know the details of how it traverses the tree, when you needed to choose this specific implementation for the algorithm to be performant, is a lie. It's not like Gaussian elimination, where the concept is far more abstract; this is a concrete idea of how you're moving through the graph and shouldn't be some black magic.

On top of this, who writes this code and doesn't test that it does what you expect it to do on sample data? If your bfs implementation was bust, then you'd miss nodes or revisit them, which becomes immediately apparent from testing the algorithm you're writing.

> It ought to require an exceptional circumstance to stoop to implementing it myself.

Maybe the problem here is that I see a bfs as a fundamentally simple algorithm. Saying writing it yourself must be an exceptional circumstance sounds about as crazy to me as saying no one should ever use for loops because they expose you to mistakes with an index you may forget is zero-based, so please use anonymous iterators instead. When something takes 5 minutes to write, is heavily tested by default (as it will be a feature under test anyway, and possible flaws will be highly obvious), and is at the same level of abstraction as the code you'll be writing around it, then just write the damn thing.

EDIT - Try writing a BFS over an acyclic graph that outputs the value of each node, one that compiles, runs, and prints at least two node values, yet still has a bug. Try deliberately writing bugs into it; it's actually difficult to screw up.

> Maybe the problem here is that I see a bfs as a fundamentally simple algorithm.

Yes, this kind of hubris is often the root of the problem when so-called 'simple' algorithms get homebrewed, then later on unexpected corner cases pop up, testing isn't adequate, you didn't implement it in an extensible way, your implementation is unacceptably inefficient, etc. etc.

> On top of this, who writes this code and doesn't test that it does what you expected it to do on sample data. If you bfs implementation was bust, then you'd miss nodes or revisit them, which becomes immediately apparent from testing the algorithm you're writing.

Yes, but the point is that testing isn't free. If you commit to writing your own implementation of something to interface with whatever larger business / domain-specific project created the need for it in the first place, then you have to dedicate not just the time to write it, but also the time to test it, and to test how it integrates, and to maintain the API for it and maintain its integration (possibly backward compatibility, etc. etc.)

If you adopt a standard to use a certain library, then up to the degree to which you trust the library, those problems are mostly offloaded to others. For things like classical algorithms, the libraries are extremely trustworthy, so many of the pitfalls of adding a dependency don't apply.

> In this case, someone who didn't have the time to visualise exactly how the bfs is operating shouldn't be writing the algorithm.

It's not about being too pressed for time. Even if you had all day, it's a poor use of time to create more future work for yourself by yoking yourself to all the extra responsibilities that come along with rolling your own. Yoking yourself to those responsibilities should require an exceptional reason for doing so.

> Saying writing it yourself must be an exceptional circumstance sounds about as crazy to me as saying no-one should ever use for loops, they expose you to making mistakes with an index you may forget is zero-based, please use anonymous iterators instead.

This is an extremely fallacious comparison. Basic control structures are built-in features of the language. You don't have to test or maintain the runtime execution of a for loop; that's the job of language designers. Making mistakes in the usage of something (like an off-by-one index) is a complete non sequitur in this entire discussion. I don't see how you would think that type of bug is related to anything I'm saying.

In general, you also assume a much higher fidelity of testing than what happens in the real world. In 99% of software jobs, you'll be extremely lucky if someone even documents their homemade BFS algorithm, let alone writing even the most superficial of tests. Expecting them to give you an adequate bank of automated tests is like believing in the Easter Bunny.

> Try writing bugs into this, it's actually difficult to screw up.


In practice it's extremely easy to mess up Gaussian elimination and get implementations that are bad or just plain wrong.

I totally agree. And no one should let hubris make them think otherwise for e.g. graph traversal algorithms.

There is, for instance, the Boost Graph library in C++: http://www.boost.org/doc/libs/1_60_0/libs/graph/doc/table_of...

I would use one of the tree implementations in java.util. You may have heard of it.

Agreed. Spending twenty minutes finding a good library would be time much better spent

> It's like seeing someone write their own argument-parsing code. Holy shit, what a bad idea. Never (!) do that.

So we did that. Works well. I dunno, maybe this company is just full of incompetents. I mean, I doubt it, but it is possible.

I appreciate judgement without context as well. Really, it's awesome.

If you did write your own argument parsing code, then yes the choice to commit to that was incompetent, barring some extraordinary situation.

I don't think it's controversial or insulting to say so in the least. It's a solved problem in every major language and most fringe languages. Spending time re-inventing it is just obviously wrong.

It's not insulting or judgmental to say that it's wrong. It's just a fact. I do not understand being offended or angry about my statement of that fact. If we're talking about what is effective engineering and what is not, writing your own argument parsing tool for actual business usage (as opposed to doing this as a pedagogical side project or something), without evidence of some exceptional corner case, is ineffective engineering. It's practically the definition of ineffective engineering.

Are GitHub star-bots the future?

Are you frickin' kidding me?

If you think that AT LEAST one part of the hiring "process" isn't broken, you're living in lala land.

Let's just pick one: recruiters. You're telling me that every single tech recruiter you've come across is excellent at what they do, an amazing and decent human being, and has amazing communication skills?!

(Generally/Most of the time) Recruiters don't--and yes, they should--care about candidates. They're interested in pocketing their cut of the candidate's salary, so as soon as the company says they're not interested in the candidate, the recruiter is done. Most people don't get call-backs even informing them that they're out of the running, much less a reason as to why they weren't deemed qualified.

Recruiters--especially tech recruiters--are NOTORIOUS for being bottom-dwelling scum. I actually know a few good recruiters, but they are--by far--in the minority.

The fact that you don't recognize this indicates that you've either had extraordinary/atypical results, or aren't very familiar with the "process". Dude.

Edit: spelling, puncturation

Hiring IS in fact broken, dude. Did you read his whole piece? It's not just pulling questions off of HackerRank or writing full data structure implementations from scratch at a whiteboard while being timed, or whatever. The author also mentions the culture of unprofessional recruiters and interviewers who show no interest in the candidate or in selling him on the job (yes, it's a two-way street). If someone takes time out of their day to talk to you or come to your office, you owe them a little professional courtesy and respect. I think lots of these companies have no idea how poorly some people are representing them and their company during the interview/recruiting process. And these aren't just early-stage startups that might not know any better; many of these are the Twitters and Netflixes of the world as well. So yeah, it's broken.

How do you know his skills aren't useful to the market? That seems like a terrible thing to say.

I don't interview well. Technology for me is exciting because of its potential to bring happiness to people. If someone tells me that the product I am working on has a security issue because it does not follow some of the security portions of the RFC correctly, I will be delighted to read it carefully, study arcane encoding algorithms, etc., and implement code that brings the compliance up to an acceptable level. If you ask me to devise an algorithm to find the longest increasing subsequence in an array of numbers without any context, I won't do so well.

Maybe I am conceited, but I can't convince myself I am an inferior programmer because of it. I can make money for your company. I can work well with a diverse group of people. I take pains to conduct myself with integrity. Shouldn't all that be the central measure of how people are hired?

For what it's worth, if you were interviewing for a security job and couldn't solve the longest increasing subsequence problem within a minute, I would probably say "no hire". All security folks I know can solve such problems in their sleep, and it seems really important for the job. Or at least you'd have to show your algorithmic skills in some other way, without relying on "business" rhetoric. Does that sound reasonable?

This is a false correlation. What is it about longest increasing subsequence that is relevant to security? If you can't explain, then it has no relevance and means nothing.

Being intelligent enough to solve the problem is what's relevant.

Yeah, but how do you measure it objectively?

"But they do expect you to be able to reason from a problem statement to something approximating a solution."

In under 45 minutes? Are you really concerned with their ability to even naively solve an algorithmic problem in that time frame? I would think not. No, this is not only flawed, but it's so deeply flawed (and easily exploitable) that you're not even interviewing for skill set anymore, but simply for 1) someone's luck in having been exposed to the problem before, or 2) their ability to remember things they don't use day to day.

Congratulations, you're hiring lucky people or idiot savants.

My friend recently got a job at a fund when her now boss asked her, "what are some of the more recent innovations in the Haswell chip set that might be advantageous to a fund?" She had literally just read an article about this--out of pure coincidence--the day before, and she was able to recall it when the question was asked.

She was offered the job on the spot.

> In under 45 minutes?

Yes, absolutely.

> A few popular open-source projects don't necessarily speak to your talent as a programmer.

If having source code and demonstrated traction for a few open source projects to review don't speak to one's talent as a programmer, then what does?

"The first round had four interviewers in the room. It was an open-ended discussion about me, my accomplishments, my projects. To be honest, I was not prepared for this interview format, so my presentation was probably all over the place"

A professional programmer, the one hired by a company to work alongside other professionals - as opposed to one working by themselves on small projects - is capable of presenting previous experiences and applying them to new problems. Unless you want to be put in the basement and handed small chunks of work to solve. It's about how you mentor juniors, resolve disagreements on how to proceed, and deal with setbacks and changing requirements.

Given the whole post is about how terrible the interviewing process is, the above quote is the most telling, and backs up the GP's point - not being prepared to have a conversation about your previous experience when interviewing for a job is pretty bad.

And this is why there is an impedance mismatch... the people doing the interviewing treat the role they're filling as a "computer science" role when really they want a programmer. The two are different.

As an Information Engineer, you're totally right.

In almost any interview I've had, I've been asked about CS stuff. Never anything related to my field.

I don't blame them. They ask about what they know and what they think is important, even if it's not relevant to the position.

> Information Engineer

What's this?

Additionally, a good programmer (or worker in general) should be good at much more than programming; like being able to collaborate & socialize with other team members & management without immediately resorting to "Fuck you, I quit".

The author seems to have a chip on his shoulder...

Just out of curiosity: how many bad interviews, or interviews with a rejection at the end, have you had in a row? Or total?

Zero(?), but I have also never had a "great" job that I had to fight for. Regardless, I think everyone knows through experience that an employee's technical skills are practically useless if said worker is socially inept.

And what are your social skills? You, who can't even see that the "fuck you, I quit" attitude does not come up first out of nothing but after a series of rejections and nonsensical situations, i.e. that it is a consequence of successive experiences and not the cause of them. You, who decide unilaterally, from your failure to understand the logic of events in a blog post, that the blogger is once and for all "socially inept". Great social skills without a bit of understanding and empathy, no doubt.

Gee, I am also very tired of people for whom stuff went well, and who think it must be the same for others. Can't you even imagine there is a variety of situations out there? Can't you imagine it may happen to you one day? Do you think people who encounter this kind of problem never had a shiny position earlier, and that their "social skills" were considered perfect before some conceited internet stranger decided they were "socially inept"?

And you dare to talk about "experience", seriously?

You started your post with a veiled ad hominem attack, which instantly sets an illogical, angry tone... :( Yeah, I have no social skills, so what? I may also have a below average mind, so what?

I did not call the author "socially inept"; that sentence was a hypothetical example of how an individual's incredible technical skills can be overshadowed by other problems.

I am sorry if I offended you. I thought my posts were critical but logical...

Sure, but what I was trying to get at is: if you had a string of interviews that felt "bad", and/or rejections, you would probably be a little sour, and that's what I get from the article. It doesn't strike me that the author is particularly socially inept.

As someone who watched previous coworkers, who are older, go through the interview wringer, I can sympathize with the author on the pigeonhole one can get stuck in due to seemingly trivial reasons: I don't have a classic CS background, people think I'm past my prime, etc.

These aren't imaginary problems, and I struggle to believe that I would be all "go-getter" and "pull yourself up by your bootstraps" after continually being rejected, too.

I guess my confusion stems from the article's lack of a conclusion. Yes, _blank_ is sub-optimal, but how did you overcome the obstacle?

What do you think about OP's comments that the algorithm questions were unrelated to the front-end positions?

Complaining that breadth-first-search is irrelevant to frontend dev seems concerning to me. The DOM is a tree; so are JSON structures; so are website directory structures; so are half the data structures you will encounter in most businesses (departments, categories, etc.). You WILL need to traverse a tree in a frontend position at some point. I absolutely want to know whether a frontend developer has a variety of tools in their toolbox for handling tree structured data - and ideally the ability to refactor between recursion (using the call stack), explicit stack descent (which will be depth first), and queue based descent (which will be breadth first).
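
To illustrate the refactoring point, here is a sketch (in Python rather than browser JavaScript, with a tree represented as nested dicts, both my own assumptions): swapping which end of the deque you pop from is all it takes to flip between breadth-first and depth-first traversal.

```python
from collections import deque

def traverse(root, breadth_first=True):
    """Traverse a tree of {'name': ..., 'children': [...]} nodes.
    popleft() treats `pending` as a queue (breadth-first);
    pop() treats it as a stack (depth-first, children in reverse order)."""
    pending = deque([root])
    order = []
    while pending:
        node = pending.popleft() if breadth_first else pending.pop()
        order.append(node["name"])
        pending.extend(node["children"])
    return order
```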

It would be incredibly rare to implement a breadth-first algorithm in front-end work. I mean, a question about even the most obscure browser quirks would probably be more relevant.

I think that's an unfounded and problematic stance.

Front-ends almost always involve working with data. It is increasingly common for front-ends to have filters, searching, sorting, etc... of that data as well.

Doing that efficiently is key to ensuring a fast, responsive UI.

Wrangling with divs and CSS may take a disproportionately large amount of your time but that doesn't mean it's the most important skill to look for, nor does it mean the CS questions are somehow irrelevant to the work.

Of course they're unrelated. The interviewer asking this doesn't expect you to write these algorithms all day, they want to know that you have the basic mental dexterity to figure things like this out. They are, ideally, abstracted problems of the complexity that you'd be expected to be able to solve on your own.

There are some positions where you don't have to care if people can really problem-solve; as long as they can tweak CSS until it works right-ish, or know the particular arcane implementation details of installing some WordPress module, they can do the job. But there are a lot more positions where you want someone who can solve the problems that need solving, pick up new technologies as needed, and just generally be a flexible and productive contributor in a way that doesn't relate to the fiddly details of one particular technology.

> they want to know that you have the basic mental dexterity to figure things like this out

If you have the "mental dexterity" to come up with your own solution that matches Dijkstra's pathfinding algorithm in terms of complexity within an hour-long interview, you are a god among mortals.

And that's what they expect: a solution to a problem that is equal to the best minds of our field. I say this from experience: I was expected to do exactly that, and criticized when I came up with an O(n^3) solution instead of O(n^2).

Solving a maze isn't identical to Dijkstra's algorithm. And Dijkstra's algorithm was specifically chosen to be an easy problem with an understandable answer, to show off a new computer. It was never some grand unsolved problem. Yes, it's reasonable to expect somebody with a modern data structures education who mysteriously hadn't heard of it to be able to invent it. Or to reinvent it, having forgotten it, because it's your basic breadth first search with care taken not to get "lost in the woods."
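
For reference, the "BFS with care taken" framing is fairly literal: a hedged Python sketch of Dijkstra's algorithm is just breadth-first search with a priority queue keyed on distance (the adjacency-list input format is my own choice):

```python
import heapq

def dijkstra(graph, start):
    """Shortest-path distances from start over a graph given as
    {node: [(neighbor, weight), ...]}. Breadth-first search, except
    a binary heap replaces the plain FIFO queue."""
    dist = {start: 0}
    heap = [(0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry; a shorter path was already found
        for neighbor, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(heap, (nd, neighbor))
    return dist
```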

They aren't asking those questions to see if you have mental dexterity. They are asking them because they asked hundreds of people that same question before, and they kid themselves into thinking it is an objective measure because, well, everyone working at the company passed it and everyone here is a 10xer, so if you don't pass it you are a loser.

I find the 10x funny..

If everyone is a 10x, then everyone is a 1x... isn't it?

Passing such interview questions can only prove: 1. you are prepared, 2. you are eager to get a job, 3. you are not dumb.

Hiring is broken if it weeds out people who don't interview well, rather than people who don't code well.

It's also broken if it simply selects for people who can only code well - with little-to-no insight into how well they collaborate and work with a team.

In this case, I think an argument can be made that hiring worked for the firms involved.

Because of or in spite of?

> No reasonable interviewer expects you to recollect breadth-first search flawlessly, on demand, onto a whiteboard.

Wait, really? I've had to do more than that just for internships, the bar for perfection at many companies is just that high or higher.

I agree with the rest of what you're saying.

The 960-day GitHub streak is an illustration of his passion for coding, of the fact that he is doing it for the pleasure of it. He said it himself that it's not there to show any skill.

The hubris in this response is unbelievably high! Have you ever considered that YOU could also be systematically decimated in an interview designed around your weaknesses?

Heck, what if I picked 5 problems from a national-level math Olympiad? You've surely studied high school math, right? You should be able to solve them in the 5-hour interview I invite you to!

I think hiring is really steered to avoid hiring idiots. After all, it's the interviewer who will have to answer for it if they hire someone who can't do the job. So all questions are very technical/random and difficult. Most of these people just look up stuff from textbooks and fling it at interviewees. It eliminates the top and bottom contenders while catching mostly people who just came out of school (or are ridiculously smart).

Hiring does need to avoid hiring idiots, but I've found that it's more effective to ask easy questions. Seriously. People who can't program a digital computer have just as much trouble, and there's less opportunity to fake it. Easy questions produce better signals.

I was asked once during an interview to describe a specific problem that I enjoyed solving. And I was told to go in as much depth as I could during the timeslot.

I thought that was a great question. It got me talking about something I cared about, knew very well, and showed my normal thought process and development process outside of an interview. It also gave the interviewer plenty of things to ask about, to "quiz" me on if they wanted, or to just make sure I could actually code.

If I'm ever in a position to interview someone one-on-one, that's most likely going to be my main question.

I've been asked a similar question which would have been perfect for this author.

I had recently published a (terrible, but also terribly fun to write) JavaScript physics engine, which the interviewer saw on GitHub. He began with, "how does it work?" and then continued with probing questions, getting deeper and deeper into the various problems I solved while developing the library, especially focusing on tradeoffs between various choices.

By opening the discussion on "my turf," he put me at ease in the conversation, but by driving the interview with very technical questions, he also got the answers he needed about my technical ability.

This is a great approach. I do this with candidates I interview. I ask for a sample they are proud of and then I just probe away. Has worked very well.

There are a number of problems with this approach. It's difficult to standardize so personal biases are more likely to creep in.

I've asked this question in interviews a number of times, but it's primarily to gauge communication skills rather than technical skills. The biggest problem with using it as a technical screen is that candidates are really bad at choosing a project to tell you about. I've had candidates choose as their problem building out some responsive dynamic front end, and their entire solution boils down to "I picked Angular and Bootstrap."

I think the best questions are those that have an easy initial answer, but which can be expanded upon into a harder and harder direction. This way you are able to measure candidates in a repeatable way based on how far they get.

Hiring interviews often contain shibboleths that serve as cultural markers. It's a way to exclude people who are different while claiming that hiring is a meritocracy.

Any examples?

The whole idea of writing your code on a whiteboard.

It took me a year to find a new job. I got rejected from on-sites 8 or 9 times, with a few other rejections before that. I've been through everything the OP has and more. I was once forgotten in an interview room while my interviewer played foosball and then went home. I've managed to pass all rounds with "positive feedback" only to get rejected three days later. I've swum through rivers of aerated bullshit to find a new job and it sucked - but I never once believed that I was a bad engineer. Hate the game all you want (and I really do hate it), but you have no choice other than to play it or have an extremely strong network.

I didn't even want to go to the interview which landed me my new job. In my previous phone screen they were looking to hire a single person. The perfect fit. Probably not me - what the hell do I know about video players? And it's one of those interviews where they'll boot you to the curb if you do poorly in the first half. Whatever, I'll go anyway. And as luck would have it, I didn't get booted. I did damn well. And in my final phone screen with the CTO, I got asked how to find the Nth last spot from the end of a linked list.
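
(For what it's worth, the textbook answer to that linked-list question is two pointers kept N nodes apart; a hedged Python sketch, not necessarily what the CTO had in mind:)

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def nth_from_end(head, n):
    """Value of the nth node from the end (n=1 is the last node).
    Advance a lead pointer n nodes ahead, then walk both pointers
    until lead falls off the list. Returns None if the list is too short."""
    lead = head
    for _ in range(n):
        if lead is None:
            return None
        lead = lead.next
    trail = head
    while lead is not None:
        lead = lead.next
        trail = trail.next
    return trail.value
```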

I really wasn't good at interviewing for a long time. And from the outset, I didn't know everything I needed to get the job I wanted. It was consistent studying and a buy-in to the bullshit that interviewing is that landed me a new job.

I worked for (led) the same company for 8.5 years, then decided to go do something else. I was pretty certain that I wanted to be independent (again) but I also had a mortgage, was married, etc...

So while I lined up clients and started transitioning from the current job I applied for a couple positions (note that as an executive I did this by contacting other executives (when possible) at firms, not filling out forms on the website).

I had a few phone interviews, and these were fine.

But a couple times it would get kicked to HR and they would send links for personality tests - before I ever talked to anyone.

I guess if I was desperate for a job (really glad I didn't take one) I might have complied, but I decided I didn't want to work for a firm that approaches recruitment like that.

Applied to Thales UK a few months ago. I was met with 19 pages of personality quiz. Seeing that, I decided I didn't want to work there anymore, but I dared myself to finish the questionnaire. It was excruciatingly painful, and I felt no regret at all when I was rejected.

It's almost like it reinforces that you are just a commodity, meat.

I do think it's important to have a fit on teams, and I have hired people with awesome skills that were terrible cultural fits more than once.

But I'm not going to fill out personality quizzes unless I am desperate. And I work really hard to try to make sure I'm never desperate. :)

> In my previous phone screen they were looking to hire a single person.The perfect fit. Probably not me

What country was this in? In Ireland that qualifies as employment discrimination on the "civil status" ground.

Think they mean the company was hiring one person. Not that they were looking for an unmarried person.

America. It's a bit hyperbolic, but is it really discrimination to want to hire a person who meets all your requirements?

Oh, single person. Ha, it is illegal to discriminate on the basis of civil status. They meant it as "one person".

Apologies, I misread your original comment. I thought they wanted a person who wasn't in a relationship.

They would probably prefer that as well

I'm looking right now and my previous experience has made me afraid it's going to be this way for me ("rivers of aerated bullshit.") Add to it that I don't know very many people here in the Bay Area, didn't study CS, didn't go to a "name" school, and half my software engineering experience doesn't seem to count because it wasn't web dev, and I'm just seeing a very long road in front of me. I hope it doesn't take a year.

Don't sweat it if it takes a while, and don't compare yourself. Not everyone is good at interviewing at first. Just try not to burn out - when I first started, I would go crazy studying for a month, then burn out for a month. Rinse and repeat. In the past months it's been a slow, steady, and consistent effort to improve. I don't have a CS degree (ECE), went to a state school (but it has a good rep), and my previous job was at a tiny no-name company. Oh, and you get used to the bullshit. My advice: network. I got a lot of interviews (and my current job, and another offer) through networking. I went to a lot of meetups in my scene (NJ/NYC).

Your experience will count, it's just you've narrowed the pool of opportunity. Web Dev is just a bigger target with more hiring going on.

If you apply to a Web company without much web dev experience (or knowledge) then yes, it'll count for less.

But that's true of anything. The bay area is probably the easiest place to get hired in tech right now as an engineer.

I failed 7 interviews before getting my current job. I'm a self taught programmer, and I've had to beat down the door to make a place for myself in this industry. I have a huge chip on my shoulder. But that's why I'm going to beat those golden children who never had to fight for it. I may not be as smart as them, but I'm stubborn as fuck, and I never, ever quit.

There's selectivity, and there's courtesy and respect. Companies are entitled to be as selective as they like, and if those are the kinds of tests that yield the people they want to hire then good for them. On the other hand courtesy and respect would demand that you're clear with the candidate about your expectations and requirements, that you don't waste their time, that you respect the investment of time by communicating after the interview to let them know how they did, and perhaps even that you let them know up front what areas you expect to cover. Otherwise, as was noted in an earlier comment, it really is just a lottery to see if you happen to be fresh on whatever thing they happen to ask you.

There is another side to this which leads me to ask why companies feel they have to be so defensive? My wife is a registered nurse on a cardiac critical care ward. If she doesn't know what she's doing actual people can actually die, something that is a rare outcome for even the worst software developer. Nevertheless, she has a BS, and work experience, and when she interviews they don't require her to stand up at a whiteboard and prove she's a nurse all over again. They respect her experience, and the questions are more about process, work habits, personality, etc. In the software development world we appear to have zero respect for experience, and I wonder why that is? Have employers been burned so often? Or is this more of a geek cred gauntlet thing?

Nurses also have licenses and registered credentials. If they mess up badly, they can be disciplined and lose their license and career. So there is a self-regulation that takes place in that industry. Were it not so, I could imagine nursing interviews to be a lot more intense.

There is no such regulation in the tech industry. One bad hire I made faked his technical capabilities and I regretted it a lot when I had to work with him. I hope I've now learned how to spot that type of candidate (without resorting to interview hazing).

Good points. They do have licenses and credentials, and they can be disciplined if they are negligent. Not for making mistakes generally speaking, but for professional negligence. They also have to carry professional insurance against malpractice. On the other hand the consequences of hiring a nurse who isn't qualified are very much worse than the typical consequences of hiring an unqualified developer.

You are right, but the consequences of hiring bad developers are rising every year. I think licensing is coming whether the industry wants it or not.

Why? The vast majority of industry engineers in other disciplines are not licensed. Why would software engineers be any different?

I didn't say everyone would need licensing, just that it's coming. Safety critical areas will probably be first.

Even then, most engineers of other disciplines do not need to be licensed. Safety-critical products must conform to standards and must be signed off by someone. Dozens or hundreds of other engineers of multiple disciplines can be involved in designing and manufacturing those products, but only one licensed engineer of any discipline is needed to sign off on its compliance.

Companies that have the kind of engineering culture required to produce safety-critical products already treat software engineers like any other engineering discipline and expect them to act with the same rigor as other engineers. That's actually why I cringe every time I hear about a Valley software company wanting to pivot or branch into products like that.

Can you expand on the guy who faked his technical capabilities?

Way back, I was part of a 4-person team that did some phone interviews of contractors. We asked some pretty basic questions, based on their resumes, and after the call voted on him/her. One of my questions to see if they knew anything about C++ was just to ask if they could tell me what a class was. A bunch of candidates couldn't answer that question.

It really isn't that hard to ask some basic questions and talk about work experience, to get a handle on whether someone is actually a competent programmer or not. If you're looking for top-tier talent, this might not be sufficient, but most jobs do not need top-tier talent anyway.

The guy was lazy and not very competent. We hired him for a specific capability that he was supposedly doing in his previous job. He knew the words and terms but was dangerous in his position. I can only assume that his previous workplace, a very large company, kept him away from doing anything meaningful so that he didn't mess things up. That's what we ended up doing.

I later discovered his resume was an identical copy of his coworker's resume (both were submitted several months apart for the same job).

We're not talking about fraud. You fire those people and move on. You shouldn't hamstring your recruiting process for the rest of time because of it.

BTW, why didn't you check references?

Is your wife re-certified every year or two? Does she have to have a certain number of continuing education hours every year?

I don't doubt for a minute that there's some kind of alpha geek thing going on with a lot of interviews. But it's also a reflection of the fact we don't have a widely-accepted accreditation body that can vouch for people.

I think this is at the core of the problem.

As the infamous CodingHorror post points out, an awful lot of programmers can't actually program. That means interviews have to check for more than culture fit. It also drives extremely high selectivity, and when you know you'll be rejecting 90% of your applicants it becomes that much harder to give everyone a smooth and welcoming interview.

Of course, none of this deals with the problem that (as Sahat found) many software interviews don't actually check for relevant skills. That's just stupid.

My sister is a nurse (a traveling nurse, actually) and she does carry credentials. Moving to a new state requires you to get licensed all over again. However, from everything I can glean from the process, as a nurse you only have to take a certified test once. Then you just submit some paperwork to have it renewed every few years; you do not need to retake boards periodically like physicians do.

When getting licensed in a new state (e.g., moving from Montana to California), you just submit basic paperwork, pass a background check/get fingerprinted, and then you are licensed in the new state (no need to retake an exam). All these comments seem to indicate nurses face insanely high selective pressures to weed out people who suck. It's really, really not the case. The stories I hear about certain nurses who completely lack common sense are astounding.

When my sister gets a new position (even when she started travel nursing after only 1 year of experience), she has a phone interview with a company hired by the hospital to find travel nurses. They may or may not contact her previous employers/references and then she finds out within a day or two if she has the job. It typically takes her 1 week to get a new assignment, 3 weeks tops if she is being exceptionally picky and trying to work in a specific area/hospital. That's all there is to it.

She's gone through this process about 6 times in the past two years (I've even seen her do a phone interview while at a wine tasting! -- and she was offered the position) and she rarely stresses about these interviews as they are not technical.

> Is your wife re-certified every year or two? Does she have to have a certain number of continuing education hours every year?

Every time credentials come up with respect to this industry there are a whole bunch of people who complain that credentials are meaningless because people can just cheat or coast their way through. They go so far as to include CS degrees themselves in that category. What makes you think the education and credentials for nurses are so much better than for software engineers that nurses can be hired based on credentials and software engineers cannot?

Two possible answers come to mind: the first is that in order to be licensed nurses, like doctors and lawyers, have to pass an intensive test known as "boards." The second is that licensing, certification, and recertification are functions of state government, not voluntary industry guidelines.

> the first is that in order to be licensed nurses, like doctors and lawyers, have to pass an intensive test known as "boards."

Board certification is not a legal requirement in any jurisdiction I am aware of. The licensure tests and board certifications are separate, and the former is generally taken very close to when the doctor, lawyer, or nurse graduates from their program and completes their internship period.

> The second is that licensing, certification, and recertification are functions of state government, not voluntary industry guidelines.

Yes and no. State governments decide what they will recognize, but industry provides the training. Some of the shit that counts as "continuing education" for the medical profession is little better than what you get at DeVry for programmers.

That said, at least these professions have strong professional associations that birddog state licensing boards to keep the bullshit out. My original comment, in fact, was motivated by IEEE and ACM's attempts to do the same for software engineers and the way the industry seems to be laughing at their efforts.

Yes as I mentioned in a reply above these are very good points, but there is a significant difference of context, don't you think? Certification and licensing of medical personnel came about over many years because of the very great harm that could be done by those pretending to know what they were doing. What is at risk for most companies in hiring a programmer is more along the lines of a few hours of interview time, and the few weeks of ramp-up that might have been wasted. If an unqualified person can get in, go undetected, and do real harm to the company and its business then that is a failure of a different sort, imo.

> Is your wife re-certified every year or two?

License requirements are by state, but yes, there is a licensing process.

> Does she have to have a certain number of continuing education hours every year?

It depends on the state, but yes, some states require a minimum number of hours of education to renew a license.

It's not about selectivity in general. It's about companies not testing for relevant skills when they interview.

Reciting the implementation of some obscure algorithm from memory is entirely irrelevant to the daily job of a software engineer. Being able to do that does not make you a good software engineer, and most good software engineers cannot do that.

They do these "programming trivia" style interviews because it makes them feel clever and superior to the interviewees, and because it's a lot less work for them than conducting an interview that actually probes what level of software engineering talent the candidate has.

I'm torn.

On the one hand: the interview processes this post describes are hilariously broken. Stand up at a whiteboard and implement breadth-first search from memory! You know, like no programmer at their desk staring at their editor ever does. I think "that's the one where you use a queue, right?" is a fully valid and complete answer to that dumb question.

I also think you're within your rights to demand that your interviewer implement Kruskal's minimum spanning tree algorithm from memory at the same whiteboard before you have to do BFS. That's an extremely simple and important graph theory algorithm that nobody memorizes either.
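(For reference, Kruskal's really is short once it's in front of you - sort edges, keep each one unless it closes a cycle, track components with a tiny union-find. This is just a sketch, with edges assumed to be [u, v, weight] triples over vertices 0..n-1:)

```javascript
// Sketch of Kruskal's MST: sort edges by weight, keep an edge unless it
// would merge a component with itself (i.e., close a cycle).
function kruskal(n, edges) {
  const parent = Array.from({ length: n }, (_, i) => i);
  // Union-find "find" with path compression.
  const find = (x) => (parent[x] === x ? x : (parent[x] = find(parent[x])));
  const mst = [];
  for (const [u, v, w] of [...edges].sort((a, b) => a[2] - b[2])) {
    const ru = find(u);
    const rv = find(v);
    if (ru !== rv) {      // edge connects two components: keep it
      parent[ru] = rv;    // union
      mst.push([u, v, w]);
    }
  }
  return mst;
}
```

The point stands, though: nobody keeps this in working memory, and being unable to produce it at a whiteboard says nothing about whether you can produce it at a desk.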

These interviews are nerd status rituals. Really good candidates know this and game them. If you know anyone like this, ask them for stories. I've heard some great ones. But obviously, this isn't a good way to select software developers.

On the other hand...

The idea that "front-end developers" shouldn't need to be able to implement a BFS (at all) bothers me a lot. If you're a software developer, you should grok basic conceptual computer science. You should be able to work with a graph. If you're doing web work, you're working with graphs all day whether you grok that or not!

I've been doing front-end work for the past month, and I've had to do more low-level profiling and performance work here than in the previous 4 years of low-level systems work.

Maybe your idea of a front end developer differs from mine. I don't agree that front end developers need to code BFS. They just need to know that graphs can be searched, that should be it.

Frontend devs can do many tasks that systems engineers (who can supposedly whip out algorithms in their sleep) cannot do:

- Make pages render properly in all popular browsers
- Make responsive UIs
- Align text of variable length in the vertical center of a page
- Know when to use tables and when not to
- Understand when to use an SPA and when not to
- Make SEO-friendly pages


All this has nothing to do with said algos. I really think that people who don't understand this should not be interviewing frontend engineers.

I'm having a hard time responding to this. More and more, whole applications are being delivered in clientside Javascript using Angular or React or whatever. Are you suggesting that there are two "kinds" of front-end developers, the kind that knows how the DOM APIs work and the kind that can implement the rest of the application and domain logic in Javascript?

I'd love for you to try to explain to me how it's relevant to my front end javascript to write a moving window algorithm in 45 minutes on a white board.

I have no problem with thinking engineers should be able to understand and implement algorithms. I have a real problem with thinking they need to know them off the top of their head.

Wouldn't a more reasonable approach be to give an algorithm and ask the person to implement it? If that's what they're actually going to do (since most of us aren't PhDs) day-to-day, and they may more likely find an appropriate algorithm on Wikipedia and implement it, isn't that more relevant?

We agree about the whiteboard thing. I'm only pushing back on the argument that these things aren't relevant to front-end.


Lets say your software as a service website has a real time analytics graph, so that customers can look at statistics of how much they are using the service in real time.

One potential question a customer might want to know is "How many API calls have I made over the last hour". This is a moving window average question, and it needs to be displayed/done on the front end.

My company that I work at has such a feature.

Algorithms questions come up all the time in javascript development.
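To make that concrete, the core of such a feature is only a few lines (a sketch - the class and method names here are mine, not our actual code):

```javascript
// Count events (e.g., API calls) in the last `windowMs` milliseconds by
// keeping a queue of timestamps and evicting stale ones on read.
class SlidingWindowCounter {
  constructor(windowMs) {
    this.windowMs = windowMs;
    this.timestamps = [];   // event times, oldest first
  }
  record(now) {
    this.timestamps.push(now);
  }
  count(now) {
    // Evict everything that has fallen out of the window.
    while (this.timestamps.length && this.timestamps[0] <= now - this.windowMs) {
      this.timestamps.shift();
    }
    return this.timestamps.length;
  }
}
```

A real dashboard would bucket by second or minute rather than store every timestamp, but the queue-plus-eviction shape is the same.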

Thank you for the example, but this is not a generalized problem, and is very specific to your domain, so it would be silly to receive such a question from companies that are not your company, which I have had happen.

If I were applying for your company, I would expect to have to implement an algorithm like this on the job, but I don't see why I should have to know it to get the job. As long as I was capable of implementing algorithms as described, wouldn't that be the most important factor?

And how would the company know you were capable of implementing algorithms as described? Perhaps by asking you during the interview how you might go about implementing it?


How would you demonstrate that you're capable of implementing that algorithm? Isn't the best way to have them implement that algorithm? (I will admit that this is best done when the candidate has their favorite text editor and internet, and I have had an interview in the past where this was the case, which went well)

I would suggest that you provide the algorithm in process or through formulae, and have the candidate implement it. That's what they'll be doing anyway.

What kind of job do you have where someone tells you the algorithm? I can't imagine my PM saying "Let's count recent events by storing it in a hashmap and using a linked list to track which elements have expired, can you code that up?"

Instead, the PM might say "We need a system which can count how many unique events of each type has occurred in the last 100k requests, with small impact to time, high accuracy, and up to 100mb of space"

But even that would be unusually specific. It would usually be: "the service breaks sometimes, can you figure it out?" And then the engineers figure out that it's because of unusual event distributions, and they figure out what needs to be done, how much space / time they can afford to do it, etc. For instance, do you trade off accuracy by having a periodic job flush old events out of the map? Do you quantize by time to save space at the cost of resolution?

That's what I do at my job, and that's the kind of question I see in interviews. If a candidate is expecting to have the algorithms dictated to them, then I would not consider them to be a software engineer.. coding up a program from a specification would have been a technician's job 30 years ago (now largely automated by compilers, synthesizers (for HDLs), and other tools). The software engineer should be coming up with solutions, which often involve using algorithms and data structures.
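One solution to a spec like the PM's above might look like this (a sketch with illustrative names; a queue of the raw events plus a hashmap of counts):

```javascript
// Counts per event type over the last `capacity` requests.
class RecentEventCounts {
  constructor(capacity) {
    this.capacity = capacity;
    this.events = [];        // the last <= capacity event types, in order
    this.counts = new Map(); // type -> count within the window
  }
  add(type) {
    this.events.push(type);
    this.counts.set(type, (this.counts.get(type) || 0) + 1);
    if (this.events.length > this.capacity) {
      // Window is full: evict the oldest event and decrement its count.
      const old = this.events.shift();
      const n = this.counts.get(old) - 1;
      if (n === 0) this.counts.delete(old);
      else this.counts.set(old, n);
    }
  }
  count(type) {
    return this.counts.get(type) || 0;
  }
}
```

The trade-offs mentioned above kick in as soon as `capacity` gets large: at 100k events you'd consider quantizing by time or sampling rather than storing every event, which is exactly the kind of judgment call the interview is probing for.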

And, again, I ask "are you under some kind of arbitrary time constraint when you have to create a novel problem solving approach?" Listen, most data theory problems have been solved, but for the ones that haven't, in work, we're asked to come up with a solution, but not in any weird arbitrarily short amount of time.

If your goal is to simply know how someone explores a problem, fine I guess. I don't agree with your approach but I can see your point. But if you expect a working implementation, that's unreasonable in my opinion and I believe you'll be rejecting many fine candidates using that method.

> arbitrary time constraint

Don't you have release deadlines, even if they're measured in weeks?

Depending on the complexity of the problem, I've been expected to produce code. That tends to be tricky and I don't always get it 100% - I've sometimes left my code unfinished, gotten just the general direction across, and still passed (received the offer).

> you'll be rejecting many fine candidates using that method.

I could see that being true. Without knowing what kind of company you work at I wouldn't know. Since I work in a big tech company who interviews in this fashion, all my coworkers who I'd consider fine candidates have passed this (partly arbitrary) bar.

It's not something I was good at, but I decided that I wanted to work at a big tech company, so I decided to play the interview game.

I am suggesting that frontend developers building with Angular/React do not need to implement graph search algorithms.

What algorithms have you seen in practice in web frontends? I have seen nothing even minor, not even binary search or bubble sort.

I really don't know what to say to this. What kinds of applications do you work on? If what you do is primarily CRUD front-end stuff, then sure, I suppose you can get by without really understanding how to write general-purpose code. But pretty much anything past simple CRUD is going to start implicating simple data structures pretty quickly.

I can cite examples from the front-end I'm working on right now, but that wouldn't sound like a fair example (it's an AVR debugger). But all sophisticated applications are complicated in their own ways, and most of them require developers who can actually, you know, program.

Even if all you're doing is CRUD, how much computer science is embedded in even a straightforward framework like React? "Lots", is I think the answer.

I'm building an app with React which uses DFS to determine connected elements on a 2D grid. You'll have to be more specific when you say "frontend developers", since the frontend is getting more and more business logic.
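That DFS is short enough to sketch (function name and grid shape are illustrative, not my actual app code - a grid of values with 4-directional connectivity):

```javascript
// Collect all grid cells connected to (r, c) that share its value,
// using an explicit-stack depth-first search.
function connectedCells(grid, r, c) {
  const target = grid[r][c];
  const seen = new Set();
  const cells = [];
  const stack = [[r, c]];
  while (stack.length) {
    const [y, x] = stack.pop();
    const key = `${y},${x}`;
    if (seen.has(key)) continue;
    if (y < 0 || y >= grid.length || x < 0 || x >= grid[0].length) continue;
    if (grid[y][x] !== target) continue;
    seen.add(key);
    cells.push([y, x]);
    // Visit the four neighbors.
    stack.push([y + 1, x], [y - 1, x], [y, x + 1], [y, x - 1]);
  }
  return cells;
}
```

Nothing exotic, but it's exactly the "can you work with a graph" skill being argued about in this thread, showing up in plain frontend code.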

> All this has nothing to do with said algos. I really think that people who don't understand this should not be interviewing frontend engineers.

React, Angular, Ember, etc, are all basically just giant tree diffing algorithms. The DOM is a tree. Understanding how to manipulate trees is important. Memorizing algorithms, less so, but it's absurd to suggest that front-end developers should get a pass on being able to code generalized solutions to problems.

More and more, "front end" is coming to mean "the whole application," and the difficulty is moving away from the gritty individual DOM manipulations and into writing structured algorithms to process large sets of application data and wrangle state, which you then throw into your deterministic renderer of choice.

It sounds like when you say "front end developer," you might really mean, "person who is comfortable with HTML and CSS." In that case, sure, that has nothing to do with algorithms, but that person is also not a software developer.

I recently bombed an interview with a graph theory question. I hadn't done any real from-scratch implementations of that stuff, but I do know enough to discuss it and pseudocode it.

They were disappointed, they wanted to see runnable code right there on the whiteboard.

I think you're right, you should know basic conceptual computer science. I felt like I did a good job of explaining that I understand enough to do a real implementation if I had to, with man pages and my IDE and so on. But nope, they wanted a piece of acrylic to be a compiler, and that's just wrong IMO.

I spent 2 years writing multicast routing algorithm implementations for an overlay multicast startup I cofounded. The moral equivalents of PIM sparse and OSPF. I wouldn't do it this way again, knowing what I know now, but we did LSA forwarding between nodes the same way a Cisco would flood LSAs over a set of unruly PPP DS1 links. The hard way, is what I'm trying to convey here.

Anyways, I interviewed a company my friend Nate was working for, on a lark (my company was sort of winding down).

First question I got: "explain Bellman-Ford routing". Bellman-Ford is one of the simplest distributed systems algorithms there is, and certainly the simplest foundation for a routing protocol (it's what the RIP protocol does). I had, prior to that interview, implemented RIP twice, in two different jobs.

I totally bombed the interview! My mind just went blank. I could explain how link-state routing with shortest path graph reductions worked, but not how a much simpler algorithm worked.

The interviewer made some clucking noises about how interesting it was to be interviewing someone who wasn't a Stanford CS student.

The next interviewer asked me to implement Towers of Hanoi non-recursively. I refused.

Obviously, I didn't get an offer.

> The next interviewer asked me to implement Towers of Hanoi non-recursively.

What? Obviously this isn't a great interview question, but this seems massively pointless. I don't think there is an obvious way of doing this that doesn't just turn the call stack into an explicit stack (which, again, is pointless). And the recursive code is the only code that could possibly be understood by someone whose whole world isn't dominated by understanding those ~12 lines.

EDIT: also, ya know. Who cares. I have not even heard of a software engineering legend where solving anything remotely similar to towers of hanoi was necessary.
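For the curious, here is one way it could go - and it really is just the recursion with the call stack made explicit, which is why the question proves nothing (sketch; move lists of [from, to] pegs):

```javascript
// Towers of Hanoi without recursion: simulate the recursive calls with
// an explicit stack of (disk count, from, to, via) frames.
function hanoi(n, from, to, via) {
  const moves = [];
  const stack = [[n, from, to, via]];
  while (stack.length) {
    const [k, a, b, c] = stack.pop();
    if (k === 1) {
      moves.push([a, b]);          // base case: move one disk
    } else {
      // Pushed in reverse so they run as:
      // move k-1 from a to c, move 1 from a to b, move k-1 from c to b.
      stack.push([k - 1, c, b, a]);
      stack.push([1, a, b, c]);
      stack.push([k - 1, a, c, b]);
    }
  }
  return moves;
}
```

Same ~12 lines of logic, just harder to read. Which rather makes the interviewee's point.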

To be honest I was a bit taken aback by the author not being able to implement a BFS. It's a fairly standard and really simple algorithm and it's probably one of the most common ones to actually implement because of a need in your day-to-day work.

What do you do if you have a nested data structure (like a tree) and want to print it level by level? You write a simple BFS on it.

It's not even a question where you need prior knowledge on the algorithm, it's just logical. He could've easily asked "Print the contents of this tree to screen" and it would've been the same.

I can see how some questions like implementing quick or merge sort can be annoying, most libraries have a sort() function that usually implement either (or similar), you don't often have to write them yourself, but a BFS does not fall into that category in my opinion.
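For reference, the whole algorithm is a queue and a loop (a sketch, assuming nodes shaped like { value, children }):

```javascript
// Breadth-first traversal of a tree: visit nodes level by level
// by working through a FIFO queue.
function bfsValues(root) {
  const out = [];
  const queue = [root];
  while (queue.length) {
    const node = queue.shift();   // dequeue the next node
    out.push(node.value);
    for (const child of node.children || []) {
      queue.push(child);          // enqueue its children
    }
  }
  return out;
}
```

"That's the one where you use a queue" plus a few minutes of typing gets you there, which is why it feels like a reasonable thing to reason through, if not to memorize.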

Hold on. I'm not taken aback by their inability to do BFS in an interview at a whiteboard from memory. That, I think, is a total bullshit question, which is why I think the right play there is to one-up them.

What I have a problem with is the assertion that a front-end dev shouldn't have to grok BFS. Not "be able to implement from memory", but "be able to quickly implement on demand given a few minutes research".

No. If you have a tree, you get an iterator on that tree and print each element. Because you're a pro, so of course your tree is based on a well-tested library component, one that provides a way to iterate it.
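In JS terms, that iterator is a few lines with a generator (sketch; the { value, children } node shape is an assumption):

```javascript
// Depth-first tree iterator as a generator: callers just loop over it
// without knowing how the traversal works.
function* treeIterator(node) {
  yield node.value;
  for (const child of node.children || []) {
    yield* treeIterator(child);   // delegate to the subtree
  }
}

// Usage: for (const v of treeIterator(root)) console.log(v);
```

Of course, whoever writes that library component still has to know the traversal - the abstraction just decides who pays for the knowledge.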

I would agree with you if the author had complained about depth-first search as opposed to breadth-first search. Any decent developer should be able to understand and reason through depth-first search but I wouldn't expect breadth-first search to show up doing front-end web development. In my own side-projects, which involve much more complicated algorithms than anything I've implemented in my day job, breadth-first search has only really come up in one context that wasn't explicitly AI related, and that was a grammar parser. In every other case that I needed to search something, it was depth-first search that I was using. If you're given the whole tree at the beginning, I think most developers would probably start with depth-first search, even if they didn't know the name of the algorithm they were using.

My degree should be proof that given some time to study I am capable of implementing it.

Technical interviews became 10x easier when I realized that most companies aren't necessarily looking for the right answer as much as they are trying to look into your mind.

As a self taught programmer, things like binary search trees and linked lists are foreign concepts (especially as a self taught frontend developer). When I'm asked to solve a problem in a way I've never encountered before, people are pretty open to explaining how the problem works.

I don't get frustrated if a problem seems arbitrary or obscure because that's typically not the point. The point is, if you're going to join my team, how do you approach a difficult problem; do you get upset? do you clam up? I don't want someone like that on my team. I'd say most people would prefer a teammate who is resourceful rather than one who only wants to solve problems they're comfortable solving.

>Technical interviews became 10x easier when I realized that most companies aren't necessarily looking for the right answer as much as they are trying to look into your mind.

This is a great, great point.

I'd like to add one more: admitting when you don't know the answer. I've been in a number of recent phone screens where we'd ask a technical question of a simple "good/bad" sort. My advice to those reading: if you don't know, just say so. "I'm really not sure." or "I knew at one point, but I'd have to go look it up again." Perfectly acceptable. No one's a walking encyclopedia, and saying you don't know shows humility. I'm much more comfortable with someone who says they don't know and will go look for the correct answer than someone who might stand in front of the customer and try to poker-face it.

To those reading this and taking notes for an interview, I'd say instances like these are a good opportunity to discuss. If you don't know, ask what the correct answer was. Ask why. See if you can build off the answer, sometimes you might get asked a question you couldn't recall the answer for, but once given, you can expand upon and demonstrate your knowledge in other ways.

> I'd like to add one more: Admitting when you don't know the answer

I'll +1 this, but with a slight caveat. Some interviewers don't get it, and will ding you for this. Those are the companies to avoid.

Since those are companies to avoid anyway, you are not losing if you are rejected because of an honest "I don't know" answer.

That's fine in theory, but the reaction you get during an interview - which is a highly stressful, completely artificial situation - tells you nothing useful about the reaction you'll get when someone is settled in a role, they know the culture, and the culture knows them.

OP sounds like someone who will rage quit and have tantrums. Those aren't employable qualities.

But is that an accurate impression in context? He obviously has a practical background, likes to code, and can solve problems.

How do you tell if he'll be a net benefit or a net loss to a team?

I don't think asking anyone to improvise a BFS tells you much about that.

This sounds a lot like confirmation bias. You enumerated what you do in these situations, and your expectations of candidates are closely aligned to your own behaviors. There doesn't seem to be much room for understanding why a candidate's behaviors don't align with yours, or much thought given to bridging that gap.

I think you've hit on a great strategy to coast through these interviews: "Gee whiz, that's an amazing question. I've never thought of that before. It's probably beyond my reach but let me try to stumble through it. Will you help me out if I get lost? You're so amazing, how could this company get by without your brilliance?"

You have to be very careful with that attitude.

When I interview candidates, I state up-front that I care less about the solutions than the steps taken to reach one. It's okay to ask for help or hints when confronted with a previously unfamiliar problem. But if I need to provide too much help, teach the use of basic tools (whether they are logical patterns or universal utilities), and even then getting to A solution requires excessive hand-holding, I will hold that against you.

Also, I will happily invert a question if needed. Showing one way to solve a given problem, I would expect a qualified candidate to be able to point out where it could be done differently.

The OP is describing the new grad tech hiring process. It looks like an exam because it's for hiring people straight out of school.

For experienced people, it's not what you know, it's who you know. You tell your connected friend that you're looking for a job, he tells you who's hiring and gives you a recommendation.

You still have an interview, but it's no longer adversarial, it's a formality, it's friendly, it's just a screen to make sure you're not faking it. One of the people on the other side of the table has seen your work before, so is on your side. So when you get those stupid questions, it's a joke that you all laugh at. You handwave at it, and it's enough.

There are obviously problems with the above process -- it leads to hiring friends rather than the "best" candidate, but I'm surprised that we've moved so far away from it that the obviously well-connected OP is having so much trouble.

That depends on the culture of where you're applying, and maybe what team is hiring you.

When I interviewed at Google after working for a small company for 13 years, it was my first "newfangled" tech interview. I didn't know anybody in the room. And it was more than a formality. I'm a fairly terrible interviewee, and although I got hired (finally, after a very long process), I was hired at least a level below where I should have been. I blame my fuzzy memories of algorithms I haven't had to implement in 20+ years. I finally got promoted to the correct level nearly 2 years later.

Contrast this with Netflix, where I found the opening via my network. I knew more than half the people who interviewed me, and the interview itself felt like it was basically a formality. Days later they made an offer that I accepted.

If one is experienced yet not well-connected, then it's back to the interrogation room. I have a small number of really good friends, who aren't in my geographic area. So it's being asked logic puzzles by twenty-somethings who think Bill Gates invented the computer. Or OO religious questions that only serve to prove that I don't want to work with these people or for their company. And yes I've been through the Google process, at least until I bailed because it was so ridiculous - I am no one's show pony. By comparison, recently I've done a bunch of product management and analysis interviews and they've actually been more fun than not - real conversations exchanging views.

This calls the entire meme of "Silicon Valley is a meritocracy" into question. This just further validates those who criticize tech for being a monoculture with lack of diversity.

Anyone who actually believes "Silicon Valley is a meritocracy" is exactly the reason why Silicon Valley is, in fact, not a meritocracy. :)

The gauntlet also applies if you've spent the better part of a decade in an industry that recently collapsed, and your connected friends are either unemployed or living in constant fear of the layoff notice. cough oilandgas cough

This hasn't been my experience. I've been referred a few times before and still thrown to the whiteboard to reverse a linked list.
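For what it's worth, the whiteboard staple mentioned here (reversing a singly linked list) is a short exercise. A minimal Python sketch, assuming a simple `Node` class with `value` and `next` fields:

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def reverse(head):
    """Iteratively reverse a singly linked list; returns the new head."""
    prev = None
    while head is not None:
        # Advance down the list while re-pointing each node backwards.
        head.next, prev, head = prev, head, head.next
    return prev
```

The tuple assignment keeps the three pointer updates in one step so no temporary variable is needed; the same idea works in any language with an explicit `tmp = head.next` first.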

I interviewed with Netflix for a front-end JavaScript position a few years ago. I had two technical phone screens. One with the engineering manager and another with a senior engineer on their team.

Those went smoothly so I was asked to fly out to Los Gatos, CA.

On-site I was supposed to talk first with the senior engineer I had already interviewed with over the phone. He was out sick, so they swapped interviewers at the last minute.

In walks a guy in a fedora with full tattoo sleeves. He glances at my resume and laughs saying he hasn't looked at it at all and really knows nothing about me.

He proceeded to ask me detailed questions about Node.js. I told him I had been very clear with them on the phone: I'd played around with Node, but I hadn't used it to any serious degree (at that point in my career).

He continued with the Node.js questions for a while, some of which I knew because they were the same as in the browser (console), and some of which I wasn't familiar with at the time (EventEmitter).

He then asked me some simple JavaScript, Backbone, CSS questions all of which were easy. He then asked me a C# question and I knew that too.

We shook hands and after he left, the engineering manager came back in and I was basically escorted out. He said I was good but that I wasn't what they were looking for.

I've had some absurd interview experiences, but that one took the cake.

I wonder if he didn't like you because you didn't look like a hipster....

I'm totally serious: I really wonder how many of these baffling interview experiences, where seemingly highly-qualified candidates get turned down, are really about non-technical and non-work-related factors that no one wants to actually admit. How much of it is due to personal biases of the interviewers, who may discriminate against people for various reasons? A lot of people want to surround themselves with people just like themselves, so if you have a company full of hipsters wearing fedoras and someone comes in for an interview who doesn't look like that, they could easily be passed over because they're "not a fit for our company culture".

I don't think I'm half as competent as the guy who wrote this article, but I've had a much easier time getting jobs in general. My background isn't even CS, it's EE, so I totally suck at all the algorithm questions. But OTOH I generally apply for embedded programming positions where knowledge of algorithms isn't that important anyway. I even interviewed at Google (one of their recruiters contacted me) and had pretty much the same experience as him; I won't waste my time with that company again. A bunch of recruiters have tried to get me to interview at Bloomberg LP, but I would never work in that crappy open-plan environment that they're infamous for. But I frequently wonder how much of my success is just from being tall and in-shape, not having any obvious personality quirks, and "fitting in" with the look and the company culture (I interview with more stodgy places, not places with hipsters with arm-sleeve tattoos), rather than due to my technical proficiency.

Netflix has a very strong focus on culture, so you are likely correct. It doesn't sound like he got to the part where they actually quiz you on the Netflix culture slide deck, so they probably cut the process short for this reason.

There is some merit to trying to maintain culture if they have some formula for what works there and want to preserve it. But an engineer could pretty easily think something along the lines of "he's not hipster enough" and then make a case that the candidate doesn't fit the culture, while not really caring about Netflix's corporate ideas about culture at all.

Those bloomberg recruiters must contact everyone. I'm like "yeah I play around with Rails on side projects". Then they want to interview me. I've never gone through with it because I'm pretty sure I wouldn't be successful in that position.

I hired Sahat as an intern three years ago while he was an undergrad. It was one of the best hiring decisions I have ever made. He was productive immediately and our (small) team felt the loss when he went back to school. This guy is good and gets stuff done, ask people who have worked with him.

I wish Sahat had reached out to more of his network before responding to random recruiters. Tech interviews as Sahat experienced them are broken, but tech hiring is slightly less broken, especially when you leverage your network.

> I wish Sahat had reached out to more of his network before responding to random recruiters.

Has his network moved forward in what they do? I've found with my own network that so many of them are in the same (or equivalent) spots they were when I left to advance my career. Not precisely positions of power which I can take advantage of for the next step forward in my career.

You do not need to be in position of power to refer someone.

Contacts only really help when they are managing (or are a key decision-maker on) the team you wish to apply to.

Otherwise, a reference will only add to the burden you have to overcome with the interviewers - they will consider not only your qualities, but the referrer's qualities as well. "Richard vouched for Jane? But Richard is writing testing software, and we're hiring for backend. We don't need a SDET or we would have hired Richard."

I've referred people senior to me and been told by the recruiter that it was significant.

Because I can't be expected to know their work as well as they do, I describe their peers' opinions (well-respected in the NOC) and the soft skills they have, like mentoring.

"Even though I was only an SDET, Dave took the time to help me fully understand the math involved in the problem domain and how to efficiently model it. This allowed for a 60% increase in my deliverables and put the entire team a week ahead of schedule."

I'm incredibly fortunate then, as that has never been my experience. Working in infrastructure engineering, I've been referred by devs for most of my jobs and in each instance the referral has been a positive. I've also referred a number of devs despite not being one myself, who have also benefited.

I don't understand this. You don't have to be a Director to nominate someone for the open Director position. At least, not at any company I've worked for... is that a "thing"?
