Interviews exist because employers need a way to quantify and qualify your ability. Typical computer science problems are a (flawed, but concrete) way of doing that. No reasonable interviewer expects you to recollect breadth-first search flawlessly, on demand, onto a whiteboard. But they do expect you to be able to reason from a problem statement to something approximating a solution. That's fundamentally what programmers do. You should have the chops to think in this way, and you should be eager to try. Emoting "what the fuck?!" and claiming you can't, or won't, because you're not a recent graduate is an excuse and a cop-out.
A few popular open-source projects don't necessarily speak to your talent as a programmer. A 960 day GitHub streak is trivia, not a signal of anything useful. (If anything, it marks you as a target for burnout!) A few Hackathon wins and a couple hundred GH stars are the artifacts of a successful hobbyist, not a proxy for professional ability, or a gateway to employment.
My recent hires are all just people I met at events. I'm told by people outside the company that they're all very happy workers.
We do not do technical interviews. We talk casually about past work and what people want to achieve. After hiring we usually find out they have a different skillset and then we find the right tasks to match experience vs. career path.
Knowing this works for my team I would never want to go back to full-time employment at a place with HR ever again. HR has become an insular group of "specialists" who do not deliver value to organizations, and often ruin them IMO.
You are my hero. If you ever end up having an HR department, you can put in mediocre software developers to do the menial tasks. Most mediocre software developers will outdo your average HR drone. For hiring and interviewing, use the very best people that you have.
I went on a lot of interviews (maybe close to 100) before starting my own business, and I'd say at least 9/10 of them were very positive.
By positive: I felt we both left the interview a bit happier than we went in. We both had a legitimate good time, and enjoyed it. It wasn't a grind, both people got to know each other.
I'd say I had a 10% offer rate. The most common answer to my "why didn't you select me?" follow-up was "we need someone who is an expert in XYZ language." Sometimes it was even that they needed someone who is an expert in ABC library (when I was already pretty much an expert in the language.)
The ones that did lead to job offers were for jobs I no longer wanted after the interview.
The other 1/10 was because they had me take an IQ test, or do some really super long project, or meet with literally all 20 people in the company, or they made me sit in the lobby for 2 hours past the meeting time.
I would think this is something that should have been made clear from the get-go in the initial technical phone interview.
Anyone who goes to an in-person interview should expect, at minimum, that the interviewers have done their homework.
For example, if they just want Stanford or MIT grads, fine. But to bring someone into an interview - which may involve more than a day of travel, lost vacation time etc - just to tell them they went to the wrong college afterwards, that's plain infuriating and shows complete lack of courtesy.
There's nothing really wrong with that as long as they advertise such and make it clear before even starting the interview process that you won't be hired unless you're a library ABC or language XYZ expert. The problem comes when they don't advertise that, bring you in anyway, and waste everybody's time.
Someone eventually called me to reschedule, and it took every last ounce of my willpower not to perform a verbal auxiliary anus installation over the phone.
To name names and shame the shameful, it was SAIC (before they split into SAIC and Leidos).
It wasn't that big a deal in itself, but they didn't have a good reason for it, which made it a deal breaker.
I've seen this happen directly - the interviewers were two Asian men and one White man, all of whom came to this job straight out of college. Looking over the engineering department, I saw, with one exception in forty, young White or Asian males. The one exception was a lone Asian female, who had a doctorate in CS.
Hiring "People Like Us" doesn't help a company.
IME, that's much more a consequence of the limited diversity of candidates than of a limited diversity of hires. At most places I've worked, even if they had hired all the women and blacks (or other non-Asian minorities) who applied, there would still be at most approximately 5% women and 0% blacks (the latter would probably be higher in the US).
I have personally been rejected after a phone interview, not because my technical skills were lacking, but because the interviewer didn't like something he heard in our conversation. I could quite literally hear him checking out of the conversation and going back to surfing the internet.
You can find full exams with answers for the CCNA and every Microsoft cert imaginable. I've worked for companies that incentivise certifications, and I watched coworkers use these 'tools' en masse to gain easy pay bumps and pad their resumes.
I've only ever had two interviewers ask me technical questions that actually tested my knowledge or capabilities. I look good enough on paper, so usually interviews are with a couple of managers, and if there's a culture fit I'm in. It doesn't matter if I don't know a damn thing, if my resume says MCSE on it.
Microsoft and others further incentivise these certs by giving bonuses and breaks to companies employing certified individuals.
If you think people being hired because they passed a test they cheated on is 'more objective than other fields', then maybe I just have a warped view of how objective other fields are.
I agree with you that certifications are crap but I haven't encountered a single good company that cares about them.
When my team would look at candidates, past accomplishments and projects counted for tons more than a cert. Still, a cert told us what someone was interested in and what they wanted to bone up on. They could expect questions relating to that cert field during the call. If they had trouble talking on the subject, then we knew that the cert was likely there to be a pad or just something to fulfill some billet requirement. In good cases we'd find that the cert was an indicator of something they were passionate about.
So I consider it a pointer of sorts. Never worthy of hiring someone on its own, but I'd recommend certs to people who are /ready/ to demonstrate knowledge and may not necessarily have the project experience otherwise. I'd just recommend that they be ready to speak to it in an interview.
Vastly fewer responses and recruiters on major sites, but I've got about another 6 months to go on a really interesting couple of projects at my current job, so I'm not too upset about it.
Sometimes it is almost required (see MS certs) to be competitive.
So, they only care because it's a business thing (not a qualification thing) but they do care.
They are very keen on hiring employees with existing certs and on getting current employees to actively pursue more certs.
There are specific incentives from Microsoft based on how many employees have certain levels of Microsoft certification. It comes with discounts and marketing materials/privileges (who doesn't want to say they are a Microsoft certified Gold support partner?).
As such, it is conveniently ignored that the guy with an MCSE isn't going to get fired, because the gold-level partnership lapses if they don't keep X number of MCSEs on staff.
The certificate shouldn't matter in the interview itself - it is just a way to pare down the choices of who you interview when you have many candidates. If you have knowledgeable people in the interview process (i.e. at least have a dev present for a dev interview, not just managers or HR people) then you should be able to catch those that don't really have the skills.
Heck, even people who genuinely have a certificate might be rubbish overall but revised well for the exam days - you'll hopefully weed those out in a good interview process too.
Have you had many IT interviews? I have; they are vastly non-technical interviews.
Unless you count "have you ever used #CommonSoftware? How about #SlightlyLessCommonSoftware?" as a technical interview.
I'm an IT administrator, not a programmer, so I'm speaking from my experience in that field; I am not trying to make any claims about how programming interviews work.
I've never had a whiteboard in an interview. In only two interviews, ever, have I been asked to walk through a diagnostic process for problems, or been asked any infrastructure or network design/architecture questions.
It's worth noting I work and live in Detroit, Michigan, so I'm not exactly interviewing with major technology companies regularly. But I have worked for MSPs which glossed over technical interviews in favor of culture fit as well. (MSPs are especially incentivised to have employees with certs; they get discounts and kickbacks.)
I guess maybe I insulted some CCNA or MCSE-ers.
The article starts with:
>This is a story about my interview experience in the tech industry.
I also work in the tech industry, I was sharing my experience of interviewing in the tech industry.
More specifically, I was responding to a comment that said:
> Hiring in CS / IT is massively more objective compared to most of other fields.
That is the illusion many people have convinced themselves of, but literally 0 of my coworkers could answer the questions I was asked, simply because they don't have similar responsibilities, despite us applying for literally word-for-word identical job ads.
The reality is, unless you do X regularly, you simply aren't going to be able to impress people with your answer to X.
It's just really hard to decipher one's ability when everyone puts technical experience on their resume because they took an online class or read a book on some thing once.
2. It likely gives you valuable experience answering whiteboard questions.
I don't think it's a perfect process, but it weeds out fakers/resume padders. Also, a lot of companies aren't looking for specialists but want broad skills to handle a variety of problems, because the problems 5 years from now could be very different for the company.
GitHub stars are worth much more IMO. It shows that other developers value your work. That says a lot.
In the end, we have to realize that most developers are just average. Why go through the ridiculous process of finding average developers who by luck (or some homework) happen to solve the problems you throw at them perfectly?
I'm always happy to get a kudos on GitHub. But those of us who don't play the game of promoting our own code on social media, or who have code we can't post on GitHub, would be at a significant disadvantage in a hiring process based on that metric.
Never underestimate the value of hard work, preparation, and a can-do attitude. It eclipses natural talent every time.
Such trivia is the opposite of useful hard work; preparation for it truly wastes time that could be better spent on other things; a can-do attitude would imply rejecting or ignoring requests to do such things.
As a result, the people who actually do take the time to complete it will not be completing it because of a good attitude or work ethic, yet you will mistakenly believe they are.
I think the concerns people have about the efficacy of this interview style are valid, but extending it to the point where you start to make claims about how people who can pass them aren't as good is ridiculous.
The types of candidates who spend the time necessary to memorize algorithm trivia for the sake of passing these exams are exactly like overfitted learning algorithms. What they happen to know is unlikely to generalize well. Of course you could get lucky and hire someone like that who can generalize, but that's rare. More often, since hiring is political, you pat yourself on the back for how "good" the candidate is (based on some trivia) and make excuses when their on-the-job performance isn't what you'd hoped, and find ways to deflect attention from that so that you, as an inefficient hirer, won't be called out on it.
Willingness to waste time overfitting yourself to algorithm trivia absolutely predicts worse later-on performance than candidates with demonstrated experience and pragmatism (e.g. I'm not wasting my time memorizing how to solve tricky things that rarely matter. I will look them up / derive them / figure them out when/if I need them).
If given the choice between hiring a math/programming olympiad winner vs. a MacGyver/Edison-like tinkerer who may not be able to explain how to convert between Thevenin and Norton circuit forms, but who took their family radio apart and put it back together, MacGyver/Edison wins every time (unless you're hiring for bullshit on-paper prestige, and of course many places are, while proclaiming loudly that they aren't).
But I grant this is reasoning just from the anecdata that I have. I can believe that winners perhaps represent a higher degree of skill, but then we're talking about an extremely small number of people.
Generally you're facing a tradeoff where you have to choose between a sort of rustic self-reliance skill set versus a bookworm skill set. People from either group can learn the other over time, but you can't predict how well by testing them solely on trivia from their current main group. My preference is to hire for self-reliance and teach the bookworm stuff later. I used to believe the opposite (e.g. hire someone good at math because they can always learn to be an effective programmer later), but my job experience changed my mind (e.g. actually it's pretty easy to teach people stochastic processes, machine learning, or cryptography, but it's incredibly hard to teach people how to be good at creative software design).
Source? or just trying to justify your own shortcomings?
 < https://en.wikipedia.org/wiki/Overfitting >
> Overfitting generally occurs when a model is excessively complex, such as having too many parameters relative to the number of observations. A model that has been overfit will generally have poor predictive performance, as it can exaggerate minor fluctuations in the data.
 < https://en.wikipedia.org/wiki/Generalization_error#Relation_... >
> The concepts of generalization error and overfitting are closely related. Overfitting occurs when the learned function f_S becomes sensitive to the noise in the sample. As a result, the function will perform well on the training set but not perform well on other data from the joint probability distribution of x and y. Thus, the more overfitting occurs, the larger the generalization error.
Shall I fetch a ruler?
> The types of candidates who spend the time necessary to memorize algorithm trivia for the sake of passing these exams are exactly like overfitted learning algorithms.
I should have asked for a source for that portion. It's the part that is ridiculous.
Again, my bad.
> Source? or just trying to justify your own shortcomings?
is clearly, unequivocally unprovoked and needlessly antagonistic. (What do my shortcomings have to do with my point? Either you engage with the claim or you don't, but using ad hominem insults is not a valid discussion tactic. Yet you seem to assume no responsibility for your completely unprovoked hostility, and continue on with sarcasm, even sarcasm about your own ridiculousness.)
In terms of data, obviously no study is perfect and we should not simply base everything on a single study, but the 2012 PISA results offer at least some evidence < http://www.oecd.org/pisa/keyfindings/pisa-2012-results-volum... >.
The particular sections of the actual study that cover the poor performance of those who focus on memorization are not available in the free sample (Chapter 2), but they were reported on, e.g. here: < http://hechingerreport.org/memorizers-are-the-lowest-achieve... >, including this:
> The U.S. has more memorizers than most other countries in the world. Perhaps not surprisingly as math teachers, driven by narrow state standards and tests, have valued those students over all others, communicating to many other students along the way – often girls – that they do not belong in math class.
> The fact that we have valued one type of learner and given others the idea they cannot do math is part of the reason for the widespread math failure and dislike in the U.S.
There is also the discussion from Google that test scores and brainteasers do not predict later-on job success < http://www.newyorker.com/tech/elements/why-brainteasers-dont... > (and many algorithm interviews are absolutely the same kind of brainteaser nonsense).
From the New Yorker article:
> The major problem with most attempts to predict a specific outcome, such as interviews, is decontextualization: the attempt takes place in a generalized environment, as opposed to the context in which a behavior or trait naturally occurs. Google’s brainteasers measure how good people are at quickly coming up with a clever, plausible-seeming solution to an abstract problem under pressure. But employees don’t experience this particular type of pressure on the job. What the interviewee faces, instead, is the objective of a stressful, artificial interview setting: to make an impression that speaks to her qualifications in a limited time, within the narrow parameters set by the interviewer. What’s more, the candidate is asked to handle an abstracted “gotcha” situation, where thinking quickly is often more important than thinking well. Instead of determining how someone will perform on relevant tasks, the interviewer measures how the candidate will handle a brainteaser during an interview, and not much more.
The other thing we have to fight against is self-selection. It's not necessarily in everyone's interest to publicize that their riddles and algorithm hazing process isn't working. Especially not for start-ups, which need investors to feel like they are crammed to the brim with stereotypical nerds or something. So if you go looking along the lines of "well, my company uses riddles and we have hired well", you're already done. You haven't hired well. Your company (if it's a startup) probably isn't even remotely proven yet, even if it's well-funded, and the jury is still out on whether the people you rejected really should have been.
Also, the question is: is the ability to write a BFS on a whiteboard a useful way to screen programming candidates for most positions? No, it is not.
Next to arrays and lists, trees are such a fundamental part of computer programming that I can't find an excuse for not being aware of some basic operations. I am not suggesting here that RB or AVL trees and all the tricky stuff about them should be your bedtime reading, but a certain baseline should be established.
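For concreteness, the sort of baseline being described might look like this: a plain binary search tree with insertion and in-order traversal. (This is only an illustrative sketch, deliberately unbalanced, since the comment sets RB/AVL trees aside.)

```python
class Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def insert(root, value):
    """Insert a value, returning the (possibly new) root."""
    if root is None:
        return Node(value)
    if value < root.value:
        root.left = insert(root.left, value)
    else:
        root.right = insert(root.right, value)
    return root

def in_order(root):
    """Yield the stored values in sorted order."""
    if root is not None:
        yield from in_order(root.left)
        yield root.value
        yield from in_order(root.right)

root = None
for v in [5, 2, 8, 1, 3]:
    root = insert(root, v)
print(list(in_order(root)))  # [1, 2, 3, 5, 8]
```

Nothing clever here, which is rather the point: this is the level of fluency being asked for, not exam-day balancing-rotation trivia.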
The interview scenario is just not similar to on-the-job coding. It has different social pressures, time pressures, access to help, access to privacy, etc. etc.
Plus, for someone like me (I studied machine learning and stats, and all data structure knowledge is self-taught post college, yet I have 6 years experience writing scientific code, performant database stuff, etc.) I am always hearing about data structures and algorithms that are supposedly "fundamental" but I never even heard of them before and have yet to need them in a job.
Basically, all of my experience, and all of the experience of hiring software developers in any of the companies I've been in just completely suggests what you're saying is actually not true. It's just a story we tell to allow us to keep our crab mentality hazing rituals.
Could I have dug up a library to do what I needed, sure I guess. Easier to take 20 minutes and just write the tool.
What? That would be a huge red flag in my book. Where are your unit tests? I'm not trusting your 20 minute off the cuff reproduction of classic algorithms in any business critical piece of the code, not ever.
This would get you booted from a lot of places, or at least given a stern talking to for doing something that seems slick, cool, and time-saving in the short term (yay, let's roll our own!) when really it's immature and time-wasting in the long run.
Never (!) homebrew that shit unless you have to (like, you're in an embedded environment or your use case requires some bleeding edge research algorithm).
It's like seeing someone write their own argument-parsing code. Holy shit, what a bad idea. Never (!) do that.
Correct arg parsing is actually significantly more complicated in comparison because of all the possibilities and edge cases.
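To make that concrete, here is a short Python sketch of how much of that edge-case handling the stdlib's argparse absorbs for free (the flag names here are made up for the example):

```python
# Illustration only: argparse handles the fiddly argument-parsing cases
# that hand-rolled parsers routinely get wrong.
import argparse

parser = argparse.ArgumentParser(prog="tool")
parser.add_argument("-v", "--verbose", action="store_true")
parser.add_argument("-o", "--output", default="out.txt")
parser.add_argument("files", nargs="*")

# Value attached directly to a short option:
print(parser.parse_args(["-v", "-olog.txt", "a.c"]))
# The --opt=value form:
print(parser.parse_args(["--output=log.txt", "a.c"]))
# "--" terminates option parsing, so dash-prefixed filenames survive:
print(parser.parse_args(["--", "-weird-filename"]))
```

Each of those is a case a quick homemade `for arg in sys.argv` loop typically misses, which is the "possibilities and edge cases" point above.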
> Never (!) homebrew that shit unless you have to (like, you're in an embedded environment or your use case requires some bleeding edge research algorithm).
I can't help but think this mentality is what led to the recent left-pad debacle.
> The pseudo-code of BFS is all of 20 lines, ...
If it's only 20 lines, that's great because it means it was easy for the other 500 library writers to write it, write tests, and observe and fix issues over time. So, whew, that's 20 lines I totally shouldn't waste my time on, plus unit tests and routine maintenance I don't have to commit to.
And also, given the wide acknowledgement that a good programmer should be writing at most a few hundred lines of code per day (otherwise it's probably mostly junk), 20 lines of code is not trivial.
Probably most programmers write a whole lot more than that per day (especially if counting copy/pasted code), but this is in part a sign of a bad programmer, or perhaps more so a sign of the anti-quality constraints placed on them by most employers.
If I were ever to want to traverse something in a breadth-first manner, then I would just write it. It would take longer to search for another library and read how it implements this search than it would to write it, test it, and get it reviewed, and we've still avoided adding another potentially crappy dependency to our project.
It's not a category error to compare this to leftpad. leftpad actually makes more sense IMO, as it's a common microfunction that may be used more than once. BFS is typically far more coupled with the underlying data and is more naturally written inline.
No, you've just added the crappy dependency == your off-the-cuff implementation (which is even more work, since you have to handle issues and unit testing for it too).
> What the hell sort of library would you crack out to do a breadth first search traversal?
In Python I would use networkx probably, unless there was a good reason I couldn't (such as working in an embedded environment where I couldn't install large libraries). It ought to require an exceptional circumstance to stoop to implementing it myself.
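For anyone unfamiliar, "use networkx" for this really is about three lines (this assumes networkx is installed, and the graph here is made up for illustration):

```python
# Sketch of deferring BFS to networkx rather than hand-rolling it.
import networkx as nx

G = nx.Graph()
G.add_edges_from([("a", "b"), ("a", "c"), ("b", "d"), ("c", "d")])

# bfs_edges yields edges in breadth-first order from the source node.
order = [v for u, v in nx.bfs_edges(G, "a")]
print(order)
```

The library also hands you the variants you'd otherwise grow ad hoc later (bfs_tree, shortest paths, directed graphs), which is part of the argument for not writing it inline.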
Tell me, if we switched from talking about breadth first search to, say, inverting a matrix with Gaussian elimination and pivoting, do you feel the same? Are you going to trust your implementation over something you can get from netlib?
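To make the comparison concrete: numpy.linalg.solve dispatches to LAPACK (the netlib lineage), whose pivoted Gaussian elimination has been battle-tested for decades. The contrived matrix below is the classic case where naive, unpivoted elimination loses essentially all precision to the tiny leading element; the library version pivots and gets it right.

```python
# Illustrative sketch: trust the LAPACK-backed solver over a homebrew one.
import numpy as np

A = np.array([[1e-20, 1.0],
              [1.0,   1.0]])
b = np.array([1.0, 2.0])

# np.linalg.solve uses partial pivoting internally, so the tiny
# A[0, 0] entry doesn't destroy the solution.
x = np.linalg.solve(A, b)
print(np.allclose(A @ x, b))  # True
```

A hand-rolled eliminator that forgets the pivoting step would pass most casual tests and then silently produce garbage on inputs like this, which is exactly the "unexpected corner cases" failure mode.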
In this case, someone who didn't have the time to visualise exactly how the BFS is operating shouldn't be writing the algorithm. Pretending like you don't need to know the details of how it traverses the tree, when you needed to choose the specific implementation for this algorithm to be performant, is a lie. It's not like Gaussian elimination, where the concept is far more abstract; this is a concrete idea of how you're moving through the graph and shouldn't be some black magic.
On top of this, who writes this code and doesn't test that it does what you expected it to do on sample data? If your BFS implementation was bust, then you'd miss nodes or revisit them, which becomes immediately apparent from testing the algorithm you're writing.
> It ought to require an exceptional circumstance to stoop to implementing it myself.
Maybe the problem here is that I see a BFS as a fundamentally simple algorithm. Saying writing it yourself must be an exceptional circumstance sounds about as crazy to me as saying no one should ever use for loops because they expose you to making mistakes with an index you may forget is zero-based, so please use anonymous iterators instead. When something takes 5 minutes to write, is heavily tested by default (as it will be a feature under test anyway, and possible flaws will be highly obvious), and is at the same level of abstraction as the code you'll be writing around it, then just write the damn thing.
EDIT - Try writing a BFS that operates on an acyclic graph and outputs the value of each node, that compiles, runs, and prints at least two node values, and that still has a bug. Try writing bugs into this; it's actually difficult to screw up.
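For what it's worth, the 5-minute version really is short. A sketch in Python over an assumed adjacency-dict representation:

```python
from collections import deque

def bfs(graph, start):
    """Breadth-first traversal of an adjacency-dict graph; returns visit order."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        node = queue.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:  # drop this check and revisits show up instantly
                visited.add(neighbor)
                queue.append(neighbor)
    return order

# An acyclic graph like the one in the challenge above:
graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(bfs(graph, "a"))  # ['a', 'b', 'c', 'd']
```

The two classic mistakes (forgetting the visited check, or using a stack instead of a queue) both change the printed order on this tiny input, which is the "highly obvious flaws" claim in concrete form.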
Yes, this kind of hubris is often the root of the problem when so-called 'simple' algorithms get homebrewed, then later on unexpected corner cases pop up, testing isn't adequate, you didn't implement it in an extensible way, your implementation is unacceptably inefficient, etc. etc.
> On top of this, who writes this code and doesn't test that it does what you expected it to do on sample data? If your BFS implementation was bust, then you'd miss nodes or revisit them, which becomes immediately apparent from testing the algorithm you're writing.
Yes, but the point is that testing isn't free. If you commit to writing your own implementation of something to interface with whatever larger business / domain-specific project created the need for it in the first place, then you have to dedicate not just the time to write it, but also the time to test it, and to test how it integrates, and to maintain the API for it and maintain its integration (possibly backward compatibility, etc. etc.)
If you adopt a standard to use a certain library, then up to the degree to which you trust the library, those problems are mostly offloaded to others. For things like classical algorithms, the libraries are extremely trustworthy, so many of the pitfalls of adding a dependency don't apply.
> In this case, someone who didn't have the time to visualise exactly how the bfs is operating shouldn't be writing the algorithm.
It's not about being too pressed for time. Even if you had all day, it's a poor use of time to create more future work for yourself by yoking yourself to all the extra responsibilities that come along with rolling your own. Yoking yourself to those responsibilities should require an exceptional reason for doing so.
> Saying writing it yourself must be an exceptional circumstance sounds about as crazy to me as saying no-one should ever use for loops, they expose you to making mistakes with an index you may forget is zero-based, please use anonymous iterators instead.
This is an extremely fallacious comparison. Basic control structures are built-in features of the language. You don't have to test or maintain the runtime execution of a for loop; that's the job of language designers. Making mistakes in the usage of something (like an off-by-one index) is a complete non sequitur to this entire discussion. I don't see how you would think that type of bug is related to anything I'm saying.
In general, you also assume a much higher fidelity of testing than what happens in the real world. In 99% of software jobs, you'll be extremely lucky if someone even documents their homemade BFS algorithm, let alone writing even the most superficial of tests. Expecting them to give you an adequate bank of automated tests is like believing in the Easter Bunny.
> Try writing bugs into this, it's actually difficult to screw up.
So we did that. It works well. I dunno, maybe this company is just full of incompetents. I mean, I doubt it, but it is possible.
I appreciate judgement without context as well. Really, it's awesome.
I don't think it's controversial or insulting to say so in the least. It's a solved problem in every major language and most fringe languages. Spending time re-inventing it is just obviously wrong.
It's not insulting or judgmental to say that it's wrong. It's just a fact. I do not understand being offended or angry about my statement of that fact. If we're talking about what is effective engineering and what is not, writing your own argument parsing tool for actual business usage (as opposed to doing this as a pedagogical side project or something), without evidence of some exceptional corner case, is ineffective engineering. It's practically the definition of ineffective engineering.
If you think that AT LEAST one part of the hiring "process" isn't broken, you're living in lala land.
Let's just pick one: recruiters. You're telling me that every single tech recruiter you've come across is excellent at what they do, is an amazing and decent human being, and has amazing communication skills?!
(Generally/most of the time) recruiters don't--and yes, they should--care about candidates. They're interested in pocketing their cut of the candidate's salary, so as soon as the company says they're not interested in the candidate, the recruiter is done. Most people don't even get call-backs informing them that they're out of the running, much less a reason as to why they weren't deemed qualified.
Recruiters--especially tech recruiters are NOTORIOUS for being bottom-dwelling scum. I actually know a few good recruiters, but they are--by far--in the minority.
The fact that you don't recognize this indicates that you've either had extraordinary/atypical results, or aren't very familiar with the "process". Dude.
Edit: spelling, puncturation
How do you know his skills aren't useful to the market? That seems like a terrible thing to say.
Maybe I am conceited, but I can't convince myself I am an inferior programmer because of it. I can make money for your company. I can work well with a diverse group of people. I take pains to conduct myself with integrity. Shouldn't all that be the central measure of how people are hired?
If having source code and demonstrated traction for a few open source projects to review don't speak to one's talent as a programmer, then what does?
A professional programmer, the one hired by a company to work alongside other professionals - as opposed to one working by themselves on small projects - is capable of presenting previous experiences and applying them to new problems. Unless you want to be put in the basement and handed small chunks of work to solve. It's about how you mentor juniors, resolve disagreements on how to proceed, and deal with setbacks and changing requirements.
Given the whole post is about how terrible the interviewing process is, the above quote is the most telling, and it backs up the GP's point - not being prepared to have a conversation about your previous experience when interviewing for a job is pretty bad.
In under 45 minutes? Are you really concerned with their ability to even naively solve an algorithmic problem in that time frame? I would think not. No, this is not only flawed, but it's so deeply flawed (and easily exploitable), that you're not even interviewing for skill set anymore, simply either 1) someone's luck for having been exposed to the problem before or 2) their ability to remember things they don't use day to day.
Congratulations, you're hiring lucky people or idiot savants.
She was offered the job on the spot.
In almost every interview I've had, I've been asked about CS stuff. Never anything related to my field.
I don't blame them. They ask about what they know and what they think is important, even if it's not relevant to the position.
The author seems to have a chip on his shoulder...
Gee, I am also very tired of people for whom stuff went well, who think it must be the same for others. Can't you even imagine there is a variety of situations out there? Can't you imagine it may happen to you one day? Do you think people who encounter these kinds of problems never had a shiny position earlier, and that their "social skills" were considered perfect before some conceited internet stranger decided they were "socially inept"?
And you dare to talk about "experience", seriously?
I did not call the author "socially inept"; that sentence was a hypothetical example of how an individual's incredible technical skills can be overshadowed by other problems.
I am sorry if I offended you. I thought my posts were critical but logical...
As someone who watched previous coworkers, who are older, go through the interview wringer, I can sympathize with the author on the pigeonhole one can get stuck in due to seemingly trivial reasons: I don't have a classic CS background, people think I'm past my prime, etc.
These aren't imaginary problems, and I struggle to believe that I would be all "go-getter" and "pull yourself up by your bootstraps" after continually being rejected, too.
Front-ends almost always involve working with data. It is increasingly common for front-ends to have filters, searching, sorting, etc., of that data as well.
Doing that efficiently is key to ensuring a fast, responsive UI.
Wrangling with divs and CSS may take a disproportionately large amount of your time but that doesn't mean it's the most important skill to look for, nor does it mean the CS questions are somehow irrelevant to the work.
There are some positions where you don't have to care if people can really problem-solve; as long as they can tweak CSS until it works right-ish, or know the particular arcane implementation details of installing some WordPress module, they can do the job. But there are a lot more positions where you want someone who can solve the problems that need solving, pick up new technologies as needed, and just generally be a flexible and productive contributor in a way that doesn't relate to the fiddly details of one particular technology.
If you have the "mental dexterity" to come up with your own solution matching Dijkstra's pathfinding algorithm in terms of complexity within an hour-long interview, you are a god among mortals.
And that's what they expect - a solution to a problem that is equal to the best minds of our field. I say this from experience: I was expected to do exactly that, and criticized when I came up with O(n^3) instead of O(n^2).
If everyone is a 10x, then isn't everyone just a 1x?
Wait, really? I've had to do more than that just for internships, the bar for perfection at many companies is just that high or higher.
I agree with the rest of what you're saying.
Heck, what if I picked 5 problems from a national-level math Olympiad? You've surely studied high school math, right? You should be able to solve them in the 5-hour interview I invite you to!
I thought that was a great question. It got me talking about something I cared about, knew very well, and showed my normal thought process and development process outside of an interview. It also gave the interviewer plenty of things to ask about, to "quiz" me on if they wanted, or to just make sure I could actually code.
If I'm ever in a position to interview someone one-on-one, that's most likely going to be my main question.
By opening the discussion on "my turf," he put me at ease in the conversation, but by driving the interview with very technical questions, he also got the answers he needed about my technical ability.
I've asked this question in interviews a number of times, but it's primarily to gauge communication skills rather than technical skills. The biggest problem with using it as a technical screen is that candidates are really bad at choosing a project to tell you about. I've had candidates choose as their problem building out some responsive dynamic front end, and their entire solution boils down to "I picked Angular and Bootstrap."
I didn't even want to go to the interview which landed me my new job. In my previous phone screen they were looking to hire a single person. The perfect fit. Probably not me - what the hell do I know about video players? And it's one of those interviews where they'll boot you to the curb if you do poorly in the first half. Whatever, I'll go anyway. And as luck would have it, I didn't get booted. I did damn well. And in my final phone screen with the CTO, I got asked how to find the Nth node from the end of a linked list.
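For what it's worth, that question has a well-known two-pointer answer; here's a quick Python sketch (the names are mine, not anything from the actual phone screen):

```python
class Node:
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

def nth_from_end(head, n):
    """Return the nth node from the end of a singly linked list (n=1 is the last)."""
    lead = head
    for _ in range(n):           # advance a lead pointer n steps
        if lead is None:
            return None          # list is shorter than n
        lead = lead.next
    trail = head
    while lead is not None:      # when lead falls off the end, trail is n from the end
        lead = lead.next
        trail = trail.next
    return trail
```

The trick is the constant gap between the two pointers, which gives a single pass with O(1) extra space.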
I really wasn't good at interviewing for a long time. And from the outset, I didn't know everything I needed to get the job I wanted. It was consistent studying and a buy-in to the bullshit that interviewing is that landed me a new job.
So while I lined up clients and started transitioning from the current job I applied for a couple positions (note that as an executive I did this by contacting other executives (when possible) at firms, not filling out forms on the website).
I had a few phone interviews, and these were fine.
But a couple times it would get kicked to HR and they would send links for personality tests - before I ever talked to anyone.
I guess if I was desperate for a job (really glad I didn't take one) I might have complied, but I decided I didn't want to work for a firm that approaches recruitment like that.
I do think it's important to have a fit on teams, and I have hired people with awesome skills that were terrible cultural fits more than once.
But I'm not going to fill out personality quizzes unless I am desperate. And I work really hard to try to make sure I'm never desperate. :)
What country was this in? In Ireland that qualifies as employment discrimination on the "civil status" ground.
Oh, single person. Ha, it is illegal to discriminate on the basis of civil status. They meant it as "one person".
If you apply to a Web company without much web dev experience (or knowledge) then yes, it'll count for less.
But that's true of anything. The bay area is probably the easiest place to get hired in tech right now as an engineer.
There is another side to this which leads me to ask why companies feel they have to be so defensive? My wife is a registered nurse on a cardiac critical care ward. If she doesn't know what she's doing actual people can actually die, something that is a rare outcome for even the worst software developer. Nevertheless, she has a BS, and work experience, and when she interviews they don't require her to stand up at a whiteboard and prove she's a nurse all over again. They respect her experience, and the questions are more about process, work habits, personality, etc. In the software development world we appear to have zero respect for experience, and I wonder why that is? Have employers been burned so often? Or is this more of a geek cred gauntlet thing?
There is no such regulation in the tech industry. One bad hire I made faked his technical capabilities and I regretted it a lot when I had to work with him. I hope I've now learned how to spot that type of candidate (without resorting to interview hazing).
Companies that have the kind of engineering culture required to produce safety-critical products already treat software engineers like any other engineering discipline and expect them to act with the same rigor as other engineers. That's actually why I cringe every time I hear about a Valley software company wanting to pivot or branch into products like that.
Way back, I was part of a 4-person team that did some phone interviews of contractors. We asked some pretty basic questions, based on their resumes, and after the call voted on him/her. One of my questions to see if they knew anything about C++ was just to ask if they could tell me what a class was. A bunch of candidates couldn't answer that question.
It really isn't that hard to ask some basic questions and talk about work experience, to get a handle on whether someone is actually a competent programmer or not. If you're looking for top-tier talent, this might not be sufficient, but most jobs do not need top-tier talent anyway.
I later discovered his resume was an identical copy of his coworker's resume (both were submitted several months apart for the same job).
BTW, why didn't you check references?
I don't doubt for a minute that there's some kind of alpha geek thing going on with a lot of interviews. But it's also a reflection of the fact we don't have a widely-accepted accreditation body that can vouch for people.
As the infamous CodingHorror post points out, an awful lot of programmers can't actually program. That means interviews have to check for more than culture fit. It also drives extremely high selectivity, and when you know you'll be rejecting 90% of your applicants it becomes that much harder to give everyone a smooth and welcoming interview.
Of course, none of this deals with the problem that (as Sahat found) many software interviews don't actually check for relevant skills. That's just stupid.
When getting licensed in a new state (i.e., moving from Montana to California), you just need to submit basic paperwork and a background check/get fingerprinted, and then you are licensed for the new state (no need to retake an exam). All these comments seem to indicate nurses have insanely high selective pressures to weed out people who suck. It's really, really not the case. The stories I hear about certain nurses who completely lack common sense are astounding.
When my sister gets a new position (even when she started travel nursing after only 1 year of experience), she has a phone interview with a company hired by the hospital to find travel nurses. They may or may not contact her previous employers/references and then she finds out within a day or two if she has the job. It typically takes her 1 week to get a new assignment, 3 weeks tops if she is being exceptionally picky and trying to work in a specific area/hospital. That's all there is to it.
She's gone through this process about 6 times in the past two years (I've even seen her do a phone interview while at a wine tasting! -- and she was offered the position) and she rarely stresses about these interviews as they are not technical.
Every time credentials come up with respect to this industry there are a whole bunch of people who complain that credentials are meaningless because people can just cheat or coast their way through. They go so far as to include CS degrees themselves in that category. What makes you think the education and credentials for nurses are so much better than for software engineers that nurses can be hired based on credentials and software engineers cannot?
Board certification is not a legal requirement in any jurisdiction I am aware of. The licensure tests and board certifications are separate, and the former is generally taken very close to when the doctor, lawyer, or nurse graduates from their program and completes their internship period.
> The second is that licensing, certification, and recertification are functions of state government, not voluntary industry guidelines.
Yes and no. State governments decide what they will recognize, but industry provides the training. Some of the shit that counts as "continuing education" for the medical profession is little better than what you get at DeVry for programmers.
That said, at least these professions have strong professional associations that birddog state licensing boards to keep the bullshit out. My original comment, in fact, was motivated by IEEE and ACM's attempts to do the same for software engineers and the way the industry seems to be laughing at their efforts.
License requirements are by state, but yes, there is a licensing process.
> Does she have to have a certain number of continuing education hours every year?
It depends on the state, but yes, some states require a minimum number of hours of education to renew a license.
Reciting the implementation of some obscure algorithm from memory is entirely irrelevant to the daily job of a software engineer. Being able to do that does not make you a good software engineer, and most good software engineers cannot do that.
They do these "programming trivia" style interviews because it makes them feel clever and superior to the interviewees, and because it's a lot less work for them than conducting an interview that actually probes what level of software engineering talent the candidate has.
On the one hand: the interview processes this post describes are hilariously broken. Stand up at a whiteboard and implement breadth-first search from memory! You know, like no programmer at their desk staring at their editor ever does. I think "that's the one where you use a queue, right?" is a fully valid and complete answer to that dumb question.
I also think you're within your rights to demand that your interviewer implement Kruskal's minimum cost spanning tree from memory at the same whiteboard before you have to do BFS. That's an extremely simple and important graph theory algorithm that nobody memorizes either.
These interviews are nerd status rituals. Really good candidates know this and game them. If you know anyone like this, ask them for stories. I've heard some great ones. But obviously, this isn't a good way to select software developers.
On the other hand...
The idea that "front-end developers" shouldn't need to be able to implement a BFS (at all) bothers me a lot. If you're a software developer, you should grok basic conceptual computer science. You should be able to work with a graph. If you're doing web work, you're working with graphs all day whether you grok that or not!
I've been doing front-end work for the past month, and I've had to do more low-level profiling and performance work here than in the previous 4 years of low-level systems work.
Frontend devs can do many tasks that systems engineers (who can supposedly whip out algorithms in their sleep) cannot do:
- Make pages render properly in all popular browsers
- Make responsive UIs
- Align text of variable length in the vertical center of a page
- Know when to use tables and when not to
- Understand when to use an SPA and when not to
- Make SEO friendly pages
All this has nothing to do with said algos. I really think that people who don't understand this should not be interviewing frontend engineers.
I have no problem with thinking engineers should be able to understand and implement algorithms. I have a real problem with thinking they need to know them off the top of their head.
Wouldn't a more reasonable approach be to give an algorithm and ask the person to implement it? If that's what they're actually going to do (since most of us aren't PhDs) day-to-day, and they may more likely find an appropriate algorithm on Wikipedia and implement it, isn't that more relevant?
Let's say your software-as-a-service website has a real-time analytics graph, so that customers can look at statistics of how much they are using the service in real time.
One potential question a customer might want to know is "How many API calls have I made over the last hour". This is a moving window average question, and it needs to be displayed/done on the front end.
My company that I work at has such a feature.
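A minimal sketch of that kind of moving-window counter, in Python; the class and method names are made up for illustration, not taken from any real product:

```python
from collections import deque

class WindowCounter:
    """Counts events in a sliding time window (e.g. API calls in the last hour)."""

    def __init__(self, window_seconds=3600):
        self.window = window_seconds
        self.times = deque()  # event timestamps, oldest first

    def record(self, now):
        self.times.append(now)

    def count(self, now):
        # Evict timestamps that have fallen out of the window, then count the rest.
        while self.times and self.times[0] <= now - self.window:
            self.times.popleft()
        return len(self.times)
```

This is the exact-but-memory-hungry version; for high traffic you'd bucket timestamps (say, per minute) to bound memory at the cost of resolution.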
If I were applying for your company, I would expect to have to implement an algorithm like this on the job, but I don't see why I should have to know it to get the job. As long as I was capable of implementing algorithms as described, wouldn't that be the most important factor?
Instead, the PM might say "We need a system which can count how many unique events of each type have occurred in the last 100k requests, with small impact to time, high accuracy, and up to 100mb of space"
But even that would be unusually specific. It would usually be: "the service breaks sometimes, can you figure it out?" And then the engineers figure out that it's because of unusual event distributions, and they figure out what needs to be done, how much space / time they can afford to do it, etc. For instance, do you trade off accuracy by having a periodic job flush old events out of the map? Do you quantize by time to save space at the cost of resolution?
That's what I do at my job, and that's the kind of question I see in interviews. If a candidate is expecting to have the algorithms dictated to them, then I would not consider them to be a software engineer. Coding up a program from a specification would have been a technician's job 30 years ago (now largely automated by compilers, synthesizers (for HDLs), and other tools). The software engineer should be coming up with solutions, which often involve using algorithms and data structures.
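One naive-but-exact way to attack the PM's version of the problem looks like this (the names and the capacity are illustrative assumptions, not anything from the comment); the follow-up engineering discussion is about when this exact approach costs too much and what approximation to trade for it:

```python
from collections import Counter, deque

class RecentEventCounts:
    """Exact per-type counts over the last `capacity` requests.

    Space is O(capacity + distinct types); sketches or time-quantized
    buckets would trade exactness for less space.
    """

    def __init__(self, capacity=100_000):
        self.capacity = capacity
        self.events = deque()
        self.counts = Counter()

    def record(self, event_type):
        self.events.append(event_type)
        self.counts[event_type] += 1
        if len(self.events) > self.capacity:   # slide the window
            old = self.events.popleft()
            self.counts[old] -= 1
            if self.counts[old] == 0:
                del self.counts[old]

    def count(self, event_type):
        return self.counts[event_type]
```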
If your goal is to simply know how someone explores a problem, fine I guess. I don't agree with your approach but I can see your point. But if you expect a working implementation, that's unreasonable in my opinion and I believe you'll be rejecting many fine candidates using that method.
Don't you have release deadlines, even if they're measured in weeks?
Depending on the complexity of the problem, I've been expected to produce code. That tends to be tricky and I don't always get it 100%, but I've also not finished my code, gotten just the general direction, and still passed (received the offer).
> you'll be rejecting many fine candidates using that method.
I could see that being true. Without knowing what kind of company you work at I wouldn't know. Since I work at a big tech company that interviews in this fashion, all the coworkers I'd consider fine candidates have passed this (partly arbitrary) bar.
It's not something I was good at, but I decided that I wanted to work at a big tech company, so I decided to play the interview game.
What algorithms have you seen in practice in web front-ends? I have seen nothing even minor, not even binary search or bubble sort.
I can cite examples from the front-end I'm working on right now, but that wouldn't sound like a fair example (it's an AVR debugger). But all sophisticated applications are complicated in their own ways, and most of them require developers who can actually, you know, program.
Even if all you're doing is CRUD, how much computer science is embedded in even a straightforward framework like React? "Lots," I think, is the answer.
React, Angular, Ember, etc, are all basically just giant tree diffing algorithms. The DOM is a tree. Understanding how to manipulate trees is important. Memorizing algorithms, less so, but it's absurd to suggest that front-end developers should get a pass on being able to code generalized solutions to problems.
More and more, "front end" is coming to mean "the whole application," and the difficulty is moving away from the gritty individual DOM manipulations and into writing structured algorithms to process large sets of application data and wrangle state, which you then throw into your deterministic renderer of choice.
It sounds like when you say "front end developer," you might really mean, "person who is comfortable with HTML and CSS." In that case, sure, that has nothing to do with algorithms, but that person is also not a software developer.
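To make the "giant tree diffing" point above concrete, here is a toy sketch of the idea in Python. Real frameworks also diff attributes, use keys, and batch patches, so treat this purely as illustration; the node shape (a `(tag, children)` tuple) and the patch format are my own assumptions:

```python
def diff_trees(old, new, path="root"):
    """Minimal virtual-DOM-style diff: walk two trees, emit patch operations."""
    if old is None:
        return [("create", path, new)]
    if new is None:
        return [("remove", path)]
    if old[0] != new[0]:                      # tags differ: replace subtree
        return [("replace", path, new)]
    patches = []
    old_kids, new_kids = old[1], new[1]
    for i in range(max(len(old_kids), len(new_kids))):
        o = old_kids[i] if i < len(old_kids) else None
        n = new_kids[i] if i < len(new_kids) else None
        patches.extend(diff_trees(o, n, f"{path}/{i}"))
    return patches
```

Recognizing that this is a recursive tree walk, and reasoning about its cost, is exactly the "basic conceptual computer science" being argued for.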
They were disappointed; they wanted to see runnable code right there on the whiteboard.
I think you're right, you should know basic conceptual computer science. I felt like I did a good job of explaining that I understand enough to do a real implementation if I had to, with man pages and my IDE and so on. But nope, they wanted a piece of acrylic to be a compiler, and that's just wrong IMO.
Anyway, I interviewed with a company my friend Nate was working for, on a lark (my company was sort of winding down).
First question I got: "explain Bellman-Ford routing". Bellman-Ford is one of the simplest distributed systems algorithms there is, and certainly the simplest foundation for a routing protocol (it's what the RIP protocol does). I had, prior to that interview, implemented RIP twice, in two different jobs.
I totally bombed the interview! My mind just went blank. I could explain how link-state routing with shortest path graph reductions worked, but not how a much simpler algorithm worked.
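For reference, the single-source version of Bellman-Ford really is short; a Python sketch, just for illustration (not what the interviewer asked for - distance-vector routing runs the same relaxation distributed across routers exchanging tables):

```python
def bellman_ford(edges, n, source):
    """Shortest-path distances from `source` over `edges` (u, v, weight) tuples,
    for a graph with nodes 0..n-1. Relax every edge n-1 times."""
    INF = float("inf")
    dist = [INF] * n
    dist[source] = 0
    for _ in range(n - 1):
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    return dist
```

Which is sort of the point of the story: the blank-mind failure mode has nothing to do with whether you've implemented the thing twice professionally.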
The interviewer made some clucking noises about how interesting it was to be interviewing someone who wasn't a Stanford CS student.
The next interview asked me to implement Towers of Hanoi non-recursively. I refused.
Obviously, I didn't get an offer.
What? Obviously this isn't a great interview question, but this seems massively pointless. I don't think there is an obvious way of doing this that doesn't just turn the call stack into an explicit stack (which, again, is pointless). And the recursive code is the only code that could possibly be understood by someone whose whole world isn't dominated by understanding those ~12 lines.
EDIT: also, ya know. Who cares. I have not even heard of a software engineering legend where solving anything remotely similar to towers of hanoi was necessary.
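Right - the one non-contrived way to do it is exactly what the parent describes: make the recursive call stack an explicit stack of pending subproblems. A Python sketch:

```python
def hanoi_moves(n, src="A", aux="B", dst="C"):
    """Towers of Hanoi without recursion: the implicit call stack of the
    recursive solution becomes an explicit stack of (n, src, aux, dst) frames."""
    moves = []
    stack = [(n, src, aux, dst)]
    while stack:
        n, src, aux, dst = stack.pop()
        if n == 1:
            moves.append((src, dst))
        else:
            # Push in reverse order of the recursive calls:
            # solve(n-1, src->aux), move the largest disk, solve(n-1, aux->dst)
            stack.append((n - 1, aux, src, dst))
            stack.append((1, src, aux, dst))
            stack.append((n - 1, src, dst, aux))
    return moves
```

Which demonstrates the pointlessness nicely: it is the recursive algorithm, just with worse ergonomics.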
What do you do if you have a nested data structure (like a tree) and want to print it level by level? You write a simple BFS on it.
It's not even a question where you need prior knowledge on the algorithm, it's just logical. He could've easily asked "Print the contents of this tree to screen" and it would've been the same.
I can see how some questions like implementing quick or merge sort can be annoying, most libraries have a sort() function that usually implement either (or similar), you don't often have to write them yourself, but a BFS does not fall into that category in my opinion.
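And a level-order print really is just a few lines once you reach for a queue; a sketch, assuming dict-based nodes with a `children` list (the node shape is my own assumption):

```python
from collections import deque

def print_level_order(root):
    """Breadth-first traversal of a tree: the whole trick is a FIFO queue."""
    if root is None:
        return []
    out = []
    queue = deque([root])
    while queue:
        node = queue.popleft()
        out.append(node["value"])
        queue.extend(node.get("children", []))
    return out
```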
What I have a problem with is the assertion that a front-end dev shouldn't have to grok BFS. Not "be able to implement from memory", but "be able to quickly implement on demand given a few minutes research".
As a self-taught programmer, things like binary search trees and linked lists are foreign concepts (especially as a self-taught front-end developer). When I'm asked to solve a problem in a way I've never encountered before, people are pretty open to explaining how the problem works.
I don't get frustrated if a problem seems arbitrary or obscure because that's typically not the point. The point is, if you're going to join my team, how do you approach a difficult problem; do you get upset? do you clam up? I don't want someone like that on my team. I'd say most people would prefer a teammate who is resourceful rather than one who only wants to solve problems they're comfortable solving.
This is a great, great point.
I'd like to add one more: admitting when you don't know the answer. I've been in a number of recent phone screens where we'd ask a technical question of a simple "good/bad" sort. My advice to those reading: if you don't know, just say so. "I'm really not sure" or "I knew at one point, but I'd have to go look it up again" are perfectly acceptable. No one's a walking encyclopedia, and saying you don't know shows humility. I'm much more comfortable with someone who says they don't know and will go look up the correct answer than someone who might stand in front of the customer and try to keep a poker face.
To those reading this and taking notes for an interview, I'd say instances like these are a good opportunity to discuss. If you don't know, ask what the correct answer was. Ask why. See if you can build off the answer, sometimes you might get asked a question you couldn't recall the answer for, but once given, you can expand upon and demonstrate your knowledge in other ways.
I'll +1 this, but with a slight caveat. Some interviewers don't get it, and will ding you for this. Those are the companies to avoid.
OP sounds like someone who will rage quit and have tantrums. Those aren't employable qualities.
But is that an accurate impression in context? He obviously has a practical background, likes to code, and can solve problems.
How do you tell if he'll be a net benefit or a net loss to a team?
I don't think asking anyone to improvise a BFS tells you much about that.
When I interview candidates, I state up-front that I care less about the solutions than the steps taken to reach one. It's okay to ask for help or hints when confronted with a previously unfamiliar problem. But if I need to provide too much help, teach the use of basic tools (whether they are logical patterns or universal utilities), and even then getting to A solution requires excessive hand-holding, I will hold that against you.
Also, I will happily invert a question if needed. Showing one way to solve a given problem, I would expect a qualified candidate to be able to point out where it could be done differently.
For experienced people, it's not what you know, it's who you know. You tell your connected friend that you're looking for a job, he tells you who's hiring and gives you a recommendation.
You still have an interview, but it's no longer adversarial, it's a formality, it's friendly, it's just a screen to make sure you're not faking it. One of the people on the other side of the table has seen your work before, so is on your side. So when you get those stupid questions, it's a joke that you all laugh at. You handwave at it, and it's enough.
There are obviously problems with the above process -- it leads to hiring friends rather than the "best" candidate, but I'm surprised that we've moved so far away from it that the obviously well-connected OP is having so much trouble.
When I interviewed at Google after working for a small company for 13 years, it was my first "newfangled" tech interview. I didn't know anybody in the room. And it was more than a formality. I'm a fairly terrible interviewee, and although I got hired (finally, after a very long process), I was hired at least a level below where I should have been. I blame my fuzzy memories of algorithms I haven't had to implement in 20+ years. I finally got promoted to the correct level nearly 2 years later.
Contrast this with Netflix, where I found the opening via my network. I knew more than half the people who interviewed me, and the interview itself felt like it was basically a formality. Days later they made an offer that I accepted.
Those went smoothly so I was asked to fly out to Los Gatos, CA.
On-site I was supposed to talk first with the senior engineer I had already interviewed with over the phone. He was out sick so they changed at the last minute.
In walks a guy in a fedora with full tattoo sleeves. He glances at my resume and laughs saying he hasn't looked at it at all and really knows nothing about me.
He proceeds to ask me detailed questions about Node.js. I told him that I was very clear with them on the phone, that I've played around with Node, but I hadn't used it to any serious degree (at that point in my career).
He continued with the Node.js questions for a while, some of which I knew because they were the same as the browser (console) some of which I wasn't familiar with at the time (EventEmitter).
We shook hands and after he left, the engineering manager came back in and I was basically escorted out. He said I was good but that I wasn't what they were looking for.
I've had some ridiculous interview experiences but that one took the cake.
I'm totally serious: I really wonder how many of these baffling interview experiences where seemingly highly-qualified candidates get turned down are really about non-technical and non-work-related factors, but no one wants to actually admit it. How much of it is due to personal biases by the interviewers, who may discriminate against people for various reasons? A lot of people want to surround themselves with people just like themselves, so if you have a company full of hipsters wearing fedoras and someone comes in for an interview and they don't look like that, they could easily be passed over because they're "not a fit for our company culture".
I don't think I'm half as competent as the guy who wrote this article, but I've had a much easier time getting jobs in general. My background isn't even CS, it's EE, so I totally suck at all the algorithm questions. But OTOH I generally apply for embedded programming positions where knowledge of algorithms isn't that important anyway. I even interviewed at Google (one of their recruiters contacted me) and had pretty much the same experience as him; I won't waste my time with that company again. A bunch of recruiters have tried to get me to interview at Bloomberg LP, but I would never work in that crappy open-plan environment that they're infamous for. But I frequently wonder how much of my success is just from being tall and in-shape, not having any obvious personality quirks, and "fitting in" with the look and the company culture (I interview with more stodgy places, not places with hipsters with arm-sleeve tattoos), rather than due to my technical proficiency.
There is some merit to trying to maintain culture if they have some formula for what works there and they want to maintain it but an engineer could pretty easily think something along the lines of "he's not hipster enough" and then make a case that they don't fit the culture but really not care about netflix's corporate ideas about culture.
I wish Sahat had reached out to more of his network before responding to random recruiters. Tech interviews as Sahat experienced them are broken, but tech hiring is slightly less broken especially when you leverage your network.
Has his network moved forward in what they do? I've found with my own network that so many of them are in the same (or equivalent) spots they were when I left to advance my career. Not precisely positions of power which I can take advantage of for the next step forward in my career.
Otherwise, a reference will only add to the burden you have to overcome with the interviewers - they will consider not only your qualities, but the referrer's qualities as well. "Richard vouched for Jane? But Richard is writing testing software, and we're hiring for backend. We don't need a SDET or we would have hired Richard."
Because I can't be expected to know their work as well as they do I describe their peers' opinions (well-respected in the NOC) and the soft-skills they have like mentoring, etc.
"Even though I was only an SDET, Dave took the time to help me fully understand the math involved in the problem domain and how to efficiently model it. This allowed for a 60% increase in my deliverables and put the entire team a week ahead of schedule."
> To be fair, I already knew about Google’s idiotic interview process that is optimized for hiring book-smart academic candidates who know their algorithms and data structures cold, so my expectations were rather low to begin with. I also did not get much sleep that day, so my problem solving skills weren’t at its peak.
The author is already annoyed, antagonistic, and planning for failure. He knew what he was up against, but didn't adequately prepare to succeed by studying. And didn't sleep the night before.
And none of the reasons for not succeeding are the author's fault, in his mind.
You can claim that the screening process is dumb, and in some ways you're right. But at the same time, this process does select to some extent at least against the people who make excuses instead of bringing their A-game when it counts.
If you want a job at a place like this, you can do it: commit yourself to a plan of action for success that involves real work and preparation. Wanting to change the industry practice of the top tier firms is interesting and all. But if you want to get in, work and preparation will reap more rewards than complaining.
If a candidate isn't willing to put in the work to prepare, it's not much of a cognitive leap for me to envision them telling me that implementing this or that feature is dumb, impractical and not needed. Many developers have this "start with no" defeatist attitude as their default position on most problems of moderate complexity, and it's a real problem. Given a chance to select for candidates that instead figure out a way to make it happen I'd rather take the winners.
"This person didn't study for the interview, so what did they expect to happen?" Oh, I don't know, maybe to be asked questions directly related to the position, amongst other things.
No, I don't find it strange that I want them to prepare for something they're not going to use normally, but agree it might be confusing.
For the actual positions I'm hiring for, nobody in the world outside our teams understands the systems. The candidates are going to need to work hard to learn what's going on when they start their new job. Some of it is going to be hard to do and not always a complete pleasure. Essentially I see it as a personality test that selects for the kind of people that figure out how to overcome technical adversity.
Willingness to do the work to be prepared and eagerness to dive into technical arcana despite it being not very easy-- these things are very relevant to the positions I hire for at Microsoft.
<mind blown GIF goes here..>
> Essentially I see it as a personality test that selects for the kind of people that figure out how to overcome technical adversity.
This seems at odds with the original "ask", as it were: study up on some basic CS algorithms, and now I know you can handle what we're going to throw at you here. Ignoring the obvious "well, we need to make sure they can learn basic things", how is this even a viable test if your environment is so advanced, so high up, that you expect new candidates to encounter actual adversity?
I think this pretty much sums up what is wrong with hiring.
That being said, though Google has contacted me a few times for an interview, I've never gone through with it, mostly because I'm not that interested. I hear from a good friend who worked there that Google is full of "algorithms walking around looking for a problem," and as much as I like programming and problem solving, I hate it when it's lost sight of serving a human need. I skew strongly toward the experimentalist side (https://www.cs.purdue.edu/homes/dec/essay.criticize.html) of this whole thing we do.
If Google wants to ask crazy algorithm questions, that's at least sort-of reasonable. There's at least a hypothetical chance your work will involve that knowledge.
But I've interviewed at companies whose entire tech stack is a simple CRUD web app, or a simple RESTful mobile app, whose so-called "big data" is less than a few gigabytes in total, and they still want to throw algorithms and brain teasers at me. These same companies then post some inane blog post about the "developer shortage" and how "we can't find talented people".
If these companies could actually find the candidate they're testing for -- this hypothetical hyper-intelligent developer with an IBM-Watson-like memory of every CS algorithm and its application in every language -- that person would get completely bored working there and quit in three months or so. These companies could never retain the type of person their own hiring process exclusively selects for.
Hiring doesn't need to get fixed, companies need to get honest about what they are and what they actually want. To restate this in 1-10 scale : Many companies delude themselves into believing they are a 9 or 10, and are trying exclusively to hire 10's (and hypothetical 11's that don't exist).
When in reality, most companies are in the 4-to-6 range and only need people in the 4-to-6 range, but they're rejecting 7- and 8-type candidates because those candidates aren't 10s.
This process itself isn't broken, so much as the businesses holding the power in the process are delusional about...everything (what they are, what they need, who could provide that, etc).
I would, however, disagree with your handwaving statement that 'they only need people in the 4 to 6 range'. Sometimes companies need some highly skilled types to try to rein in the chaos being created by the 4s and 6s.
And as they're all similar style questions, they should just read up on these. Hold off on interviews until you're confident, or just schedule the less-desirable firms first, so that you have enough time to read up.
Interviewees know that these questions are coming, they can control the schedule, and they're the only ones who know their own energy and state. Nobody tells them that it's up to them to do flow control. But it's a hot market and people want good programmers. Even if you don't do tree searches or inversions in your normal work day, you can show that you're good because you can understand this stuff. Most candidates can't. So please, take your time, skim the textbooks, take a few notes, maybe do a few flash cards.
So please, take your time, skim their resumes, and maybe do a bit of GitHub code review before inviting someone to sit down. Hell, do it while you're taking a shit in the office if you, as an employer, are overworked. There's a pun in there.
As for their code, I learn a little bit about their programming style -- and it's often unflattering and unhelpful, as these are usually personal projects where people write code below the standard they'd use at the office -- but it tells me little of what parts were hard, what parts involved real insight from the candidate, and what parts were pasted from stack overflow. Given a github URL, how much time do you think it takes to figure out where the interesting parts are? And what's the information payback of that time, given the little information you get about a person's actual capabilities from a bunch of text saved on a server?
I know that everyone's their own special snowflake, but I don't have more than 45 minutes to give them a fair shake. And it has to be fair - a question they should probably have enough context to answer, and whose answer can't be bullshitted. I've got 45 minutes to give as objective an interview as I can, balancing the Type 1/2 errors of missing someone good and hiring someone bad.
1 a week was an example of what I was doing when I went interviewing. I mentioned that as slow enough to read up as much as you need. Did you want more? Less?
[Edit: not speaking for my employer, whose name I don't want to mention. ]
I kind of get the impression you're bitter or projecting something, but I can't quite put my finger on it. Not trying to be snarky, just saying.
Job hunting is a big deal. It's real work, and literally everyone who's doing an interview has been interviewed, almost certainly in the same process. I studied up before my interviews, and I worked to manage my energy and capability when doing them. The article's premise that they should walk into an interview room and mystically get evaluated on traits that you can't directly measure, or compare objectively, is ridiculous. I read the article thinking this is the same type of guy that probably complains about having to see homeless people in SF. Completely entitled.
I take offense to the premise that I don't have empathy for the candidates - evaluating a person for a job is a deeply emotional affair. I have to walk a fine line between the needs of the candidate and employer. I can't hug you or take you out for drinks, as that could result in a lawsuit. The best I can do is find the interview question that lets you show off the best of your talents.
Asking for empathy without giving any in return isn't going to get you very far.
Your post was unwarrantedly personal and reeked of projection. Like I said - you're not relevant to my original argument if you're truly good at it and actually spend the time to care. Others don't. You're not who I'm talking about. Christ, you're taking this more personally than you should be....
Also, for what it's worth. Since you used your real name here, I looked you up on linkedin (sorry). The interview process there was the absolute worst and most impersonal series of interviews I've ever experienced in my life. I absolutely refuse to work for them after this: https://news.ycombinator.com/item?id=11451431
It was the fact that I'd taken the time & effort to meet with these people and to not even get so much as an email back... it's not right. People shouldn't be treated that way, job candidate or not. It's basic manners.
It's kind of like that date that you both absolutely loved over a long weekend, but the other person refuses to respond to. You can't really be mad, except think bitterly, "...really?"
It probably isn't the "right" way for programming interviews, but once you identify the rules of the game, it behooves you to play the game accordingly, no?
Fundamentally, I agree that it is silly to ask questions that don't have a high degree of relevance to the job in question. But I am conflicted.
Why haven't we (as an industry) done better with this? Clearly there must be a reason or friction in asking relevant questions in lieu of the current batch of algorithmic type interviews. What is that reason?
Perhaps OP should try avoiding recruiters and big sexy firms.
They report having had a positive interview experience but were disappointed at being passed over.
"How is this possible — I have received emails from people all over the world, who praise and give me thanks for the work I’ve done on my open-source projects and tutorials (#humblebrag)"
Perhaps there was simply another qualified candidate? OP could use a little actual humility.
I do understand OP's point that the interview process can feel disconnected from the actual work. However, writing software is only part of being a software engineer. Communication and working with others are also important. This is where attitude fits in. Being a professional isn't just about wearing the right clothes, etc.
Don't mean to beat up on the OP. Perhaps they would be a great employee and are feeling a little discouraged. Still, seems like all the praise for developers (e.g. being solicited by recruiters) has gone to their head.
Did he say that this should immediately qualify him for any job? No, his point was that although he has real-world projects used by hundreds of devs, this gives him barely any credence at all in interviews. Is it so "entitled" for him to expect that his work should speak on some of his behalf?
That seems like a reasonable interpretation, emphasized by when he got through all the programming questions without a problem, and still didn't get the job.
One thing I've found when I'm interviewing, I try to be enthusiastic and act like I really want to work at the company, even when it's a boring company. Attitude makes a huge difference.
Yes, I do. I don't act genuine because when I'm my genuine self, I glare at everyone in the room with the rage of a thousand suns.
I don't know if people can tell that I'm hiding it or not, but when I try to hide and act enthusiastic, I get the job, when I don't, I don't.
Somewhere along the line, though, I learned to stop caring about pretending to be excited and just accept that a majority of companies won't hire me for what they perceive to be a lack of enthusiasm. It's a pretty good filter for finding companies that I'd be happy working at, but I understand that it's not for everybody. I just hope for a much more genuine tech/otherwise community so that I don't have to pretend anymore. Pretending just feels like shit.
I'm not saying any of this under any financial comfort. I'm friggin' broke right now. Though, I'm much more content with myself, my quality of work has skyrocketed and I've learned to bulk cook a mean chicken thigh and lentil soup.
I gave up at that one. People told me, "find a job you enjoy" but I think it's a lie. No one pays you to enjoy yourself. I haven't found a job to make me happy. Maybe someone found that kind of job but I didn't.
So now I find a job, work for a couple years, then quit and take months off at a time. I use the money I earned to fund my travels, which I enjoy. And sometimes at work, I look back at something I built, and the feeling of having built something is a really great feeling (even if no one cares about that thing). Then I go back to glaring at my coworkers.
If you're ever in Chicago, hit me up and we can grab some drinks.
Surely your prior work is under some sort of NDA? It's all proprietary (unless it is open source).
I got an offer and accepted it.
i wouldn't want to break that anonymity. also, i already had the displeasure of having a candidate turn his laptop to me and show his great coding skills with a competitor's online app store (yes, a major brand app store source), and i had to get involved in all sorts of annoyance from HR and legal. maybe i'm just dumb for taking those things seriously... but i think that's for the better
> I do not consider myself to be an excellent programmer, maybe an average at best (who can’t even pass a single tech interview), and a GitHub commit streak has absolutely nothing to do with one’s competence, skill or ability to code...my job at Yahoo was my first and will likely be my last one. From this moment on, I would rather be unemployed and homeless rather than do another tech interview. Enough is enough. I’ve hit my limit with this broken hiring bullshit in tech. No. More. Interviews. Done.
This comes off more as a "Oooh, I have a great post idea that will pick up traction on Medium!" move than anything else, unfortunately.
> At some point during our conversation I casually mentioned Webpack 2 and tree-shaking, which was followed by a question on how would I implement a tree-shaking algorithm. How the hell would I know?
Attitude cuts both ways.
I sort-of agree that these specific tasks shouldn't be used as an indicator of programming skills, but a breadth-first search is really quite simple. Its mode of operation is in the name, and even if you haven't implemented one recently, you should be able to derive some valid pseudo-code given a bit of time. Especially if you do as much coding as this person claims, and thus are confronted with problems to solve on a regular basis.
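For what it's worth, here's roughly the kind of thing an interviewer is hoping a candidate can derive from the name alone: a minimal BFS over an adjacency-list graph (this is my own illustrative sketch, not anything from the article; the `bfs` function and the sample graph are made up for the example).

```javascript
// Breadth-first search over a graph given as an adjacency list.
// Returns the order in which nodes are visited, starting from `start`.
function bfs(graph, start) {
  const visited = new Set([start]); // nodes we've already enqueued
  const queue = [start];            // FIFO queue drives the "breadth-first" order
  const order = [];

  while (queue.length > 0) {
    const node = queue.shift();     // dequeue the oldest node
    order.push(node);
    for (const neighbor of graph[node] || []) {
      if (!visited.has(neighbor)) {
        visited.add(neighbor);
        queue.push(neighbor);       // neighbors get visited a level later
      }
    }
  }
  return order;
}

// Example: a small diamond-shaped graph.
const graph = { a: ["b", "c"], b: ["d"], c: ["d"], d: [] };
console.log(bfs(graph, "a")); // ["a", "b", "c", "d"]
```

The whole trick is that a queue (FIFO) gives you level-by-level traversal; swap it for a stack and you'd have depth-first search. That one-line insight is arguably what the whiteboard question is actually probing for.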
This is a normal mode of self-conscious thought for a huge number of people who are otherwise excellent at their jobs and perfectly functional (e.g. introverts).
Plus, after you've worked a job for a while and know your team, know what's expected of you, and can get in the zone with some privacy, then these thoughts no longer really affect you. So they only play a significant role during an interview specifically because the interview is not at all a realistic representation of what working will be like.
I find it incredibly disturbing that you see fit to infer anything from the author's so-called "inability" to program "under stress." When someone makes that kind of inference, I think it says a lot more about their dysfunction as a colleague than it says about the other person's programming skill. Even if this problem had been Fizz Buzz or swapping two numbers, it's so, so far from simulating what a real work social situation would be like that it renders it useless for assessing the candidate, no matter if they answer it perfectly or fail completely. It's just a useless scenario.
I spend maybe four hours per day coding in a moderately complex domain where I have zero need to know how to write a BFS. To ask someone to do it, then ding them on it, is just self-masturbatory behavior.
Something like that, right?
He proved that he's a decent programmer by releasing a bunch of semi-successful open source projects. He worked at Yahoo. He has 900+ days GitHub streak. That makes him passionate about building things more than a huge chunk of programmers.
Not being able to implement a random algorithm on a whiteboard while applying for a front-end position has absolutely nothing to do with the lack of passion. Of course he gave up, and so would I if I was in his shoes.
Tbh, that could be a negative, there were a lot of lousy programmers at Yahoo. Doug Crockford worked there, so there were a lot of really good programmers too, but you can't know just based on working at Yahoo.
Sorry, I forgot that I was talking to oompaloompas