[flagged] Rant HN: I hate hackerrank
239 points by throwaway_415 on Oct 8, 2016 | 150 comments
I'm currently going through interview stages with several companies for the first time in years, and the en vogue trend seems to be a timed HackerRank test. Without fail this involves an algorithms-heavy rip-off from LeetCode/Project Euler/Google Foobar/TopCoder, culminating in some trickily smart O(N) DP solution. I'm an experienced developer with 5 years of thorough product experience in the domain. I'd completely appreciate the importance if the role actually did involve trying to eke out efficiency in a distributed system with high throughput... but 9 times out of 10 it's boilerplate business logic.
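[For readers who haven't met the genre: the "trickily smart O(N) DP solution" the poster describes is typified by maximum-subarray, i.e. Kadane's algorithm. A minimal sketch, purely as an illustration and not any specific HackerRank problem:]

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Kadane's algorithm: largest sum over all non-empty contiguous subarrays.
// The DP "trick": the best subarray ending at index i either extends the
// best subarray ending at i-1 or starts fresh at i.
long long maxSubarraySum(const std::vector<int>& a) {
    long long best = a[0], endingHere = a[0];
    for (std::size_t i = 1; i < a.size(); ++i) {
        endingHere = std::max<long long>(a[i], endingHere + a[i]);
        best = std::max(best, endingHere);
    }
    return best;
}
```

One pass, O(N) time, O(1) space: exactly the shape of answer these timed tests reward.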

On the one hand I like that it normalises the application process against prejudice (I was lucky enough to go to a top engineering school), so it allows fair entry to all. On the other hand, it appears to be just utterly irrelevant IQ hazing.

Anyone else going through this pain? What are we to make of the recruiting scene going forward? Are we now at a point where an engineer should at all times be intimately familiar with competitive programming and codegolf techniques?




I hire engineers for a living, and find intellectually dishonest any interview practice I wouldn't enjoy myself. Paper coding, whiteboard coding, brain teasers, and algorithmic beatdowns are out. Representative take-home work samples and conversational problem solving are in.

My candidates are told throughout the process that what we're looking for is a demonstration of how technical collaboration might work if we were employed by the same company. This takes away most of the stress of interviewing, which I know via candidate surveys.

My advice would be to steer clear of employers who use a soulless cookie cutter process that makes people feel like a commodity. This is also how you'll be treated during daily interactions and in conversations about your career development. Don't be under any illusion that you'll be able to find yourself on the right side of such a situation.


> soulless cookie cutter process that makes people feel like a commodity

I agree most should steer clear of them, but the interview process is often an accurate representation of the level of the team:

* If the interview involves a set of questions copied off the internet, verbal or written, they don't have the skills on the team to have a dynamic discussion.

* If the interview doesn't have enough standard questions, then they're either inconsistent or rely heavily on certain people's intuition; the latter could go either way, but the former could mean complete disorganization.

* If they expect you to know the circumference of the Earth (equatorial 24874 mi/40030 km, meridional 24860 mi/40008 km) and the capital of Sumer in 2281 BC (Uruk), then they could be interested in someone who has a scientific mind and a great memory, and who likes trivia or is very into history.

* If they give you problems to solve, they want to see and hear how you think.

* If they give you homework, they probably just want you to provide solutions in code and figure things out on your own to some extent, and the amount of time they give you to do that is an indication of how they estimate tasks.


"* If they expect you to know the circumference of the Earth (equatorial 24874 mi/40030 km, meridional 24860 mi/40008 km) and the capital of Sumer in 2281 BC (Uruk), then they could be interested in someone who has a scientific mind and a great memory, and who likes trivia or is very into history."

===> ... or who has the skill of typing "google.com" into a browser


>"Representative take-home work samples and conversational problem solving are in."

Exactly. How many times have you reached out to your network and said "HEY! I am trying to solve problemX and I am stuck on Y -- anyone done this before?"

You can't expect everyone to know everything - ESPECIALLY in an interview, and even more so in a panel interview.

Measure their problem-solving skills, not their intimate knowledge of tech/lang X....

And then, on top of that, judge their fit for working well in the team!

Only ask that when they come back from a take-home exercise, they say exactly how they solved it:

"I had to call my buddy over at BigCorp and say, hey, don't tell me the answer - but lead me to where I might figure out how to solve this problem"

OR

"Hey Joe, I did this - but I am not sure how efficient it is - did you do something similar?"

Tell them to get a Slack channel of their peers to help them succeed.

I am tired of everyone trying to be the hero - all my contacts try to support one another; the interview process should be no different.


> I am tired of everyone trying to be the hero - all my contacts try to support one another, the interview process should be no different.

Amazing. Until you put it into writing, I never realized that I do this too.

At least once a day I'll get a random question from my peers in areas where I have more experience than them, which also helped me gain real world work experience years before I even had my first job.

And I do the same when I wander in areas where I haven't had so much experience, discussing approaches and common pitfalls, which makes so much sense.

Yet in interviews it's like you're going to be working alone forever and have to be the best in these exact technologies/languages/stack or you'll never be able to do your job.

I guess this is what you end up with after calling everyone a ninja-rockstar-guru.


Having a friendly interview process, to us, is a competitive advantage. As is not requiring multiple onsite interviews or too many phone calls, and even making the onsite interview not take all day (we try to wrap up by 1pm unless a candidate is shadowing an engineer for a day, which we typically do for more senior hires). Having a concise interview process often lets us get offers out the door while candidates are still navigating 'Round 3' and 'Coffee with a lead engineer' and other nonsense with the bigger companies.


I wanted to use and like hackerrank as a hiring tool but found the questions had nothing to do with what devs do every day - mainly solving business problems in elegant ways. I even contacted them and told them so but did not get much response.

I treat interviews more like going out for coffee and often do just that. I like to understand what people are passionate about both in tech and personally.


Do you find that you can determine a person's technical skills from such a discussion? I've found that I'm not very good at determining such things from just talking. Often I talk with people who sound like they know what they're doing, only to find out later that they don't.

As for coffee - do you ask your interviewees if they want that? I don't drink coffee and generally hate coffee shops because the only other options are sugar and fat-filled starch bombs like scones and cakes, and other drinks I don't enjoy like tea. I usually eat a decent meal before interviews so I don't end up unable to think due to hunger during the technical part.

That said, I also find puzzles and unrealistic tests to be annoying. One thing I've done for the technical side is to pull questions from codereview.stackexchange.com and ask the candidate to review them. (I generally print them out and ask candidates to do it on paper in front of me, so they aren't looking at the existing answers.) This gives me a better idea of their technical skills without the "gotcha" feeling of whiteboards. It also normalizes for different backgrounds (e.g. I'm used to Xcode, but this interview had me working in Eclipse and I couldn't find anything!).


We discover people's technical skills with a take-home question. We took a real business problem that we had already solved and ask candidates to solve it. They don't write any code, but just give an explanation of how they would go about solving it and why. There is a little bit about database normalization in it too. It's incredible how well this simple 'test' works in seeing how people think.

Often the interviews are over Skype, but we keep things pretty casual, and give the interviewee lots of opportunity to ask questions about the company and culture.

Cultural fit is very important to us. We live in Nelson, BC, small-town Canada - very far away from Silicon Valley. Our team is very friendly and cares for each other. We want to ensure any new hires contribute to that positive vibe.


To your latter point, any time we do technical exercises, we _always_ let the interviewee use their own IDE. Nowadays it's easy because most people have laptops, though we do offer to set one up if they do not have a laptop to bring.

Their choice of toolsets and setup can also be a really interesting discussion topic. More than once during an interview, someone has shown me something really cool that I adopted (new key binding, a new tool, a particular way someone had their terminals split-paned, etc).


I've done a couple of interviews over screen share, where I set up my own IDE, and liked that more than the typical Google Doc or CoderPad.


What if the candidate doesn't have a machine of their own (only a company-owned one)?


We have a laptop and make the offer that they can use it. It hasn't happened often, but I could easily see it being set up with a few different environments already. (Adding that to my todo list.)


"I wanted to use and like hackerrank as a hiring tool but found the questions had nothing to do with what devs do every day"

But then...

"I like to understand what people are passionate about both in tech and personally."

OK, so what do people's personal passions have to do with what devs do every day? Are you really going to hire someone based on whether they prefer playing music, or bicycling, or spending time with their kids?

Also, since we have a limited amount of time for the interview, I'd much prefer that you didn't waste time asking me personal questions that are irrelevant to the job requirements (or telling me personal things about yourself). I'd rather have time to ask you questions about the job and the company.


Hiring someone purely on technical skill is myopic. I'm not going to blabber on about "Cultural Fit", but in small-to-medium teams, hiring someone with vastly opposing viewpoints can be detrimental - or it could be greatly beneficial.

How could you expect to measure the social impact of a new member on your team when you look at them as a technical robot and not a human being?


> I even contacted them and told them so but did not get much response.

Probably because it's much easier to rank an algorithmic solution. The question of "does your code compute the correct answer in the allocated compute time" provides a nice clear answer. Solving business problems in elegant ways, on the other hand, takes a human to judge, and the results are not always clear.

The difference in the difficulty of grading the two is exactly why the former is so popular as an interview technique (it requires low effort to get a binary answer), and the latter is better at actually judging an applicant (it reflects the human behind the test).


> I wanted to use and like hackerrank as a hiring tool but found the questions had nothing to do with what devs do every day

Surely you can add your own questions.


i really like your "how well do we collaborate" assessment method and the underlying advice of treating people meaningfully. teamwork is so important to satisfaction, achievement, and even a feeling of responsibility.

that isn't necessarily at odds with assessing technical ability via tests, of course. in a perfect world, we'd be able to tease out both great technical ability and great interpersonal skills. it sounds to me like our original ranter disagreed with the scope and focus of the test--timed cleverness and ingenuity in an esoteric problem space. that kind of ingenuity is often helpful but not often required for many technical jobs.


Full stack engineer here. Are you currently hiring ?


Yes - e-mail address is in my profile. You can also have a look at our careers page. https://canary.is/careers/


I couldn't agree more, and I've shaped our hiring policy around the same ideas. I try to be very respectful of the time of both the interviewers and the interviewees, while still providing enough data to make a good decision. I'm sad to say that we learned the most from bad hires: people who were terrible communicators, people who became hostile after hired and their work was critiqued, and people who were expert at talking shop but couldn't produce functioning code on their own.

We typically do a short phone interview first to assess whether there is enough common ground to work with. We're looking for huge red flags at this point, such as an inability to talk through very fundamental programming concepts, difficulty communicating, or a generally disagreeable personality.

Next we do a take-home project, where we share a functioning, boilerplate web app, and ask the applicant to spend a couple of hours addressing some portion of "requested" functionality. The bar we're looking for here is actually not very high. We want to see that the applicant was able to figure out what was going on in an existing code base and that they were capable of making a new, original addition to it (even if very small).

The next stage of the interview is an in-person interview with 3 or 4 of us. We usually start off with a 30-minute paper test, where the applicant can pick 5 out of 10 questions to answer. Pseudocode answers are expected, not syntactic perfection. This is really another datapoint where we're trying to make sure the applicant truly is technically competent.

We then sit down and talk for about an hour. We ask questions about their resume, the project, and the paper test. A lot of this is making sure that they can talk about the things they chose to list on their resume. If they indicate expertise in TDD then they can expect questions about frameworks used and software patterns utilized to improve testability. If they indicate expertise in a particular database server, they can expect detailed questions in that area. This is also an opportunity for them to ask us questions about our company, culture, methodologies, etc.

The final part is a collaborative exercise in defining the architecture of a proposed system. I recognize whiteboarding a solution is tough in a situation as stressful as an interview, with people with whom you have not yet developed a working relationship. So we try our best to reassure the applicant, and to make it as collaborative as possible. Sometimes we'll debate different options amongst ourselves to see how the applicant participates and which direction they choose.

I usually close with a tour of our dev area, as well as a few areas of the company so they can see if it feels like a place they could call home.

Interviewing is hard on both sides.


Recruiting and hiring are garbage right now in the industry. Part of the problem is easy to recognize -- idiots designing interviews. This is where the puzzle crap comes from, as well as theoretical stuff that's far removed from the day to day of what a person will be doing.

The other problem is that the people who could be designing better interviews aren't stepping up. There are plenty of intelligent people who aren't 1) taking the time and effort to introspect and think about why they are effective as people and 2) taking those insights and translating them into an interview process that selects for important traits in simple, reproducible ways.

It's bizarre. A job could require years of experience with linux, programming, and networking, all of which could be tested with a multiple-choice style test to get a sense of where a candidate stands. Instead, we look at their resume, check off that they have our requirements buried somewhere in the forest of buzzwords, and then move onto whether someone can finger-paint their freshman year CS lectures onto a whiteboard. Then when we end up hiring a completely ineffective person who spent their entire time trying to game the interview system, we are surprised, even though we've been selecting for that kind of person all along.


Another thing missing from interviews is identifying the various talents engineers have, given the diversity of engineers out there: are they good at finding corner cases? Are they creative with product design? Are they good communicators with a non-technical audience?


> This is where the puzzle crap comes from, as well as theoretical stuff that's far removed from the day to day of what a person will be doing.

I find looking through past project work is one of the best indicators of ability. However, I have found this to be extremely difficult to get: NDA restrictions from previous employers are one major obstacle, or the candidate is transitioning from a different domain or career path.


Went through this a couple times lately. No pain.

And the battery of automated tests helpfully written for you saves a lot of time.

If the companies you interview with choose problems you don't find relevant, it may be their fault, not HackerRank's.

Or even not a fault: they have to screen out a number of applicants who are good at boasting but can't actually code. You obviously can. It can be a bit boring, but it gets you to the next stage with significantly fewer contenders.


Yes, the test itself is fine; however, these are the issues I've seen:

- Hidden tests/metrics that also get evaluated

- Companies with an exaggerated sense of self-importance and unrealistic expectation of results

This is a test: you don't have the feedback that you'd have at a company (code reviews, talking to a colleague, better understanding of the conditions, real data tests, etc.). Don't expect me to guess your conditions.

Also, I can't cover all points in the time you give me, I'm focusing on making the thing work, other stuff is secondary.


I feel the time limit is usually the most ridiculous part.

I enjoy solving hard problems as much as any other engineer, or I wouldn't be in this field. But if you want to judge me on code quality, expressiveness, architecture, and test coverage, in addition to performance and correctness, you should really give me enough time to review, refactor and test my code to cover all of those criteria. You know, time that any responsible engineer would spend on any piece of code they write before sending it off to production.

If the position does actually involve writing production-ready solutions to hard algorithm problems within ridiculous time limits like 1 hour, let me know beforehand so I have the chance to disqualify myself from it, and we can both avoid wasting any more time on each other.


Exactly. State the conditions and evaluate conformance to the stated conditions, not something else.

It may be a good idea to ask for clarifications before or even when you've started the assignment. (Applies to real work as much or even more.)


This. A thousand times this. I've been stuck with team members that were consummate BS artists. By the time it is apparent, sunk cost fallacy and a desire to save face sets in and the org has to live with them for a long time.


Then your managers are poor. I have only ever worked with one person who was totally unsuited to working in IT.


The thing I dislike about the automated tests (and this might have been an option specified by the employer?) is that the nature of the tests isn't revealed to you.

I remember doing one problem for one company, and the automated test suite just gave you "5 out of 6 tests were passed", without revealing what the 6th one was testing for.

I kept on having to think "maybe it's a null safety check here?", "or here?", what if the user passes -1 etc, but to no avail - the test just wouldn't pass.

The worst part was I gave up in the end because I had to move on to the next question, and they don't reveal your performance at the end or where you went wrong.


Using something like HackerRank as a screen to check that candidates can actually code seems like a good idea to me. But that means setting problems which depend on that, and not specialist knowledge.

For example, an old employer of mine used something like "write a function which, given two strings representing durations down to a hundredth of a second in the format HH:MM:SS.SS, prints the difference between them, in the same format". To do that, you need to do some basic parsing, some simple maths, and some formatting. You don't need clever algorithms.
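A sketch of the sort of answer that screen expects (function names are mine, not the employer's actual test):

```cpp
#include <cstdio>
#include <cstdlib>
#include <string>

// Parse "HH:MM:SS.SS" into hundredths of a second.
long parseHundredths(const std::string& s) {
    long h = 0, m = 0, sec = 0, cs = 0;
    std::sscanf(s.c_str(), "%ld:%ld:%ld.%ld", &h, &m, &sec, &cs);
    return ((h * 60 + m) * 60 + sec) * 100 + cs;
}

// Format hundredths of a second back to "HH:MM:SS.SS".
std::string formatHundredths(long t) {
    char buf[16];
    std::snprintf(buf, sizeof buf, "%02ld:%02ld:%02ld.%02ld",
                  t / 360000, t / 6000 % 60, t / 100 % 60, t % 100);
    return buf;
}

// Absolute difference between two durations, same format either way round.
std::string diffDurations(const std::string& a, const std::string& b) {
    return formatHundredths(std::labs(parseHundredths(a) - parseHundredths(b)));
}
```

Parsing, simple arithmetic, formatting - and no clever algorithms, which is exactly the point.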

On the other hand, a problem which requires a "trickily smart O(N) DP solution" requires specialist knowledge, and is not simply a test of whether someone can code. If a company actually needs people who are good at particular algorithms, then this may still be a useful test. But barely any do.

A more ambiguous case is a test I did recently which asked for a function that printed all primes smaller than N. As it happens, I know a couple of (simple!) algorithms for finding primes, so I just coded one. But I know people - including some brilliant colleagues - who haven't studied maths beyond secondary school, and so won't know those algorithms. They might work one out, given time, but that's time they don't have to spend coding.


Frankly, coming up with an inefficient but simple algorithm for finding primes from first principles should work. That is, accumulate primes found so far, check if the next number is divisible by any of them, or even by any number lower than the latest scanned, if you're really bad at number theory, but fine with logic.
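That first-principles approach might look like the following (a deliberately naive trial-division sketch, not an efficient sieve):

```cpp
#include <vector>

// Primes below n, by trial division against the primes found so far.
// Far slower than a sieve, but derivable from the definition of "prime"
// alone: a number is prime if no smaller prime divides it.
std::vector<int> primesBelow(int n) {
    std::vector<int> primes;
    for (int candidate = 2; candidate < n; ++candidate) {
        bool isPrime = true;
        for (int p : primes) {
            if (p * p > candidate) break;  // no divisor can exceed sqrt(candidate)
            if (candidate % p == 0) { isPrime = false; break; }
        }
        if (isPrime) primes.push_back(candidate);
    }
    return primes;
}
```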

Many Project Euler problems are about such special things that you likely didn't know before encountering the problem, but the definition suffices.


"HackerRank" is just the "chitty" culture taking over. Unfortunately, it's designed for those who went into the biz primarily out of financial and/or family/cultural pressure.


It could be worse. My personal recruiting peeve is what I call the "Dunning-Kruger interview", where they ask algorithm questions they don't understand and don't even realize that fact. It is unfortunately common, and made doubly worse when it is plainly my area of expertise and the reason I was recruited in the first place.

More than once, I've been recruited by executives at Famous Tech Company to run major new initiatives involving vast volumes of spatially organized data, since my expertise in that area is well known, but there is a technical diligence step where I am grilled on spatial algorithms by a shockingly ignorant (e.g. doesn't understand R-trees) Principal Engineer or similar who actually believes no one can know more about the space than they do. (If that was the case, they wouldn't be trying to recruit me.) If an interviewer wants to test my expertise, they better be able to have a substantive discussion on the subject matter and understand the limits of their own expertise.

I view algorithm gotcha games as disrespectful of an experienced software engineer's expertise generally. I like to turn it into a substantial discussion about the algorithm class generally; if the interviewer is incapable of having a substantial unscripted discussion about said algorithms, it is a red flag and they have no business asking those kinds of questions. These days, I just walk away from an opportunity when this kind of nonsense happens.


... maybe the point of putting you in with the shockingly ignorant Principal Engineer was to figure out if you could work with them?


Heh, possibly, but in practice it usually appears to stem from complete disinterest -- it is an assignment and they do minimal background investigation. I go along with it, but in one extreme case I had a PE-type flatly accuse me of not understanding the theoretical details of a particular algorithm, being unaware that I invented it (and I never mentioned that). When it gets to that point, it is a lost cause.

Some computer science domains are worse than others. Spatial is particularly bad because very few computer scientists realize the theoretical foundations of spatial data structures are completely different than the more ordinary ones they are familiar with -- their intuitions don't apply.


I'll post to this thread because this is very relevant for the hiring companies. Hunter and Schmidt did a meta-study of 85 years of research on hiring criteria. [1] There are three attributes you need to select for to identify high-performing employees in intellectual fields.

  - General mental ability (are they generally smart?)
    Use the WAIS, or if there are artifacts of GMA (complex work they've done themselves) available, use them as proxies.
    Using IQ is effectively illegal[2] in the US, so you'll have to find a test that acts as a good proxy.

  - Work sample test. NOT HAZING! As close as possible to the actual work they'd be doing. Try to make it an apples-to-apples comparison across candidates. Also, try to make accommodations for candidates not knowing your company shibboleths.

  - Integrity. The first two won't matter if you hire dishonest people or politicians.

     There are existing tests available for this, which you can purchase for < $50 per use.

This alone will get you a > 65% hit rate [1], and can be done inside of three hours. There's no need for day-long (or multi-day) gladiator-style gauntlets.

[1] http://mavweb.mnsu.edu/howard/Schmidt%20and%20Hunter%201998%...

[2] The effective illegality comes from IQ tests disadvantaging certain minority groups.


> Using IQ is effectively illegal[2] in the US, so you'll have to find a test that acts as a good proxy.

You've posted this before and been called out on it. Please stop spreading misinformation.

There is nothing special about IQ tests specifically. Any proxy test will have exactly the same legal ramifications. As long as you can show that the results of that test are relevant to job performance, it is fine. Whether it is labeled an "IQ test" is irrelevant.


Put plainly, the callout and yourself are wrong. To muddy the waters about this may put small companies at risk.

I've spoken with a lawyer about this. There is case law directly concerning IQ tests.

The bar for proving that the use of IQ in hiring is acceptable in a discrimination case is unattainable by most software companies. Hence, it is effectively illegal.

NB: I would desperately love to use IQ tests in the US myself. I want to be on your side and use IQ tests, but wouldn't risk my supper for it.


I didn't intend to imply that IQ tests are okay (although the test I suggested is in fact what you have to prove). I'm actually concerned by the suggestion that IQ proxy tests are any different. They are bound by exactly the same requirements.


I'm trying to find a source, but I recall reading the legality of IQ testing is debatable. The same law that outlaws IQ testing also outlawed using educational requirements for the same reason (minorities are underrepresented in higher education.) Yet almost every company requires education credentials, so I don't think it has much teeth. And lots of places use tests like what OP is complaining about, that are basically intelligence tests for all intents and purposes.


> Are we now at a point where an engineer should at all times be intimately familiar with competitive programming and codegolf techniques?

Yes, we are. Accepting that trend seems to be better than ignoring it.

There is one not-so-bad way to look at these programming challenges. In Silicon Valley, we live in a condition wherein the company could fail at any time. As a developer, most often, we have to learn something entirely new and ship the code for the business. Those challenges are enormous, with lots of unknowns, compared to these programming contest problems, where it is just preparation. The general idea is, if a developer is diligent enough to prepare for these programming contest problems and delivers when required, he is probably going to be helpful as well when the business has a dire need.


On the flip side, now you have no idea if a developer is any good, because all they have to do is memorize a bunch of garbage interview algo techniques / questions.


I think this mentality is actually wrong. I've definitely met good engineers who were rejected for jobs they were qualified for because of these questions, but I've never met someone who was extremely good at these questions who turned out to be a dud after hiring.

I don't disagree with the hate toward the current interview process at larger companies, preferring take-homes myself, but I think it's harmful to say that all they have to do is "memorize a bunch of ... techniques" - they're going to have to memorize a whole lot to get through the interview process. In the process of memorizing (learning) those techniques, you're likely to learn a lot about the foundations of mathematical problem solving.

I guess I'm biased because I studied math in college, but I think we can both criticize the current interview process without taking away from the hard work of people who are actually passionate about algorithms.


I have met many people in my life who are/were way better than me at chalking out algorithmic solutions but didn't have a single bit of knowledge of how computers and real software work. Many of these people are not good team players, don't understand the importance of delivering things, or simply aren't interested in the work they are doing.

On the other side, I have also met many people who are extremely good at competitive programming and extremely good at their workplaces too.

So taking these two as suppositions, I have concluded that competitive programming skill is not something that can be relied upon as a judging factor for a good candidate.

Note: I used to be a good competitive programmer in college, so I have seen both sides of it.


I agree - I don't think competitive programming should be used as an interview tactic. That being said, if you look at the rating distribution of TopCoder/Codeforces problems, most interview problems (even at Google) tend to be around the 1500 Elo level. Around the 1900 Elo level and above, I think it's an extremely strong signal for a very talented thinker.

I'd much prefer a discussion about a previous project (open source or at work) with pointed questions about the code/design choices, a take-home, etc.


> I've never met someone who was extremely good at these questions who turned out to be a dud after hiring.

I have. More than once I've brought on folks who aced algorithm questions but, it turned out, couldn't handle complex business logic. They could basically only understand code that fit on a page / screenful. Anything beyond that and they floundered.


Dunno about hackerrank, but if you can consistently do well on topcoder algo or codeforces competitions, that's beyond having memorized stuff -- that's being genuinely good at problem solving, and having a broad base of knowledge about important algorithms. I'd be happy to work with you.


Ditto on this. Someone who is passionate about algorithms beyond just memorizing interview problems (i.e. they find problems like this - http://codeforces.com/contest/724/problem/G - interesting) usually has high problem-solving and mathematical aptitude.

Perhaps I'd like to draw a distinction between genuine algorithms problems that you haven't seen before and the really bad interview questions like "find a cycle in a linked list". You'd be really hard pressed to find anyone who could solve the problem above.

Edit: To clarify, I would not remotely consider the linked Codeforces problem as a good interview problem.


What is your preferred interviewing method?


Part of the interview for my last job involved interviewing me about debugging, which I thought was neat. "What's wrong with this code?" All the samples were relatively simple - less than 30 lines each?

- Double delete, caused by not following the rule of 3 in C++, which would likely crash

- Multithreading code missing volatile or memory barriers (which segued into a discussion of its disassembly)

- Non-virtual delete through a base pointer

Extremely realistic - it's all stuff I've seen and fixed in the wild (this was for a non-entry gamedev job involving plenty of C++)


Note that unless you are coding in Java, volatile is not a memory barrier, and both the CPU and the compiler may reorder things around it.


Yes - perhaps I should have written that as "volatile AND memory barriers". Either way, worth mentioning, as it's a common misunderstanding.

Further complicating matters is the fact that Microsoft's compilers will treat volatile as a memory barrier - for x86 code. But not for ARM! https://msdn.microsoft.com/en-us/library/jj204392.aspx
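To make the distinction concrete, here's a minimal C11 sketch (assuming &lt;stdatomic.h&gt;; the names are mine) of the publish/consume pattern where release/acquire ordering does what volatile alone cannot:

```c
#include <stdatomic.h>
#include <stdbool.h>

static int payload;          /* plain, non-atomic data         */
static atomic_bool ready;    /* publication flag, starts false */

/* Writer: the release store orders the payload write before the flag. */
void publish(int value) {
    payload = value;
    atomic_store_explicit(&ready, true, memory_order_release);
}

/* Reader: the acquire load guarantees the payload write is visible. */
bool try_consume(int *out) {
    if (atomic_load_explicit(&ready, memory_order_acquire)) {
        *out = payload;
        return true;
    }
    return false;
}
```

With `volatile` in place of the atomic flag, neither the compiler nor a weakly ordered CPU (e.g. ARM) is obliged to keep the payload write ahead of the flag write.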


The same thing might apply to the JVM. On a concrete x86 machine, volatile might act as a memory barrier, but this behavior is not guaranteed by the definition of the Java 'abstract machine'.


For people with 5+ years of experience (like OP), I look at and talk about their portfolio; they actually built stuff that runs in production, so let's see how that came to be. And show me parts you have written and are proud of. We are probably in that interview because your skill set matches (somewhat) the problems we are trying to solve, hence your past projects will have similarities with our projects. That interests me. I can find out if you are a self-starter, whether you have any business sense, management/tutor potential, etc. That you can solve some complex thing in 5 minutes is of no interest to me at all, ever.


> The general idea is, if a developer is diligent enough to prepare for these programming contest problems and delivers when required, he is probably going to be helpful as well when the business has a dire need.

How? It's not like the applicant can study for the "dire need" like they can algorithmic interviews. I mean, sure, such an employee can attack the problem with fervor and hours of time, but that doesn't mean that they'll come up with the right solution to the problem.

The ability to choose the "right" algorithm has no correlation with the ability to solve real problems: The ability to associate pathfinding problems with the proper variant of Dijkstra's algorithm won't help when troubleshooting why customers are getting intermittent 403 errors.


In Silicon Valley, we live in a condition wherein the company could fail at any time. As developers, we most often have to learn something entirely new and ship the code the business needs. Those challenges are quite enormous, with lots of unknowns.

I don't think any of these are true. Companies do not fail 'at any time' because of their programmers, nor are they saved by them in the nick of time. If there's something you should be prepared for, it's writing code that's less prone to catastrophic failure, rather than coding your way out of some disaster. It's not ER medicine.


I accept the trend. If you ask me to invert a linked list, I will also mentally note a big fat minus for company culture. I assume others do as well. So bad interview practice will lead to worse hires in the long term.


This makes no sense at all. The skills needed to diligently revise algorithms from a book are completely different to the skills needed to keep a startup afloat.


All the textbook questions in the world won't help you when you're the person keeping the company's billing system running correctly.


Why not just require candidates to be able to juggle 7 balls? About as relevant, and just as good for showing diligence.


If the job is in a circus, certainly!


> I'm an experienced developer with 5 years of thorough product experience in the domain. I'd completely appreciate the importance if the role actually did involve trying to eek out efficiency in a distributed system with high throughput... but 9 times out of 10 it's boilerplate business logic.

This is what I don't understand about software engineer hiring. They are ignoring important criteria about fit for the actual job, and focusing on criteria that are either orthogonal or completely irrelevant.

It's worthwhile to ask in these situations if the exercise is representative of work you'd be doing. It almost never is, but they tend to excuse it as...

> it normalises the application process against prejudice

But does it? By ignoring the actual criteria relevant to the job, they're dumping qualified people who don't do well with the hazing, and delaying evaluation of everyone else until after they're hired. That evaluation tends to be a lot less objective, because "hiring is expensive"; it selects for people who do well with the hazing, and reproduces the problem for the next round.


> This is what I don't understand about software engineer hiring. They are ignoring important criteria about fit for the actual job, and focusing on criteria that are either orthogonal or completely irrelevant.

Oftentimes, the people actually designing the hiring process have an imperfect understanding of what work software engineers actually do. They ask the software engineers for input, but optimizing the hiring process is orthogonal to a software engineer's core work responsibilities, so they aren't necessarily going to give ideal advice.


For their own expedience. It's an employer's market. It costs 2.5k/yr to keep an SO posting up and get thousands of resumes in your inbox.

I'd also say a fair share of those in Who's Hiring on HN aren't in dire need of filling seats, but are just trying to see how far people will fling themselves through the mazes. Blissfully unaware they're not the only startup, and have no way of offering the stability of a large company.

At one of the places I've seen, we didn't read every resume. We overlooked mountains of talent and shot ourselves in the foot.

Instead of hiring coders that had their heart in the right place, we hired streetwise careerists that put their own interests before the team. But they could do palindromes, fizzbuzz, and whiteboard data structures and algorithms.

But when we wanted them to do something generalist or in another language, they'd refuse. One even went so far as to say that if they had to program X in Y editor, they'd just leave the job. What use is passing all these tests if you're totally inflexible?

We also snubbed people who enthusiastically espoused startup gumption and the idea of building, but who didn't cope well with the whiteboarding we threw at them. Those whose hearts were in teamwork and open source, we ignorantly overlooked, while continually putting up walls to see who would finally get past all of them.

There is some toxic cultural thing in startups: insularity and smugness. If I could go back, I'd say screw it with the whiteboard games; come freelance with us for a week. That way I can gauge your temperament, how you work with teammates, your technical skills, etc., in a realistic setting.

And if someone asks for a code sample, and you already have projects on GitHub or in your portfolio, don't be afraid to redirect them there instead. If they don't look, assume the employer is not serious about filling the spot, but just putting in the least effort themselves to see how many hoops people will jump through.

It's not you, OP (or other programmers). If an employer doesn't bother to give you a phone call to talk to you as a human being, maybe they're not so eager to have the position filled.

Don't let it affect your self-worth. Always be coding. Don't be afraid to stick your head out there at a meetup and shake some hands; you'll be surprised how much more decency you're shown when you present yourself as a human being, not another resume in a stack of thousands.


As someone who has been on both ends, it definitely is not an employer's market for engineers. The ones who meet a high quality bar usually can get an offer almost anywhere they want, and can easily choose to dismiss the employer wooing them if they don't meet their bar.


"Come freelance with us for a week" seems even worse to me.

For starters, there is a very small and fairly specific subset of the population that can afford to carve out a whole week for an extended interview, even if you do pay them for their time.


Yes, this omits the people who already have a job and wouldn't want to quit just to freelance for a week.

But I doubt this would leave out genuinely interested people who aren't employed. Assuming a preliminary acid test on the employer's side for skills needed and on the candidate's side for interest, this seems reasonable.

You may want to have a Plan B for the employed.


The interview + take home test works as the best initial filter. You know their history and can see their work and code.


As an aside, I also dislike hackerrank; for me, it's the sheer ham-fisted ineptitude of some of the alleged learning exercises.

Look at this mess: https://www.hackerrank.com/challenges/preprocessor-solution

This purports to be teaching use of the preprocessor. It's horrific. It's someone's amusing "look at how you can create your own programming language by abusing the preprocessor" mess; fair enough, it's always fun to see someone do something this painful, but this is being genuinely presented as a preprocessor learning exercise.

One day I'll have to work with people who learned from this (and others like it), and it will be a long painful road to help them unlearn.


  One day I'll have to work with people who learned from
  this (and others like it), and it will be a long painful road
  to help them unlearn.
Quoting from the link you've mentioned:

  #define add(a, b) a + b
I couldn't agree with you more.

--

To elaborate:

  add(1, 8) * add(5, 1)
wouldn't yield 9 * 6 = 54, but

  1 + 8 * 5 + 1
which is 42. One might say that it's correct, since it's the Answer to the Ultimate Question of Life, the Universe and Everything.


Which is why anyone who has experience programming in C would instead write:

  #define add(a, b) ((a) + (b))


One may also do:

  add(x, foo(x))
Which is why anyone who has experience programming in C would use (an inline) function instead.


I loathe algorithm questions, and I almost never ask them when I'm interviewing a candidate. Too often have I seen candidates who ace 6 rounds of algorithm interviews, only to struggle when it comes to building actual products, requiring a ton of handholding.

On the other hand, I've seen candidates who fail interviews that are algorithm heavy, but have done exceptionally well when it comes to the practical world - doing actual work and building apps and contributing to the team instead of writing the next (insert your fav tricky algorithm question here) solver.

It's unfortunate that while many companies that are hiring are just run-of-the-mill SaaS and apps companies that don't require you as an engineer to use algorithms or DP on a daily basis (or even ever), you still see algorithm heavy interviews at these companies.

On Hackerrank, I don't necessarily hate it as a tool, but just the questions that get asked through it. I don't like that it automates an interview process to a certain extent, as a candidate's potential and skills and experience and fit in the team can't exactly be measured via an automated process but requires actual human to human interaction. If a company rejects you on the basis of failing a Hackerrank question and hasn't even talked to you, you're better off working for a different company.


I've also seen a similar tool that claims to do a partial evaluation of a candidate's code. Guess what it does! Assume that the expected answer was

   42
and your code emitted

   4
it'd give you 50% marks for the test case.

--

As an aside, such tools would give you a 0 even if you coded the perfect algorithm but goofed up the final printf.

Robotic evaluations might work, but not in the current form.


I've had a similar problem with one of the more recent HackerRank challenges. Part of the input was a set of value pairs; the example contained only two pairs, and the ordering of the pairs was not clearly specified, i.e. it could have been:

x1 x2 x3

y1 y2 y3

or:

x1 y1

x2 y2

x3 y3

The worst part was that their example still produced the same result if you read the values in the wrong order! I spent 40 minutes debugging my solution, not understanding why my test cases worked perfectly but HR would not accept the solution.


I completely agree with you. Measuring our skill via HackerRank is complete bullshit. I'm 100% sure every programmer-beginner (even every math student) can solve the problems on HackerRank, but can't deploy an application to a server, nor deal with real customers, nor read a build script, nor... (imagine a lot more things here).


I have 6+ years of work experience as a developer, and over the last two months I have had 36+ pre-selection interviews via HackerRank with a wide range of small to medium-sized companies. I am not ashamed to say that I have failed every single one of them, but I have to recognize that I have learned more during the last couple of weeks about algorithms, data structures and prioritization than in most of my career.

This week I was invited by an in-house recruiter from one of the "Big 4" to solve two coding problems via HackerRank in 120 minutes, plus a third exercise asking about the time and space complexity of my solution. I am 99% sure that I will fail this pre-selection too, but I really do not care; the more I practice now, the more opportunities I will have next time. I have talked with people who were hired by Google, Amazon, Booking.com after 8-12 months of being unemployed, so — in my case — two months is certainly nothing. I can use the next six months to train myself, and maybe next year one of these companies will extend an offer, and then I will forget about all this hiring madness.


Knowing the implementation of a specific algorithm is much less important than knowing what class of problem you're facing, where and how to look up the details of algorithms that could help you and how to work well with others to get the work done. If your employer uses hackerrank to choose candidates, you really don't want to work there. People who do those things well are often the worst kind of people to hire: they're really good at certain types of problems but suck at almost everything else. I end up cleaning up systems after these people all the time everywhere I encounter them in the workplace.


Exactly this. The hiring process cuts both ways. Unless you're desperate, you should also be evaluating your employer, and the quickest way to make me walk away is to filter candidates through something like hackerrank rather than something much more applicable to what I'd actually be doing as a programmer at the company. It shows me a big disconnect between hiring and the day-to-day, and gives me no indication of the skill level of the team I'd be working with.


You are overstating your case. Hackerrank is useful as a pre-screen. Give a fizz-buzz type problem on Hackerrank, then proceed with the normal interview process. The kinds of people you are describing would be filtered out in the subsequent interviews.
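Agreed. For the sake of argument, here's roughly the level a fizz-buzz pre-screen sits at, sketched in C (my own version):

```c
#include <stdio.h>

/* Writes the FizzBuzz answer for i into buf (of size n) and returns it. */
const char *fizzbuzz(int i, char *buf, size_t n) {
    if (i % 15 == 0)
        snprintf(buf, n, "FizzBuzz");   /* divisible by both 3 and 5 */
    else if (i % 3 == 0)
        snprintf(buf, n, "Fizz");
    else if (i % 5 == 0)
        snprintf(buf, n, "Buzz");
    else
        snprintf(buf, n, "%d", i);
    return buf;
}
```

Anyone who can't produce something like this in a few minutes probably shouldn't proceed to the onsite; anyone who can has told you very little beyond that.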


I can only hope you're right. The tone of the OP doesn't directly support either/or however.


One of the things that always interests me around our industry is this.

I'm wondering how other industries do it, I mean, once Doctors get their medical license, if they want to move to a different clinic or hospital, do they have to attend an interview demonstrating their knowledge, doing a whiteboard session on a "Dr House" style medical problem that they have to diagnose in < 10 minutes?

Or do they present their medical credentials, and get interviewed on their bedside manner, their ability to work in a team (if applicable), anecdotes about their past experiences etc?


Med school is difficult. An internship is difficult. A fellowship program is difficult. Undergraduate degrees are a rubber stamp, CS included.

There are 20k oncologists across all fields of oncology within the US. That's also about the number of engineers Google employs. Most other industries we like to compare ourselves to are leagues above ours in individual merit. My father is an oncologist, and I'm reasonably confident he has some familiarity with every genitourinary oncologist in North America, Australia, and Europe. More importantly, he's on a first name basis with all of their educators. When he needs to hire a new doctor, he doesn't post an ad on health stack exchange. He makes an offer to a specific individual who he already knows.

One thing that we as an industry fail to understand is that we're not special. We desperately claw to it in these conversations. I'm willing to admit it: I'm easily replaceable. Very few of us have any name recognition that exists in other fields. I worked in finance for half a decade before moving to software. When I'd go to interview, people already knew who I was because of the basic human interaction I had as part of my job. When I walk into the door of my next interview, the only thing people know about me is what's on my resume/blog/stack overflow answers.

Personally, I find technical interviews to be a cheap and easy filter. You may not always get the best person from your pool of applicants, but you get someone that's better than most of them. The marginal benefit of one vs. the other is rarely meaningful. OP complained about having to do this, but some of the other applicants might have found it difficult. Sounds like it was successful.


My sister is a doctor. In Australia at least, they generally don't bother with interviews or resumes. Right out of university they write down preferences for where they would like to work, and based on their grades they are assigned a position. After that, when moving from one stage of training to another (e.g. resident to registrar or registrar to GP/speciality) they take another test and they are given a position based on their grades on the test. It's like university, but extended to the rest of your life.

Also, in Australia there is the option to do medicine as a 6-year undergraduate degree, which ends up being a very relaxed and easygoing degree. Residency and registrar are 9-to-5 jobs, and as a registrar you have a six-figure salary.

Of course we also have the 4 year long post graduate medicine degree, which would be much harder of course.


Here's how I was assessed for my job at Pivotal:

1. I did a simple tech screen (the RPI). 1 hour. My interviewer had a laptop and asked me questions about what to do next in the scenario.

2. Hey, come pair with this engineer on this real code on a real project on a real task.

3. How about lunch?

4. Hey, let's have you pair with this other engineer on real code on a real project on a real task.

5. Get offered a job on my way out the door.

I know from feedback that we don't always do this right, that we sometimes drop the ball, that many people find the RPI or the pairing to be frustrating, intimidating, or uncomfortable.

Most importantly, many people find that they just don't want to work the way we prefer. Which is good! It saves them the unhappiness of committing to a situation they won't enjoy.

But the core insight is: the best way to see if someone can work alongside us on a real problem is to ask them to come in to work alongside us on a real problem.

It's the best proxy we have short of hiring you. When it's available as an option to do this, I don't understand why anyone would choose a less accurate proxy.


I really like this approach, and reading about it here makes me want to try it on our next hire. We do pair programming, so it would work well for us.

What I've been doing which is similar is hiring the dev on contract for a short term to see if it works out. But I think this approach is much more efficient. Do you get and check references?


I suggested a try-before-you-buy contract to a few companies that I applied with and none of them were interested. Maybe it is because the company is afraid of the contractor saying "What is the upside for me to join you as an employee? Just extend my contract if you want to keep me around." :-P


So far as I'm aware, we don't go into references. I don't know whether that's because we prefer not to or because we don't have the bandwidth.


Pairing for interviews is a good idea if you pair. But it's not a good idea if you don't: I don't like pairing, and I don't really want to work somewhere where it's a big part of what they do. I won't perform as well in an interview pairing, which is useful information if you do pair, and irrelevant if not.


Engineers like yourself, who don't want to pair, self-select out before applying.

Which is fine! There's plenty of variety in this industry to find preferred practices, peers and environment.


You are right.

I like pairing. Once a week.

Being obligated to pair and doing it every day is No Way Jose for me


It's also a fantastic way to see how one thinks, one's problem solving approach, debugging skills and the like.


It is!

I hasten to add that it isn't perfect. We hire across the intro-extraversion spectrum, but there's an ongoing concern that we're biased towards extraverts. Especially in Labs, which is the consulting wing.

Another problem is that many candidates are just plain nervous. We do our best to set people at ease and to be upfront that there's no right or wrong or trick answers. But interviewing is just scary. I expected to fail and so felt no pressure -- had I felt that more was on the line, maybe I'd have done worse.

The third -- this is a very common negative opinion -- is the argument that we don't give candidates a fair opportunity to show their expertise.

We will usually try to assign one project where their résumé claims expertise and another one that they're unfamiliar with. The former to take a sounding of their expertise, the second to get a feeling for their approach to the unknown.

It's not always possible to do this, simply because it's a vast field and candidates come with very varied backgrounds. And those candidates who are declined often feel that we've denied them a fair chance by throwing them into an unfamiliar technology.

These are all fair criticisms. My best answer is: we are not trying to trick or exclude anyone upfront. Ultimately the hiring decision is made by future peers, so we want to be fair but firm.

Hiring is just hard.


FWIW, I had a very positive experience with the Pivotal Labs interview process last year. Really enjoyed the pair programming sessions and got a very good idea what the job would be like day to day.


"Hey, come pair with this engineer on this real code on a real project on a real task."

I don't understand how I could pair-program effectively on a code base that I'm seeing for the first time and don't know the first thing about. What would you expect me to be able to contribute?


> I don't understand how I could pair-program effectively on a code base that I'm seeing for the first time and don't know the first thing about.

Nobody expects you to magically know anything about code you've never seen before. That would be absurd.

The point is to get a sense of how you think and work as an engineer. We deliberately say that there are no "wrong" answers, that it's impossible to fuck up.

We want to know about the candidate. The decision is about hiring a person, not a list of codebases.

(Sorry for the slow reply, I was rate-limited.)


I face this problem. I am into server-side scripting and automation, and this hardly involves algos and the like. The problem doesn't seem to lie with HackerRank but with a one-size-fits-all HR approach. I would also pin the blame on a lack of understanding on the part of HR personnel (they seem to be the drivers behind recruitment drives) of what the tech landscape looks like. They need to educate themselves on what sort of interview process to conduct for different profiles.


That seems... limiting. Are you sure you want to specialize that much, and rule out doing any programming that does require understanding something about algorithms?


I do know algorithms; it's just that I don't use them in my day-to-day activities. When confronted with an online challenge under time constraints I seem not to cope: since I hardly use algos in my day-to-day automation, they aren't right at my fingertips!

Without time constraints I am able to solve the problems.

Timed algo assessments are very similar to the exams you write in university: you need to keep practicing to complete those assessments in time and get a "pass mark".


This is one of the reasons I'm eyeing management for long term career growth. I've learned that the problems given to sr. and jr. engineers are roughly the same, and that the interview process is biased towards jr.


I think the same way. Mostly biased towards jr. And management seems like a good option for long term career growth.


Isn't it a bad omen for any industry when the most realistic career advancement path is management?


Isn't that how most industries work, from flipping burgers to the legal profession? Why should tech be any different?


True, and the industry prefers younger people because they can burn more hours and are available cheaper than the experienced folks.

Also, the industry somehow seems to stereotype developers as young geeks. The stereotype works against experienced folks sometimes (not always).


If we had a sane choice we would still stick to tech - tech is like a first love going through a rocky phase!

I believe there is a "collective stupidity" problem in the tech industry, where a new framework pops up almost every day, a new language every now and then, and the sheer volume of knowledge you have to accumulate to stay relevant is absurdly high! With the realities of life taking over, the path is inevitable for most senior people.


I think you're underestimating how hard interviewing can be for new grads. There are some that are good and others who lack the background or the confidence.

It also seems a bit weird to give senior engineers easier questions. Do you really think people get worse with experience?


I think the skill set changes with experience, but the problems given in interviews don't vary much. For example, a fresh grad is more likely to be able to correctly implement merge sort from memory than a sr. engineer, simply because they've seen the problem more recently. Whereas a sr. engineer should be able to better think through "soft skill" problems like estimation, expectation alignment, and how to deploy complicated changes with little impact to the business.

I've come to learn coding is the easy part. Where it gets tricky is when you introduce other people.


@OP: You sound almost exactly like me. I have not done any interviews in a bit more than 5 years. What I think really can be improved is testing us in a more familiar and friendly environment. I have to spend time handling time pressure and fixing/adapting my thinking habits.

## Coding Problems:

I do think coding challenges (algorithmic ones) have merit, but I think there are two factors that really hinder an interviewee's performance:

- time limit

- require an unfamiliar algorithm

Time limits are obviously needed in a test, but problem solving in real life is never instant unless it's been done before. Nevertheless, everyone has their own pace. I have to dedicate hours each day to train my brain to act quicker yet stay calm for these algorithmic questions. Sometimes my initial solution turns out to be the best, but I discard it. And if one is stuck, the time limit just makes it worse. I'm also known to be the slowest in turn-based board games -- I have high win rates though!

## Possible Solution?

* Allow the interviewee to pick from a pool of equally difficult challenges to solve (choosing within a minute or two). This can solve the "obscure algorithm" or "trick question" problem. It also helps with the time-limit problem, as the interviewee WILL interpret the question twice in two different VIBES (with and without time pressure).

Even choosing 1 out of 2 would dramatically reduce nervousness and time pressure.


I think the problem is that people generally look to the big tech companies like Google for guidance on designing their interview process, and for those companies it may actually matter whether your approach is O(n) or O(log n).

Big fan of 'homework' to walk through/extend in the onsite interview. The homework should avoid any UI elements and ideally just talk to a database or another API. Another thing to do is try problems that require the candidate to learn something new, like a new language, database design, cryptography, distributed consensus, machine learning, geo-fencing, telephony, etc. Strong candidates learn very quickly and enjoy learning new things, which usually becomes evident pretty quickly.


One company I applied at last year used hackerrank as part of the interview process. There were two tasks, neither was an "algorithm puzzle". The first was a very simple one that can be checked for correctness automatically (e.g. "remove duplicates from this list of email addresses").
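That first task is easy to check automatically. For illustration, a rough C sketch of one way to do it (my own, not the actual assignment):

```c
#include <ctype.h>
#include <stddef.h>

/* Case-insensitive string equality (portable strcasecmp substitute). */
static int ieq(const char *a, const char *b) {
    while (*a && *b) {
        if (tolower((unsigned char)*a) != tolower((unsigned char)*b))
            return 0;
        a++; b++;
    }
    return *a == *b;
}

/* In-place O(n^2) dedupe keeping the first occurrence; returns new count. */
size_t dedupe_emails(const char **emails, size_t n) {
    size_t kept = 0;
    for (size_t i = 0; i < n; i++) {
        int dup = 0;
        for (size_t j = 0; j < kept; j++)
            if (ieq(emails[i], emails[j])) { dup = 1; break; }
        if (!dup)
            emails[kept++] = emails[i];
    }
    return kept;
}
```

Quadratic, but for an interview-sized list that's fine, and it keeps the first spelling of each address.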

The second one was basically a homework assignment, just timeboxed to 45 minutes or so ("refactor this simple frontend app"). In the following call I expected that we would talk about my solution to this task, but we mostly talked about previous projects I've worked on (which was a better use of our time, IMHO).

So I guess what I'm saying is that it is possible to make good use of sites like hackerrank.


The important thing is not to ace the final result. The important thing is to communicate your thought process, identify bugs, talk about options in architecture and design, stay calm under pressure and keep a positive attitude even when asked to do something you think is silly.

The interviewers know "this exact problem" is not generally applicable to their work. They are looking at how you interact with the problem.

Projecting a bad attitude toward being asked to code on a whiteboard is the opposite of what you should do and misses the point entirely. Code golf and competition (right answer in min time) is not the point.


You can't communicate your thought process on HackerRank or Triplebyte; it's a timed online test.


Organizations seek out formulaic ways of qualifying candidates because actually assessing an individual's talent and personality takes time and effort, and is a significant investment. The pressure to streamline the process grows as the company grows and hiring becomes a more frequent thing. It's one of the main reasons I like working for small companies. As a case in point I'll describe the process that led to my current position.

After the application and a brief phone call I was given a take-home project with a one-week deadline. The project was directly related to what the company does, and was interesting and fun to do. I submitted a pull request in four days. The PR was reviewed by their engineering team and they all voted to move me forward. The next step was a work-along day, for which I was paid. These are typically done in person, but for various reasons on both sides we did it remotely. The entire engineering team participated in a dedicated Slack channel as we walked through my homework project, suggested and implemented changes, joked and in general had a good time. At the end of the day I said my goodbyes, and an hour later the recruiter called to tell me they were preparing an offer.

The advantages of this process should be readily apparent. By the time my first day arrived we already knew we'd get along, approached work in compatible ways, etc. The costs of the process should also be readily apparent, and it would probably be really hard for a larger company to do things this way.


I don't mind algorithms and brain teasers if they are presented well. As in, the interviewer makes it clear they aren't looking for a single right answer but want to observe how you solve a problem. Plus I enjoy a good puzzle.

My problem with hackerrank is that the problem statements are often unclear and the time limits artificially short. I feel like it forces me to write unreadable, sloppy code just to get any solution out the door before the timer runs out.


These things are really just filters. Due to the legal aspects of hiring/recruiting, especially at large companies, these exercises are par for the course. Once you're in a large company, firing can be very difficult, so I'm all for filters.

Personally, after having taken many of these tests myself, if you're having even minor problems, that is a huge red flag for me. They're really just binary filters on basic skill level.


With the USA's at-will employment and limited worker protections, you must be kidding about how hard it is to fire people.


In my experience, it is very hard to fire people if they are simply incompetent. You can fire them for mistakes, but just sucking at one's job is this awkward grey area where you wrestle with HR and they usually suggest investing in training, etc. That is fine and good, but when I've got a deadline, I'm not going to tow deadwood.


Not many within a team that is hiring have either the time or the skills to assess candidates. Employers are looking for a magic solution to fulfill their staffing needs.

Companies such as Hackerrank have thereby managed to convince employers that they have the solution to all the hiring problems. Essentially a magic wand which would enable them to hire Einsteins.


As an aside....

I find it ironic that almost all companies talk about "culture, family, work-life-balance" yet they treat recruiting as robot selection....

IQ Hazing is a good way to put it...

but how do you measure, "Do I even want to fucking work with these people/this company???"


Our hiring process uses only homework. We give you a problem to solve, a single Java file to do code review on, and some database problems you have to solve with MongoDB. Based on how well you do, we go to a phone interview, and if that goes well we bring you in for an onsite interview. I tried HackerRank a while back but it did not give us much value. The competition on it can be fun for programmers who like challenges like that, but it gives almost zero indication of how well you are going to do in a production environment, working on a mission-critical system with a team, where your code runs on real infrastructure. Using it as a hiring tool is fundamentally flawed in my opinion.


If you look at it, the main problem here seems to be that the way evaluation works (even at the first level of the funnel) has not evolved over the years. People ask algorithm-based questions when assessing developers because there's simply no other tool that lets you evaluate the skills that actually matter. For example, how do you evaluate someone on their understanding of JavaScript's prototypal inheritance? A text-I/O-based problem (which HackerRank and others use) cannot help you do this, so you resort to MCQs. Asking a JavaScript guy to solve a problem using dynamic programming simply doesn't make sense.

Technical evaluation needs to evolve.


It's annoying, I agree, and sometimes on hackerrank the problems are miscalibrated, but the best way to handle it is to just do it. They aren't that hard, so spend a week or so getting your skill level up.


I think the point of the take-home code example is just to filter out candidates who don't really know how to code. So, I personally think it's ok if you don't pass ALL the tests. An experienced candidate, when presented with a DP problem, should at least be able to code up a recursive solution, maybe with memoization. They may not get the DP solution correct but should at least have a rough idea of how it should work. You SHOULD NOT have to be intimately familiar with competitive programming to be able to do those things.
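To illustrate the level being asked for here, a minimal Python sketch (using a hypothetical stair-climbing puzzle as the DP problem, since no specific one is named): a plain recursive solution with memoization bolted on via `lru_cache`.

```python
from functools import lru_cache

# Hypothetical warm-up DP problem: count the ways to climb n steps
# taking 1 or 2 steps at a time.
@lru_cache(maxsize=None)
def climb(n: int) -> int:
    if n < 0:
        return 0
    if n == 0:
        return 1  # exactly one way: stop here
    return climb(n - 1) + climb(n - 2)

print(climb(10))  # 89
```

The memoized version already runs in O(n); a bottom-up table would be the "proper" DP answer, but the recursive form is usually enough to show you understand the structure of the problem.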


I'm going through this right now. I've taken some time (1+ years) off and do admit to being rusty. However, I've dedicated time to studying, and I'm not getting past some technical phone screens because of these algorithms-heavy questions. I do well enough on them yet am passed over. I thought the market was "hot" and this would be relatively straightforward, but it has been anything but easy. I'm also starting to wonder if there are other factors at play (ageism, being female).


What about for people with no experience or education? I'd love the opportunity to prove myself through testing rather than relying on credentials I don't have.


The worst part of HackerRank is that it takes arguments as STDIN, forcing you to do a "read from STDIN" ceremony before actually getting to the problem.


If reading from STDIN is something that's considered a "ceremony", I'm already concerned about the rest of the code.

STDIN is one of the most common input methods I use for most of the commands I interact with on a daily basis. It would be like discovering that somebody found "logging errors to STDERR" to be a "ceremony"


In real life, I agree. But if I'm trying to solve an array sorting problem, why can't they just give me an array? LeetCode does it and it works fine. I get to focus on how to sort the array.

The other commenter (natdempk) makes some good points though.


The days of Perl CGI are a distant memory. More than that, it is a distraction. I'd be happier with the use of a parameterized test framework that lets you get to the point much more quickly.


If you work with a modern framework, chances are high you'll never actually read from STDIN. For example, I have no idea how I'd read from standard input in ObjC or Swift.


I could see why that might be annoying, but it forces the coder to make choices about their data structure and makes it easier for hacker rank to support many languages.


In most of the problems I've done, there's boilerplate code to read from standard input supplied as part of the problem. You just have to implement the algorithm.

In the problems where it's not, yeah, it's annoying. But it's also the same one or two lines every time, so it's not a big deal.
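For what it's worth, the "same one or two lines" usually look something like this (a Python sketch; the exact input format varies by problem):

```python
import sys

def parse_input(raw: str) -> list[int]:
    """Parse a common HackerRank-style format: n on the first line,
    then n space-separated integers on the next."""
    tokens = raw.split()
    n = int(tokens[0])
    return [int(t) for t in tokens[1:1 + n]]

if __name__ == "__main__":
    arr = parse_input(sys.stdin.read())
    print(sum(arr))  # stand-in for the actual problem logic
```

Keeping the parsing in its own function also makes the "ceremony" trivially testable, separate from the algorithm itself.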


There is no shortage of reasonably competent software developers in the USA and Western Europe. The very existence of things like Codility and HackerRank proves that. However, there is a significant shortage of sanity and common sense in the culture of developer hiring.


I just wouldn't apply anywhere that does that. It's a ridiculous way of testing a developer's fit for the average role, and a strong warning sign that it'd be somewhere not to work.


Maybe what you should really hate is a company that adopts the en vogue trend as its standard for recruitment. Think about what that says about the team that passed their interview.


Well said!


If you don't like HackerRank, suggest a technical phone screen over a shared notebook, or a homework problem, as an alternative. See how that works.


Hiring manager who recently added HackerRank to our interview process here.

While far from perfect, I think these types of systems do have some advantages. Keep in mind, I think they are best used as a tool for pre-screening candidates for graduate positions (where we have a LOT of applicants), or candidates we may otherwise pass on due to a lack of well known engineering school or well known companies on their resume (and I'm sensitive to this given that I moved to SF with neither of these). Also, my company is in a very technical problem space, so we do actually use algorithms + data structures on a daily basis.

* I don't buy the "I have 5 years of experience, I should be exempt from coding in HackerRank / phone screens / on-site technical questions" argument. I've done interviews with many people with years of experience and Senior Engineer on their resume, who are unable to solve trivial problems like finding simple patterns in an array. This might not be the majority, but it's enough to create a lot of noise in resume screening.

* As a hiring manager, my job is to make sure that engineers on our team are not getting pulled from their day to day work to do phone or on-site interviews with sub-par candidates. While lots of people on HN tend to complain about interview processes, the reality is that once you start at a job, most of the time you want to focus on writing code and solving technical problems, not performing multiple phone screens per day. Designing a good interview process involves BOTH creating a good experience for the candidate, and not overwhelming your existing team.

* Certainly a strictly better alternative is take-home challenges (which we used to use, and still do for some candidates). However, to get any valuable information from these (and do justice to candidates who spent a couple of hours building something), an engineer on our team has to spend time unzipping, running and looking through them, and writing up their thoughts. This might take 30 minutes of their time, and probably an hour or more out of their flow. To do this with more than a couple of candidates per week is not possible (not to mention the fact that, understandably, engineers might not get around to reviewing it for a few days, which is not fair to candidates). For this reason, I think simpler HackerRank-type challenges are a better way of pre-screening candidates.

* As a candidate, HackerRank is one of the easiest possible steps for you to pass. Almost all of the problems are up on their website! They may not be exactly the same as the ones given to you by specific companies, but there is a lot of overlap in these types of questions. If you spend a few hours practicing you will be able to ace almost any HackerRank challenge given to you.

That said, HackerRank is a tool, and I think there are a few implementation details needed to make it work well:

* Many of the suggested questions for candidates are terrible (e.g. "will this code compile", or really unclear problem descriptions). For our quiz, I chose all the questions and answered them myself before ever giving them to candidates. If a company lets their recruiters set up a default quiz, it will be really bad for candidates.

* As I mentioned, we usually use this for grads, or candidates who we are not sure about based on resume alone. If you come in through a referral, cold outreach, or TripleByte (who only work with really high quality candidates) you usually get to skip this step.

* I don't think these systems can ever tell you how good a candidate is. They can and should only be used as a method of filtering out candidates who don't meet a minimum standard. As others have mentioned, writing algorithms is only part of the job of a good engineer, and these tests do nothing to assess your architectural skills, teamwork skills, motivation level, etc. For this reason we only use it as one part of our hiring process, as a minimum bar for entry into further interviews.

I'm also constantly looking for ways to improve our hiring process, so open to suggestions to any of the above.


Sort of surprised to see most of the comments on here actually defending the companies that put so much weight on these types of tests.

I think many people assume that interviewers are looking for "thought process," but from my experience as a hiring manager for 6 years, in reality you'll find that most are just looking to "gotcha" the candidate. Many interviewers seem to enjoy making a candidate "sweat" as some source of pride. Again, not saying everyone does this, but often interviewers believe that the harder and more obscure a programming question is, the better.

We should all agree that coding tests are helpful in assessing a candidate's programming capabilities, but not all coding tests are equal.

From my experience, bad coding tests that have little relation to whether a candidate will do a good job are these arbitrary ones. For example:

* Implement a merge/quick/radix sort algorithm that you maybe did 10 years ago in college and have never had to do since.

* Implement a linked list/hashmap/some other random data structure in Java, even though you would never write one yourself.

* Write a program to determine whether a string is a palindrome.

* Implement an algorithm to solve this random problem from Project Euler

Ones that have worked better attempt to be comparable to what candidates might actually do at the company:

* FizzBuzz - (While controversial, this helps weed out people who just don't know how to code)

* Build a JSON REST API in whatever language you want to manage groceries in a shopping cart.

* Write a web scraper in whatever language you want to count the most popular words on a website

* Here is a random UI framework that you have never used, use whatever documentation you can find on the web and write a To Do list application with it.
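The FizzBuzz item above really is just a few lines; a sketch in Python:

```python
def fizzbuzz(n: int) -> list[str]:
    """Classic screening exercise: 'Fizz' for multiples of 3,
    'Buzz' for multiples of 5, 'FizzBuzz' for both."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out

print(fizzbuzz(15))
```

That it still weeds candidates out, despite being this small, is exactly the point being made above.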

Again, YMMV, and depending on your domain certain questions make more sense to ask than others. If you're interviewing as a researcher for Google/Amazon/IBM/Microsoft, then you actually might need to know how to implement some random sorting algorithm, because you may need to implement it in some new SDK/library. But I don't believe this makes sense for most companies.

If you are a hiring manager, ask yourself this: If you had to run one of your current (positive) team members through your current interview process, would they make it through? Would they say they had a positive interview experience?


> I think many people assume that interviewers are looking for "thought process," but from my experience as a hiring manager for 6 years, in reality you'll find that most are just looking to "gotcha" the candidate. Many interviewers seem to enjoy making a candidate "sweat" as some source of pride.

What you describe is a horrible approach to interviewing, and points towards dysfunction more than anything else.

In my opinion, the second most appealing non-technical character trait for an engineer is empathy. (Curiosity being #1.) If you have an interviewer who goes on a power trip and actively tries to abuse a candidate, what does that tell you about their company?

Everything is PR, and how we interview engineers tells a lot about how we deal with each other. Would you like to work in a place where being nasty is considered normal - or even desirable?


> Everything is PR, and how we interview engineers tells a lot about how we deal with each other. Would you like to work in a place where being nasty is considered normal - or even desirable?

Couldn't agree more. I tell my employees this religiously.

I would also say that the 'power trip' isn't a boolean characteristic. There are different levels for different employees at companies. I think most of us have probably experienced or known those who tend to be on one side of the scale or the other.


Just to add two points that bother me the most:

1. It doesn't actually test my ability. Most of the time there is a Stack Overflow solution, and I'm going to just look it up and regurgitate it. I can't remember where I read this, but allegedly it took Knuth a day of thinking to come up with the optimal solution for one of these standard challenges (it was either the maximum subarray sum or the stock sell problem).

2. It's all very well preparing for these interviews at my age (under 30, no responsibilities or family, generous severance from my previous employer); however, what happens 10 years down the line, with responsibilities and hungry mouths to feed?
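For reference, the maximum subarray problem mentioned in point 1 does have a famously compact O(n) answer (Kadane's algorithm) — exactly the kind of thing that is trivial to regurgitate once seen and hard to invent under a timer. A Python sketch:

```python
def max_subarray(nums: list[int]) -> int:
    """Kadane's algorithm: maximum sum of a contiguous subarray, O(n)."""
    best = cur = nums[0]
    for x in nums[1:]:
        cur = max(x, cur + x)    # extend the current run, or start fresh at x
        best = max(best, cur)
    return best

print(max_subarray([-2, 1, -3, 4, -1, 2, 1, -5, 4]))  # 6
```

Two lines of insight buried in a loop — which rather supports the point that such questions measure exposure, not ability.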


Seriously why was this flagged? Can't there be some level of criticism without it being censored? Jesus christ anything not in the best interest of making YC money will be crushed.


That's a bad falsehood to spread, since it undermines users' faith in this community. The post was flagged because users flagged it. No one at YC censored it or ever would. The only thing moderators did to this post was unkill (i.e. reopen) the thread after the user flags were enough to kill (i.e. close) it.

HN moderators are strictly instructed to moderate the site less, not more, when a post says something critical of YC or a YC-funded startup. With a post like this one, we would normally have edited the baity title and downweighted the post for being a rant. But because the rant was against a YC startup, we did neither.

Such a policy doesn't stop people from accusing us falsely, but it does let us answer the accusations in good conscience. I couldn't imagine moderating HN without that.


I agree, not sure why this was flagged. The title is clickbait, but it seems this guy genuinely does hate HackerRank. Fair enough.


Because it's bad PR for HackerRank. That's why it was flagged.


Ironically often any PR is good PR. So many people who otherwise would not hear about hackerrank will now hear about it and many will use it regardless.


Possibly because I'm brand new (I'm not, I'm a very long time lurker and one time failed YC applicant).



