Exactly on point. One gripe I have with HackerRank (I did some of the challenges to brush up on some language knowledge) is that a lot of the questions are poorly designed, clearly written by amateurs.
I still think that neither whiteboarding nor HackerRank selects for good developers. In the end you get people who studied hard to solve a certain class of problem that you'll never need in the real world. It wastes candidates' time, and it still shows nothing, except that your candidates know something unrelated to the job.
IMO, what's relevant for a role is the skill set most people currently doing that role have and that's not always a good thing.
I've been the first shouting from the rooftops that we shouldn't ask people to know stupid little algorithm trivia to get a job. But the narrative for that quickly devolved into "data structures and algorithms are not important for day-to-day work". To which I honestly say bullshit. The industry is managing without it: between people who forgot everything from school, bootcamp grads, the self-taught, and people with unrelated degrees, people who can do actual computer science on the job are a tiny, tiny minority. So the solutions those people come up with to day-to-day problems are treated as a specialty (e.g. "machine learning!", "data science!").
But if doing that shit was just second nature to everyone, how would it change the way we do things? What if category theory wasn't so scary? What if people weren't scared of threads? What if hashmaps were not magical black boxes people use because the person who wrote the code before them did?
The foundation is becoming pretty darn weak if you ask me.
I totally agree shit like HackerRank is poorly done. But I'm not sold that the skillset it tries to assess (poorly!) is that useless. We, as a community and an industry, essentially did everything we could to MAKE IT useless. That's different I think.
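On the hashmap point above: there genuinely is no magic inside. A toy separate-chaining map fits in a couple dozen lines of Python (an illustrative sketch, not a production implementation):

```python
class MiniHashMap:
    """Toy separate-chaining hash map, to show there's no magic inside.

    Each key hashes to a bucket; collisions land in the same bucket's list.
    """

    def __init__(self, n_buckets=8):
        self.buckets = [[] for _ in range(n_buckets)]

    def _bucket(self, key):
        # Modular hashing picks which bucket holds this key.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                bucket[i] = (key, value)  # overwrite existing key
                return
        bucket.append((key, value))

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default


m = MiniHashMap()
m.put("a", 1)
m.put("b", 2)
m.put("a", 3)  # overwrites the first "a"
print(m.get("a"))  # 3
```

Real implementations add resizing and better collision handling, but the core idea is just this.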
So say I'm a Java developer. Surely, to be a good Java developer, I should know the innards of the JVM, right? To do that I now also have to know C++, because that's what the JVM is implemented in, or whatever. But surely I can't know C++ well if I don't know C, right? But then I can't really grasp C well enough if I don't understand how assembly works, right? What about CPU instructions, different architectures, etc.?
My opinion is that nobody should even try to drink what seems to be an ever-expanding ocean of knowledge in IT. Let the people (in academia or wherever) who work on algorithms work on algorithms, let the people who write frameworks continue writing frameworks, and let the rest of us code monkeys use all those digested tools and technologies to do what we have to do on a daily basis. You don't ask a common plumber to describe Bernoulli's principle during an interview, do you?
In addition, I think the argument falls apart the further down the stack you go. Granted, I am no C expert, but I don't think you need to know any assembly to be a very good C developer.
You can get away without knowing assembly, but you might as well just use a higher level language at that point.
As a software developer I see myself mainly as someone who solves problems to help others do their work better. So, while I agree that having a strong CS background is definitely a benefit, I don't think it matters that much if you can't get binary trees just right after not needing to implement them for years.
Yup. And my feeling is that (assuming the perf issue is related to a core data structure) that -particular- perf issue shouldn't even have happened in the first place. It should have been obvious given the right background, and you could have spent that time doing something less fundamental.
Do you have to reach out for google or a book to do a simple if/else statement? What about a for loop? Now what about the difference between a map and a set? Likely not, it's the basics used in any app.
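That level of "basics" is roughly this (a hypothetical Python snippet; a dict plays the role of a map):

```python
# A dict (map) associates keys with values; a set only stores unique members.
word_counts = {}      # map: word -> count
seen_words = set()    # set: membership only, no values

for word in ["a", "b", "a", "c", "b", "a"]:
    word_counts[word] = word_counts.get(word, 0) + 1
    seen_words.add(word)

print(word_counts)         # {'a': 3, 'b': 2, 'c': 1}
print(sorted(seen_words))  # ['a', 'b', 'c']
```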
Talking about premature optimization here implies there's some work to be done, things to think through, stuff that's not obvious. And that's totally true at a certain complexity level.
But what is "complex" and what is "obvious" isn't objective. It's purely a factor of what the "average software developer" knows.
What I'm arguing is that while at one point the pendulum was swinging too far one way, it's now too far in the other, and it's affecting software quality industry wide.
For example, why can't they just ask: OK, given an array, find all the pairs in the array. Great, now what's the big O of that? Great...
Instead they ask some trick question that disguises the fact that you should solve this problem by finding all pairs etc.
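The undisguised version of that question might look like this in Python (a sketch; the nested loops make the O(n^2) cost visible):

```python
def all_pairs(items):
    """Return every unordered pair from items.

    Two nested loops over n elements produce n*(n-1)/2 pairs,
    so both time and output size are O(n^2).
    """
    pairs = []
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            pairs.append((items[i], items[j]))
    return pairs


print(all_pairs([1, 2, 3]))  # [(1, 2), (1, 3), (2, 3)]
```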
I have issues with people who dismiss the problem space entirely. "I don't need to know CS fundamentals to do my job". To which I say: you do. You're just doing a subpar job because -everyone else- is and that's the standard even though it's suboptimal.
(Note I don't have a CS background myself, and thought this for a long time, until I ended up working in environments that were CS heavy and seeing how "hard problems" were trivialized, so we could spend time solving "real" problems instead)
Raise your hand if hiring managers being bad at screening technical candidates for highly technical roles is surprising to you.
For the interviewer filling, say, an intermediate-level enterprise software backend position, who needs to quickly sift through 50 applicants, what is the better option by way of code challenges?
Code challenges are a dead end. You can have someone who passes every code challenge imaginable, but is absolutely awful to work with. Software development is more about communication and problem solving than anything else, so focus on those things first. Technical challenges are easier to overcome than personal ones! Here's my strategy:
- Resume smell test to filter out 90% of the applicants (spelling errors, run-on sentences, poor formatting, poor experience, etc. If the applicant doesn't care or know enough to fix these on their resume, chances are they will do the same to your code base)
- 20-minute in-person or on-camera interview smell test to filter out another 5% (punctuality, check for communication and personality problems, smoke test of technical knowledge)
- 1 hour+ in person interview to pick your candidate (deep dives into past projects, talk through or whiteboard a common problem in your industry, meet'n'greet with other team members)
- 3 month probation period to fully vet the new candidate and build trust.
There's no way that, out of 50 applicants, all of them are going to be equal, so don't treat them that way. You need to fast-track the very promising ones and get them to meet the team ASAP.
It could be as simple as: Given a list L of accounts, return only accounts from L that match key K. That's programming 101 type stuff that is applicable to the job.
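A minimal Python sketch of that kind of question (the record shape with a "key" field is hypothetical; adapt it to the real account type):

```python
def accounts_matching(accounts, key):
    """Return only the accounts whose 'key' field equals key.

    'accounts' is assumed to be a list of dicts with a 'key' field,
    a stand-in for whatever the real account record looks like.
    """
    return [acct for acct in accounts if acct.get("key") == key]


accounts = [
    {"key": "retail", "id": 1},
    {"key": "corporate", "id": 2},
    {"key": "retail", "id": 3},
]
print(accounts_matching(accounts, "retail"))  # the records with ids 1 and 3
```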
Or it could be more abstract and open-ended: given some user-submitted form data, what steps would you take to move the data from the browser front-end to the database? Assume the database has been created. Tailor the question to the technologies used on the job (Spring, .NET). This type of question gives you a whole host of information about the candidate, such as their overall knowledge of the framework, their areas of expertise, and potential weaknesses, as well as the ability to nudge the interviewee in the right direction if they're stuck.
Interview questions don't have to be clever. They don't need to have obscure edge cases to trip people up. I firmly believe that interviews should be about gauging candidate ability to be functional on the job rather than solve the riddle (how many piano tuners are in Chicago?) or non-applicable CS (create a script/program that can output the nth row of Pascal's Triangle [ignoring int overflow]).
Patrick McKenzie (patio11) created a lecture about this topic:
If you are short of time, here is a summary:
Edit: wow, I've clearly touched a nerve. Good thing your doctor is licensed to prescribe something for it instead of a rockstar named Chad who read about MongoDB once but can basically build Uber by this point.
|\ business | - excel macros
| \ + |
| \ users | - web dev
| \ |
| \ | - back end dev
| \ |
| \ | - framework dev
| \ |
| \ | - tooling
| \ |
| coding \| - OS / compiler / database dev
Today these authorities are universities.
A bad programmer could run up your AWS charges, or leak your clients' credit card numbers to an attacker, or other things like that, but these are just financial losses for businesses that don't do their due diligence (nobody's dying, going to jail, or having their house burn down).
> Moreover, have you also considered its benefits?
I don't think the benefits have been adequately explained to me. Maybe I could consider them if you told me what they are.
Because the healthcare job market is such a success! Remember those accusations against tech companies for colluding to depress wages? Now imagine the same, but with State immunity!
The anti-trust class-action lawsuit Jung v. AAMC alleged collusion to prevent American trainee doctors from negotiating for better working conditions. The working conditions of medical residents often involved 80- to 100-hour workweeks. The suit had some early success, but failed when the U.S. Congress enacted a statute exempting matching programs from federal anti-trust laws.
Unless you have a very good reason why it won't in this case, you're just playing with fire.
I think it stems from an intrinsic starting point among many developers that programming is a true meritocracy and that all hiring questions can be settled with a judicious study of what the person is truly capable of (eg. the fabled personal github account filled with side projects). And that anything other than "hacking on the code" is a waste of time.
Of course, this isn't how the real world works. In fact, almost no one has the time, energy, or inclination to do impressive side projects. Gauging skills on the basis of resumes and very short interviews is quite hard and time-consuming. Hiring decisions are made on the basis of essentially social interactions and personal references.
I do think there is room for an informal credentialing process in principle, but there are longstanding mental blocks that will need to be skirted. There will need to be leadership from the top to make this happen, ie. the best, most-respected programmers will need to take it seriously first.
Certification exams already exist, and employers are free to require them to avoid the "crap candidates". Other problems, like security flaws affecting the users, can be better solved by imposing real penalties on companies.
How could you do that for software engineers? There are so many programming languages and frameworks out there, all constantly evolving, and companies have different demands.
I think it would be a really hard problem to generalise.
An idea I have thought about is standardizing on some set of programming languages/frameworks/demands that are known to be supported for a very long time (15 years?). If what you need is in this buffet, you are fine and get the advantages this provides. If you have special demands, you are on your own, which is not a problem per se, but as with all special demands this simply leads to increased risk or cost.
I'm exhausted and frustrated with the field as it exists just now.
We've had about 60 years to figure this stuff out, and arguably the ground is always moving beneath our feet as the nature and scope of the systems we build keep changing.
Likewise, but I'm not sure what choice we have, except to hope Ray Kurzweil is right about some of his life-extension ideas. If we can survive another 300-400 years, maybe things will be better.
No one cares though.
With those kinds of numbers, how is anyone with the required 12+ years of experience who doesn't personally know a software PE (of which it seems there are maybe 100 in the United States, based on the age of the test and the pass rates) supposed to get the certification? I fully accept that my Poli Sci degree means I'll need more professional experience than a Comp Sci or Comp Eng graduate, but the requirement to work under another PE eliminates the vast majority of potential applicants.
Luckily it was on hackerrank, but only as a way to store the exercise instructions with a timer.
The test itself was not terribly difficult: build a small mobile app (I am a mobile dev). It was still enough to demonstrate how you architect an app, use good design patterns, write clean code, etc.
After that I had another round of interview with that same company.
I feared that the ridiculous whiteboard was on the menu. One of the engineers reassured me when I inquired about the content of this interview: "we think that whiteboards are bullshit, we just want to discuss".
In the end I was offered the job, with a really nice compensation package. I should mention that I live on another continent and did all these interviews remotely.
Is this a general trend in the Silicon Valley area, or was I just lucky to stumble on companies that don't believe in whiteboard questions?
They are entertaining to play with (solve), though I haven't encountered any serious algo problems during my 15 years of work. My friend, a game coder, always does.
Fast forward 4 years, and I realized that the dev team we have is basically a group of just-graduated Engineers or people with at most 1 year of experience.
My "eye-opener" moment was when I realized that by asking for difficult algorithms (tree and graph traversal, subsequences, etc.) the following was happening: the people who know the most about how to solve these types of problems are students, or people who graduated recently. Not only that, but this group is a very small sub-group of recent graduates, because these people dedicate their free time at uni to solving these problems and getting into ACM and IOI competitions. Thus, they don't play with different technologies, they often don't feel comfortable with the shell, and the type of systems they develop are optimized to be "used and thrown away".
My takeaway: surely we all need to brush up if we seriously want to apply for a job.
Yet it did feel like the screening was testing how good I am at solving small problems quickly.
And my 25 years of experience tell me that we don't need to solve problems quickly;
we need to solve them rightly.
And that takes "mulling over".
It was similar with a Ruby test, which I prepared for by running through the Ruby Koans. Lots of time was spent looking up syntax, and the vast majority of the time I'm not starting entire repos from scratch, so a lot of simple but not oft-repeated things can trip you up too.
And, to be quite frank, when I'm at home, and I want to program, I want to work on my over engineered, terribly underdeveloped personal website where I get to do things I don't normally do, putz around with erlang and postgres, do the devops and all that other stuff.
Ahh the afflictions of the affluent!
- Have great problem solving ability.
- Low emotional intelligence or self-esteem (don't realize that the kinds of companies that make you do these tests are exploitative and will treat you like livestock).
- Don't have specific ambitions and don't place much value on their own time (prepared to spend tons of time studying/practicing for the tests).
Unless you have kids, or a demanding job, or personal obligations, or you don't want to make learning sorting algorithms for a job in which you'll never need sorting algorithms a full-time job on top of your current full-time job.
It's just pre-selecting for a group you're in as opposed to a group you're not (the royal you :)).
Well, yeah. I do that too and I've been out of university for 5 years.
You have to prepare for interviews, and to resharpen your skills when you're changing jobs, it's the same whether you're 28 or 48.
I understand finding time is harder when you have a family and all, but it should not take much more than a few hours to get back to a decent skill level, and then you get to the interview stage where you can really stand out with your actual real world experience.
That is why these tests are bogus. If they measured useful skills they wouldn't require any study.
I've been working with a lot of graph algorithms lately. I've already forgotten several I learned a couple months ago. That is OK.
The more interesting question for interviewees is where they would find the answer rather than whether they can implement algorithm X from memory.
If a candidate says "I'd use A* and maybe compare it to similar algorithms to solve the problem" that's enough. There's no reason they need that permanently committed to memory. Almost no one writes A* twice a year.
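For what it's worth, the algorithm in question is short enough to rewrite from a reference whenever it's actually needed. Here is a minimal grid-based A* sketch in Python (the 0/1 grid encoding and Manhattan heuristic are my assumptions, not anything from the discussion):

```python
import heapq

def astar(grid, start, goal):
    """A* shortest-path length on a 2D grid of 0 (free) / 1 (wall).

    Uses the Manhattan distance as an admissible heuristic for
    4-directional movement. Returns the path length, or None if
    the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]  # (f = g + h, g, cell)
    best_g = {start: 0}
    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            return g
        if g > best_g.get(cell, float("inf")):
            continue  # stale heap entry, a shorter path was found already
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None


grid = [
    [0, 0, 0],
    [1, 1, 0],
    [0, 0, 0],
]
print(astar(grid, (0, 0), (2, 0)))  # 6: around the wall on the right
```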
But place yourself on the other side of the fence: you just opened a new position, there are a hundred applicants, and you have limited time to interview them all.
If you spend even one hour with each of them, that's at least 100 hours of interviews (easily double once you count debriefs), not accounting for the back and forth to find workable schedules, probably spread over several months since most people are only available for interviews from 17h to 20h or 12h to 14h.
Say you have 5 people dedicated to this, obviously technical ones, interviewing every day for 4 hours (2h of interview + 2h of debrief/internal discussion): that's still a whole month of interviews, every day, for 5 people who are not HR and probably have better things to do, all for one position.
It's simply not efficient, and smaller organizations cannot handle it.
Alternatively, you can easily filter out 50% by making them take a programming test, with few false negatives overall. There is a non-negligible portion of false positives as well, but now from 100 applicants you are down to 50, so interviews are much easier to manage, you have more time for each one, and the overall quality of interviewees is much higher.
So yes, in an ideal world, everyone would know the exact level of everyone just by looking at a CV and people would not need to prove anything by a test, but that's not how it is for now.
If you have a better solution, I'm sure a lot of people would be keen to hear it, because recruiting is very costly and arguably one of the most sensitive parts of building a company.
There are solutions like referrals and such, which work very well when they do, but when you have a lot of positions to fill it's hard to find enough people.
I would be careful with this kind of statement. It can also show that most programming jobs don't actually require a sophisticated level of knowledge.
As a crude comparison, it would be like a mathematician being asked to recite his times tables. Something he could probably do, given a bit of time and practice, but so far outside his normal day-to-day work that it doesn't tell you much other than that he spent time preparing. Which is fine, if that's what you're looking to measure.
I think the point that the grandparent is trying to make is that if you take a group of developers with a good amount of experience and who everyone agrees are skilled in their field and administer a test with no warning/preparation, and none of them score very well, are they suddenly not very good developers or is the test not a very good predictor of whether someone is a skilled developer?
HackerRank in fact seems really poorly suited to web development positions. No tests on CSS, HTML, design patterns (your model-view-* patterns, dependency injection, responsive web design, etc.), data transmission formats (e.g. JSON and XML), browser structure (e.g. the DOM), dynamic communication (e.g. Ajax/XHR, WebSockets), etc.
Heck, HackerRank may show that you can write some SQL, but ironically for something with "hacker" in its name, it can't show that you know how to mitigate the common "web hack" tricks (e.g. SQL injection, XSS vulnerabilities, URL fuzzing), something I would think would actually be way more relevant for any public CRUD web app.
It is just an advanced version of FizzBuzz: even if it's not enough to determine whether someone is good or not, it definitely catches people who are not able to code at all. It also provides talking points for the interview afterward, about coding style and so on.
Of course it's not sufficient, but it's just one stage among others if the interview is done correctly, and a useful one.
It really depends where you apply. Some companies have interview processes that require a very specific preparation and it can be very time-consuming.
I applied once for a quant position. The interviewer didn't really care about my resume, and we started directly with logic and probability puzzles (a few phone interviews). I was about 35 at the time, and this was the type of thing I would have been more comfortable doing as a 20-year-old math undergraduate. On one hand, I like that they put everyone on equal footing. On the other hand, I think it's a bit of a shame that they dismiss many years of experience.
I'm not saying I agree with the argument (old people have been complaining about young people for a very long time), but the logic is easy to follow.
The argument is that this is because they expect experienced developers to be able to enter at a higher level, but the cynical side of me thinks that it's because they really prefer new grads, and this tips the scales in their favor.
Remember, Zuckerberg said, "young people are just smarter". I fear that attitude is pervasive, but hidden under the surface; they're not going to just come out and say it, as that would be illegal.
And it's just a poor way to assess skill. Learn about modulus once and you'll pass every other whiz-bang test somebody thinks they're the first to give you.
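FizzBuzz itself is the canonical case; the whole "trick" is the modulus operator (a minimal Python version):

```python
def fizzbuzz(n):
    """Classic FizzBuzz: the entire 'trick' is the % (modulus) operator.

    % gives the remainder of integer division, so i % 3 == 0 means
    i is divisible by 3.
    """
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:        # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out


print(fizzbuzz(15)[-1])  # FizzBuzz
```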
You'll always learn more about a candidate's ability by just having a conversation about programming with them. Anyone worth their salt will not only be able to colloquially talk about their skillset, they'll enjoy it.
I am a bit late to this discussion, as Sarah Butcher (the author of the article) just informed me about this thread. This is Marc Adler, the person who was talked about in the article. It's great to see the discussion around HackerRank-type coding tests and the hiring process. There are probably some clarifications that need to accompany the article.
First, when I was interviewed for the article, there was no mention of the word "ageism". As someone in this thread surmised, it was probably inserted into the title for a bit of "shock value". The article was supposed to be about the correlation between HackerRank tests and the hiring of experienced people who have been coding successfully for a number of years on "real-world" problems. Many of you in this thread fall into that category.
Second, I have about 30 years of experience in the industry. I started out in the mid-1980s as a Windows developer at Goldman Sachs, writing equities trading systems, went on to form my own software company (which I had for ten years and concentrated on programming tools), and then continued as a developer/architect for more Wall Street companies. Among some of my roles were Chief Architect of Citigroup's equities division, and Chief Architect of MetLife (a global insurance company). I am currently CA at another big company that is not on Wall Street. All of this time, I have been coding. Recently, I developed an Uber clone in Scala/Akka/Play.
I am not completely against HackerRank-type tests for hiring junior devs or devs right out of college. But what I am against is the use of these coding tests to hire people who have 10, 20, or even 30 years of experience. When was the last time you used dynamic programming algorithms in your job? When was the last time you seriously thought about Big O? In my experience, a lot of apps use linked lists and hash tables. If you want a nice sorting algorithm, one comes with one of Microsoft's C# assemblies. What is more critical is knowing how to find information and how to apply it in your day-to-day job. And that's the kind of developer I want to hire ... one who is resourceful and productive.
It's impressive the number of supposedly senior candidates that can't follow simple instructions from the platform or write a couple of lines of code in their language of choice to sort the words in a string or something similar.
I fully agree that multiple complex algorithm or puzzle questions are bad: they require a longer time investment than most candidates should be willing to devote to such a process, they are distant from the reality of most programming tasks, and they favor those who enjoy and practice programming puzzles.
I really wish HackerRank would add more problems or let employers add their own questions. Are there any good alternatives to HR?
I like to add simple coding tasks, multiple-choice stuff, and a free-form text field with 500 chars to explain something.
They also allow candidates to add their own test cases easily, which, for the ones that do, tends to be a very strong signal in their favor (they also tend to do better).
In most of these coding tests the algorithm is easy, but figuring out the catch in the description is harder than the actual implementation (unless you're insanely limited and have to reimplement basics from scratch).
I do relate with the article. For experienced developers, it's probably not the best way to recruit. Ask something related to the job, not the degree.
I have only a vague idea of the requirements in financial technology, but aren't streams of financial data central to it? And repeating substrings correspond to pattern repetitions, so they might be important, especially the longest ones.
I think if you can find the longest repeating pattern in a stream of data, you should also be able to find maximal duplicated substrings in palindromes.
Maybe it's more about the way the problem is phrased rather than about the difficulty of coming up with a solution?
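For illustration, even a naive solution to "longest repeated substring" is short. Here is a sketch using binary search on the answer's length (not the optimal suffix-tree approach, just something readable):

```python
def longest_repeated_substring(s):
    """Longest substring occurring at least twice in s.

    If a repeat of length k exists, a repeat of length k-1 also exists,
    so we can binary-search on the length, checking each candidate
    length with a set of already-seen substrings.
    """
    def repeat_of_length(k):
        seen = set()
        for i in range(len(s) - k + 1):
            chunk = s[i:i + k]
            if chunk in seen:
                return chunk
            seen.add(chunk)
        return None

    best = ""
    lo, hi = 1, len(s) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        found = repeat_of_length(mid)
        if found is not None:
            best = found
            lo = mid + 1   # try longer repeats
        else:
            hi = mid - 1   # only shorter repeats can exist
    return best


print(longest_repeated_substring("banana"))  # ana
```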
Since they did not save money during the twenty years that they have been paid quite well, they don't like the prospect.
What kinds of challenges do you have in mind?
And contributions to tiny projects tend to get passed over anyway.
Moreover, certain hiring agreements would consider it breach of contract or confidence.
Making your own Github or such portfolio faces the same problems.