Hacker News

I've never understood why people hate them so much. From the employer's side, it only makes sense to get a feel for someone's abilities beyond an impression based on words alone.

You wouldn't believe the amount of shit solutions we've gotten from candidates. We just ask you to do a very simple kata: a tiny program that generates some console output, and you have to refactor it to make it prettier and add one feature. Literally half of the people fail to make it work. Many others show zero effort toward code cleanliness. That's all we ask: make it work and make it look pretty.
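For illustration, a kata in this spirit might look like the following (a made-up Python example, not the actual exercise): a messy report printer, cleaned up, with one feature added.

```python
# Hypothetical kata in the same spirit (not the actual exercise).

# Before: one tangled function, string concatenation in a loop, no seams
# for extension.
def report(d):
    s = ""
    for k in d:
        s = s + k + ": " + str(d[k]) + "\n"
    print(s)

# After: small named pieces; the "one new feature" (a total line) slots in
# cleanly instead of being bolted onto the loop.
def format_line(key, value):
    return f"{key}: {value}"

def format_report(data, with_total=False):
    lines = [format_line(k, v) for k, v in data.items()]
    if with_total:  # the added feature
        lines.append(format_line("total", sum(data.values())))
    return "\n".join(lines)

if __name__ == "__main__":
    print(format_report({"apples": 3, "pears": 2}, with_total=True))
```

The point of such an exercise is less the output than whether the candidate separates formatting from printing and leaves room for the next feature.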




> From the employer side of things it only makes sense to get a feeling for someone's abilities other than an impression based on words alone.

I'd like to believe this is true, but it fails to explain why candidates for other business functions don't receive the same scrutiny.

I'm not aware of analogous evaluations to get hired to other business roles (e.g. marketer candidates aren't asked to demonstrate a working knowledge of the Google ads dashboard, accountants aren't expected to clean up a fake P&L on their own time for review by hiring managers, etc).

I could be wrong and always welcome correction, but from anecdotal experience talking to friends and work colleagues, the bar for SWE hiring is much, much higher, even controlling for compensation.


I'd look at it the other way: Other high-difficulty jobs have mandatory licenses and certifications that weed out the chaff. Lawyers have the Bar exam, engineers have the Professional Engineering exam, doctors don't have a specific test but they have all of med school, EMTs need to get an EMS license/certification. Software engineers can get their foot in the door with a javascript coding bootcamp.


This explanation works for entry-level candidates but fails to explain why senior candidates are often expected to do similar exercises _in addition to_ any work experience they have.

New lawyers, doctors, and CPAs have to demonstrate textbook mastery to pass a handful of exams once in their career. Engineers are expected to demonstrate textbook mastery for every job they apply to _for their entire career_ (and often multiple times per application!)


> New lawyers, doctors, and CPAs

Everyone you mentioned here has some kind of ongoing public tally - Yelp/Google reviews, customer referrals that lead to new business or the lack thereof. If I'm looking at a crappy lawyer or accountant, they probably have a 2-star average in public reviews and/or are out of business because no one wants to refer clients to them. Is there an equivalent of that for a mid-career programmer?


I don't think this is true. Most of the doctors and lawyers I know work at big firms with a publicly reviewable presence, but there's no practical way to review individuals at those firms.


Accountants don't have to have a CPA. Half the accountants working under my partner (Accounting Manager at a large private university) don't even have a Bachelor's in Accounting.

> EMTs need to get an EMS license/certification

I love EMTs. I was one. I'm a paramedic. I train new EMTs. But the EMT course is 160 hours, and is designed and tested to be passable as a high school junior. Let's not use that as a comparison.

Most of these positions also have zero to minimal continuing education requirements which often, let's be real, are trivial. Quick online courses that can be busted out in a couple of hours, or "go to this hotel in a nice location, spend a couple of days, and go to the conference room off the lobby for a couple of hours in the morning".

Software engineering? You have people saying here - with a straight face - "Yeah, a 3-4 hour take home exam at every company you interview at is entirely reasonable" for the rest of your professional life.


I have a friend who is an accountant. Entry-level jobs there are meant to be done by someone with a high school diploma, and the bar for interviewing is literally "hey, can you do basic Excel?" As you get closer to staff level, the interviews become far more complex, nearing what you see in tech, because they're testing whether you _could_ pass the CPA if you had to. This kind of grilling can be skipped by simply having a CPA.

There's really some truth to the licensing thing. In some ways, I'd really like our field to adopt certifications so we can skip the BS of interviews: I've got the leetcode certification, now let's talk design or something relevant, please.


Doctors have licensing tests, also called boards. They take a few rounds over the course of med school and residency.


I think the difference is that it's really hard to tell how difficult SWE work is and whether or not someone's doing it (since the real work is all in the brain). So it's comparatively easy for a fraudster to skate on very little knowledge/ability for a long time. When this happens with doctors or pilots we call it a major motion picture. When this happens with SWEs we call it Tuesday.


Why aren't SWEs required to document their process of problem solving? Like: "I have problem X, I intend to solve it in this way, the first thing I did was google that shit, found this article, compared these libs, picked that one for these reasons," etc. Yeah, it can be painful and unusual the first time, but once it's mandatory it can become probably the best habit a professional can have.

This would help everyone, from the SWEs themselves (by tracking their problem-solving process) to their managers, colleagues, and everyone who works on it afterwards - and it's a ready draft for a blog article that'll say more about them than any CV.


It's not particularly easy to do, the parts that don't lead to a solution are boring, and people may make nitpick criticisms that aren't at all helpful.

I work at a place where design documents are supposed to include the alternatives that were considered and rejected, for much the same reason, and this does work to an extent, but it's not quite what you're suggesting.


At companies I've been at (mostly earlier phase startups, YMMV) there has always been an effort to do some sort of technical vetting.

Designers need to present designs / their portfolio.

Sales people need to do a demo.

Product people need to put together a mock roadmap or pitch a feature.

And so on.


> Designers need to present designs / their portfolio.

As someone married to a designer, this is soooo much easier than the hoops programmers have to go through.

Might take a little more work upfront (or just printing out work from previous jobs if allowed), but then you just flip through an existing portfolio the night before, and bring the same portfolio to every interview, no extra prep required.

Meanwhile, a programmer has to perform intense 1-8 hour tests every single time they apply anywhere, and make sure they remember the answers to gotcha questions in about 30 different subjects they could be asked about.

My wife always goes to way more interviews and talks to way more recruiters than I ever have (probably 5x more), because all she needs to do is read through her portfolio and practice some questions for 30 minutes the night before. And her interviews are usually just one or two hours long.

Meanwhile, I always have to spend weeks brushing up on Leetcode before making a big new job push, to make sure I don't hit too many surprises, and I avoid going on interviews because each one is a long grind that I usually have to take half a day off work for.

I still had to do the stupid technical tests for a mobile app job where I could tell them to go to the app store and download a game of mine, with my name on the title screen. They could play it, and they were really impressed with the game (the Xbox 360 version of it won a game design award in a contest hosted by Microsoft, and it looked and played identically).

Like... come on.


FWIW, I've often given interviewees one of 3 options:

1. Do an in-interview programming test. We try to make this as "real world" as possible with limited time, i.e. we give the candidate some existing code (which is similar but "slimmed down" compared to our actual code) and tell them to enhance it by adding some feature, then in some cases we had another piece of code with some bugs and asked them to fix them and write test cases.

2. Do a take home programming problem. I like to give people the option because some folks just do really poorly under the pressure of an in-interview test. When it's finished and they come back, we review it together and talk about their choices, etc.

3. If the programmer has lots of publicly reviewable code, I ask them to just share it with me so then I can review it and discuss it when they come in.

I basically just need to understand "Can this person write code?", and, related, "Can this person take a request in English and translate it to code quickly and efficiently?" And despite giving these choices, when I've posted a description of this on HN in the past I was still flooded by responses about how I shouldn't expect any of this: "I have a real life, I don't have time to do your take-home problems", or "I've been working in industry for years and coding for a job, all that code is proprietary and I can't show it."

All that may be well and good, but then my interview process is working - I don't want to hire you, and I can find people I do want to hire that are willing to do one of those 3 things, and it's not my job to make you understand that. Honestly, for all of the bitching about technical interviews, I feel a huge part of it is that:

1. People just can't accept that there are other people that are better than them that do do well on technical interviews and excel on the job.

2. Yes, there are outliers, and you might be one of them, but it's hard to craft an interview process around outliers. I also agree with Joel Spolsky's mindset of "It's better to pass on someone who might be OK, but you're not sure, than take the risk of a bad hire." I feel like every time I've made a bad hire there were definitely yellow flags during the interview that I tried to explain away, but I always ended up regretting the hire later and I've become more hardline on "if you can't prove your skills in the interview, I'm going to pass".


Have you considered that that's a function of startups and not any intrinsic necessity of those positions?


> I'd like to believe this is true, but it fails to explain why candidates for other business functions don't receive the same scrutiny.

I don't know where this idea comes from. Other roles at most companies I've worked for have had plenty of rigorous screenings, often including more reference checks, portfolio reviews, work samples, presentations, and other things.


Having worked a bunch of other jobs, SWE is an order of magnitude mentally harder than most other jobs. It's like being a translator, poet, detective, and puzzle solver all at once. And you have to do it all collaboratively with a team of other strong-willed, high IQ, low EQ teammates. With weekly deadline pressure. And management who thinks it's taking too long.

Of course, my cousin who is a lawyer at Cravath works like 3x the hours I do. She gets paid like 2.5x more, too. They just hire tons of people and let the job weed out the bad ones. Most engineering teams can't do that, because we're not trying to squeeze 100 hours a week of work out of our engineers.

Of course, plenty of teams do basic work. But plenty of teams with even basic sounding work have to handle an absolutely huge amount of complexity.


SE is different because those other professions generally aren't creating anything new. If SE had a program that just writes the code for you, then we wouldn't have to test people, just like an MBA can work off existing Excel sheets because what matters is the output of that application. Most new code and bug fixes require extremely detailed abstract knowledge that (so far) hasn't been commoditized into an application. The next few years may be a game changer for that, though.


I don't agree that those other professions aren't creative. If anything, the ambiguity behind what constitutes a successful brand design or convincing a client to buy your product seems to require more abstract knowledge to me (as a software engineer) than the ability to read and implement syntax.


the vast majority of people who work in offices just push papers, go to meetings and other mindless bs. the people who build brands are higher level managers/ivy-league over-achievers. sales people are hired or retained based on talent, getting a very low base salary and high commission. writing code is way more than reading and implementing syntax, it's actually making the design work or solving very tricky bugs. People with bare-minimum degrees and no demonstrable acumen are useless. take any given tech idea you want, it doesn't "just work", the devil is in the details.


> the vast majority of people who work in offices just push papers, go to meetings and other mindless bs

I know this to be true for many working in software engineering. Conversely,

> writing code is way more than reading and implementing syntax, it's actually making the design work or solving very tricky bugs.

This can be true but is not always true. I think you were right when you said

> take any given <sales|tech|branding|marketing|HR> idea you want, it doesn't "just work", the devil is in the details

:)


I don't know that it's higher, per se; it's more that being able to discuss concepts isn't enough. A programmer needs to be able to translate those concepts into actual algorithms and working code. I've interviewed people who could look at the coding problem we gave them and discuss it intelligently, but when it came to actually writing even pseudocode to solve it, they failed miserably.


That's true for other roles too, like an MBA grad who can discuss financial principles but can't navigate Quickbooks or use Excel.

From my admittedly limited understanding, many of those openings are filled based on resume and verbal interviews with little or no quantitative evaluation of skills.


Use Excel, yes. I'd expect an MBA grad to know the accounting principles Quickbooks is based on, and maybe to puzzle out how to use it, but not to be fluent in it to the degree I'd expect with Excel.


You said it yourself - it's a question of engineering versus business roles.

Software engineering doesn't necessarily have a higher bar than other comparable STEM.

And lest we forget many other roles have to pay their dues upfront at a much earlier stage: doctors have the MCAT, lawyers have to pass the bar, many accountants become CPAs, etc.


I've never heard of a civil engineer being asked to design a blueprint in Autodesk with a more senior engineer watching them, or an accountant asked to calculate a department's P&L given 90 minutes and a folder full of Excel files. It might happen, but I suspect it's uncommon.

You're right about exams, but that's a one time thing. New lawyers, doctors, and CPAs have to demonstrate textbook mastery to pass a handful of exams once in their career. Engineers are expected to demonstrate textbook mastery for every job they apply to _for their entire career_ (and often multiple times per application!)

It's also worth noting that engineers have standardized exams and certifications, like CompTIA or AWS Certs, but for whatever reason those credentials do not seem to carry much weight. I've never heard of those replacing technical evaluations, just used to enhance a resume.


And SWEs have to go to college or grad school. Yet they're eternally stuck in the low-level hell of solving coding questions.


Yes, you can weed out 50% of incompetent applicants, but that is not the issue. The problem is that the people who will excel in these questions are the ones playing the leetcode game for months. The people with real jobs will pass your question but will do so-so compared to the leetcode gamers, and the second group will get the job. Also, doing exceedingly well in the coding questions doesn't guarantee these people are any good at the real job.


Too bad there isn't a test for "fucks given." That would weed out about 80% of applicants. I can work with just about anyone who passes that test.


While I don't advocate for it, a long take-home problem filters for that.


It also filters for "people with children", "experts who realized they aren't show dogs", and "anyone who values their time".


What about a take-home test that takes 1 hour, with no leetcode/trivia? At a certain point, I feel like you have to pick your poison.


My last "1 hour" test took me hours to complete. I asked them to implement it in front of me in an hour and they couldn't.

Programmers are terrible at estimating, and they choose tasks that are obvious to them because it's the exact thing they do every day - but it might not be so easy for people outside their exact niche.


>a long take-home problem filters for that.

I disagree. Maybe, and only if it's paid, and paid well - say $150 an hour. Not many who are good will put up with that otherwise, because they don't have to.


The old saying is "pay peanuts, get monkeys".

Let me propose a variant of that: "you'll end up with monkeys if you require people to do monkey tricks".


Your take-home exam will not get many high-quality candidates. Most people who have options will not put up with this kind of request.


That's actually a good task, though - do something that at least partially resembles what you'll do on the job.

I think these folks are more annoyed by academic quizzes cribbed from '70s programming books that don't flex anything we're interested in and instead focus on things that are typically not very relevant to the job. Oddly, those quizzes seem to both prioritize new grads who are willing to shovel shit and reject experienced folks who don't have the time for said shit.


What you're doing sounds fine. We did something similar, and what we got back was either 1) the obviously correct solution, 2) trial-and-error soup, or 3) extremely complex over-engineered junk (we specifically told people not to do this, so double fail).

What most people object to is stuff that's just really time-consuming to do well. And/or stuff that gets rejected for silly reasons (typically requirements that weren't actually stated).

Or things like "please implement Conway's game of life in 30 minutes. START NOW".
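For context, the exercise named above is small but unforgiving under a clock. A minimal sketch (cells kept as a set of coordinates; this is one common approach, not anyone's official answer key):

```python
# Minimal Conway's Game of Life step: live cells are a set of (x, y) pairs.
from collections import Counter

def step(live):
    # Tally live-neighbour counts for every cell adjacent to a live cell.
    counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Next generation: exactly 3 live neighbours, or 2 and already alive.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A horizontal blinker flips to vertical and back every generation.
blinker = {(0, 0), (1, 0), (2, 0)}
print(step(blinker))  # the vertical blinker: cells (1,-1), (1,0), (1,1)
```

Doable in 30 minutes if you happen to know the set-of-cells trick; much harder cold, which is exactly the complaint.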



