Hacker News

> And I mean that literally: they're at a loss at how to write a function that adds two numbers or counts the number of elements in a list.

Seriously, where are you finding these candidates? Seriously.

I've worked at a number of mid-sized companies, and interviewed dozens of candidates, and I have never, ever, ever come across a candidate that couldn't write code on this level: "write a function that adds two numbers or counts the number of elements in a list".
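For reference, the bar being described is roughly this. A minimal sketch in Python (the thread doesn't name a language, and the function names here are my own):

```python
def add(a, b):
    """Return the sum of two numbers."""
    return a + b

def count_elements(items):
    """Count the elements in a list without using len()."""
    count = 0
    for _ in items:
        count += 1
    return count
```

That's the whole exercise: anyone who has written code professionally should produce something like this in a minute or two.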

I've come across them repeatedly and in some cases they have formed the majority of candidates that got to a phone screen. It's a result of our industry having many high paying jobs in relative physical comfort with no hard gatekeepers. Can you imagine what interviewing for a first year law associate's position at a BigLaw firm or a radiology residency would be like if there were no law and med schools and no board certifications?

Considering the rate of false positives in any software engineering interview process there is every incentive for the underqualified and unqualified to "fake it 'til they make it". It's also difficult to tell the difference between someone failing upwards and someone aggressively managing their career by switching positions with lateral raises in this hot job market, hence the need to distrust the skill of even senior developers and force this rigmarole on every level of engineering talent.

Crappy head hunters. I have some outside firms who have sent me good people very reliably, and then once in a while HR will make me try another company who probably gave us a cheap quote, and I'll get a series of terrible candidates from them. I've even been given outright frauds -- people who paid someone else to phone screen for them.

Cut-and-paste answers to initial screening questions were my favorite. How did I know they were cut and pasted? I got paranoid after some suspicious answers and started doing searches for random lines from them.

They included such brilliant things as an answer pasted from a forum thread where a dozen follow-up comments explained how wrong that answer was, and someone who answered what should have needed a short sentence with two pages from an Oracle manual that did not apply to the question.

It's not that we expected everyone to be honest about not using Google -- it didn't matter, it was an initial screening question. But we did expect them to at least bother to restate the answer in their own words if they looked it up. And to get it right.

I've used simple problems 'write a function to give me a maximum from a list' and 'provide a list of test scenarios to validate this function'.

This has a surprisingly high failure rate even in cases where we just email it to the candidate and discuss it in a phone interview the next day. I don't think it has anything to do with a person's ability to program; it likely derives from a person's ability to understand a work request.
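To make that exercise concrete, here is one plausible answer to both parts, sketched in Python (the function name and the scenario list are my own illustration, not the commenter's actual rubric):

```python
def list_max(items):
    """Return the maximum value in a non-empty list."""
    if not items:
        raise ValueError("cannot take the max of an empty list")
    best = items[0]
    for x in items[1:]:
        if x > best:
            best = x
    return best

# Test scenarios a candidate might propose:
# - a typical mixed list: [3, 1, 2] -> 3
# - a single-element list: [7] -> 7
# - duplicates of the maximum: [5, 5, 2] -> 5
# - all-negative values: [-3, -1, -2] -> -1
# - an empty list: raises ValueError
```

Most of the signal is in the second half: whether the candidate thinks to ask about empty lists, duplicates, and negative numbers at all.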

"I need a function to add two numbers together."

Innumeracy is the norm: I'd guess > 85% of people don't understand the concept of a function.

And they probably could code if they were working independently. Or they've done some classes, watched some videos, and think they understand it.

But when you add the pressure of an interview, your unpracticed skills fall apart. Also, you have to think on your feet to fill in the blanks in a question.

That's how it should be, because we're not hiring hobbyists; candidates need to be able to demonstrate that they're pros, and able to do so under the pressure of an interview.

I've done my share of phone screens with candidates who were flatly unqualified as developers. (Thankfully we've never had someone completely clueless land an in-person interview. That would also be a disservice to the candidate, as we should provide better guidance through the phone screen.)

Some of them are junior; possibly they lie on their resume and simply keep applying to job after job. That's the 99% that Joel wrote about[1].

You also occasionally get guys who were in management or similar roles and are looking to transition to being engineers. And I think these may have a similar problem to the senior engineers: they have lost the skill (or never had it) and are finding out the hard way.

[1]: https://www.joelonsoftware.com/2005/01/27/news-58/

I simply can't get behind the idea that having someone whiteboard an algorithm is analogous to either their programming skill or their ability to work within a company. I spend a minuscule fraction of my time writing algorithms in my daily practice, most of my time is spent integrating disparate technologies, data wrangling, and working across teams to get the information I need to make our product. Clearly there needs to be some vetting of someone's skill, but problem solving and troubleshooting are way more valued skills in my group than the ability to write a fibonacci sequence on the whiteboard. I have had far better success asking questions around diagnostic process and troubleshooting to find talented devs than I ever did using whiteboard like tests.

> I simply can't get behind the idea that having someone whiteboard an algorithm is analogous to either their programming skill or their ability to work within a company.

It's not an analog. It's the actual skill up close. They should be explaining their thoughts as they go, and you're asking why they do A instead of B.

> I spend a minuscule fraction of my time writing algorithms in my daily practice

A good problem isn't simply an algorithm, but also tests how they break a problem down, how they compose a solution, how they think through engineering tradeoffs, and how they communicate all this to you.

Consider the difference between an artist and an amateur painter. That the artist has practiced brush strokes is not surprising; anyone can practice painting a lot. What really matters is that the artist can take the image in their mind, compose it into a complete scene, and then express all of that through their medium of choice.

> but problem solving and troubleshooting are way more valued skills in my group

Is that a good thing? If your group wrote better code, wouldn't they have less troubleshooting to do?

Yes, that's a tautology, but I've worked on code that was kludges on top of kludges. And while kludges can be inevitable, if they persist, it indicates the person doesn't have the mastery to see a better way to express a problem. That's a skill deficit.

When I'm analyzing someone's ability to code, I'm presenting it as a problem to solve. We solve problems by restating them in such a way that the solution falls naturally from the question.

The candidates who can do this well will put together well structured, coherent code, and my team will spend more time delivering features and less time troubleshooting.

Same. I've never had someone come in for a face-to-face interview who literally couldn't code at all. I've worked at and interviewed loads of people, from small startups all the way up to a top 5 US tech giant. I still have never come across that case.

It's generally a lack of problem solving skills in my experience. The main coding question I ask is actually a short word problem (less than 4 sentences for the entire problem statement).

Without giving away the question I ask, I can tell you the solution is a for loop and an if statement. If I told them exactly what they needed to solve the problem, I'm sure most could write the code (though honestly some would still have failed). It's a question I would think could fit as one of five on an intro-to-programming class final, yet I've had candidates with 10+ years of experience fail it. I even had one such candidate argue with me over asking a coding question when his resume showed so much experience in different roles.

It happens if you don't do phone screens. People lie on their resumes.

This. I can't trust your résumé at all, but it does tell me what you think should be reasonable to ask.
