He invited me down for the technical interview. I came down a week later. His wife had just delivered a baby, so he was out. The guys in the room were a business-level ex-Marine who was there for God knows what reason, and a technical guy who was clearly pissed that he had to sink to the level of doing the interview in the first place.
For the first 45 minutes, I aced everything he threw at me, and I noticed that he wouldn't drill in on anything that I clearly knew very well. He kept jumping from topic to topic, and eventually asked me to take an extremely tricky SQL query and write the code for it in awk on the whiteboard. The job posting said nothing about awk, and I told him I didn't know awk that well. He then sensed I wasn't a command-line master (the job posting said nothing about needing to be a fucking sysadmin or know advanced command-line stuff) and hammered me on things I had already said I didn't know.
Two things became clear:
He wanted to show off to his boss and make himself look like a badass while simultaneously making me look incompetent.
There was no fucking way I was going to accept a job working here if it was offered.
I didn't so much as get a phone call or an email to thank me for driving 6 hours round trip. Nothing. Which screams out to me that this company sucked.
After I left, I also did not get so much as a phone call or e-mail: neither a thank-you nor a follow-up. When I attempted to re-establish contact, it was like shouting into a black hole.
The feedback I received on site from the interviewers was neutral to positive, with one interviewer claiming that I was the only applicant to come up with the correct response to their abstract brainteaser.
If I take two whole days off from my existing job to come to you and indulge you in your cute little skill tests intended to prove my bona fides, you ought to have the decency to follow up.
So you're not the only person to be on the receiving end of this unacceptable behavior from interviewing companies.
In my case, I was pissed, but mainly due to opportunity cost rather than the trip itself.
By the way, you should email their founder and let them know. I doubt he/she is aware of this, and would be equally furious to know that their workers are burning bridges like this, because that's exactly what they did.
In my case, the company did actually burn a bridge. My much larger company was looking for a strategic partner with high levels of data science expertise on sensor based data. This particular company came into consideration, and I made sure to remove them from selection. Hell hath no fury like a geek scorned.
Companies routinely pay for airfare and lodging; it would seem to make sense for them to pay for long car trips too, no?
See http://www.irs.gov/2014-Standard-Mileage-Rates-for-Business,... for more details.
But from what I know, running into such people is pretty common in interviews.
You have two cohorts which comprise a majority of your applicants:
1. Unemployed, X% qualified. Has time.
2. Employed, 100-X% qualified. Does not have time.
X is small. If both candidates are qualified, a test of X also reveals how qualified they are, so they can be compared.
His methods are objectively better, provided that both candidates have the same amount of free time. If you compare two unemployed engineers, you'll pick the right one. If you compare two employed engineers, you'll pick the right one.
If you compare an unemployed engineer vs an employed one, however, you'll probably pick the unemployed engineer, even if the employed one is better. Because he doesn't have as much time for your homework or to maintain a Github profile.
Thus, if you're comparing 30 engineers, half of whom are employed, you'll pick the most qualified unemployed candidate, rather than the objectively most qualified candidate.
That's why, even if the test is slightly worse, with an interview room only test you'll consistently pick the better candidate: a large, large segment of talented candidates are employed.
I would love a statistical analysis of Simpson's Paradox and technical interview methods if anyone's ever done one.
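The cohort effect above can be sketched with a toy simulation (all numbers invented for illustration): if you rank candidates by a take-home score that is capped by the free time they can spend polishing it, the strongest employed engineer loses to a weaker unemployed one.

```python
def take_home_score(skill, free_hours):
    # Toy model: polish on a take-home is bounded by the hours
    # a candidate can actually spend on it.
    return min(skill, free_hours * 10)

# (name, underlying skill 0-100, weekly free hours) -- invented numbers
candidates = [
    ("employed A",   90, 2),   # strongest engineer, almost no spare time
    ("employed B",   80, 2),
    ("unemployed C", 65, 10),  # weaker, but time to polish the take-home
    ("unemployed D", 55, 10),
]

by_test = sorted(candidates, key=lambda c: take_home_score(c[1], c[2]), reverse=True)
by_skill = sorted(candidates, key=lambda c: c[1], reverse=True)

print("take-home winner:", by_test[0][0])   # unemployed C
print("actual best:     ", by_skill[0][0])  # employed A
```

Under this (admittedly crude) model, the take-home consistently promotes the candidate with the most free time, not the most skill, which is exactly the aggregation effect the comment describes.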
I talk to the applicant and try to work something out. I interview weekends and evenings if that's what they want. First interviews are always on the phone. Eventually, they have to come in, but by then, we are pretty sure it isn't a waste of time.
Especially if their day job sucks, which is one reason people look for new jobs.
Not too long ago I was at a job where management demanded 11½-hour days. Guess how well the team members could handle an interview at 8pm.
That's fine for startup-ish hires, but in most of those cases, isn't the candidate being sought out/referred/etc? Almost all of my work is on non-public SCM systems, and frankly, my job doesn't leave a ton of time to post to *overflow sites, and when I go home, I'm home, and spend it with my hobbies and family.
I also consider myself a really great developer, especially in the more corporate .NET world. When I review candidates, I find that a simple coding test (at a PC, with ReSharper installed), with a prebuilt solution needing only the implementation of one method, is a great filter for competency. And after all, rough competency is the most I hope to get out of a coding test. The far more important part is always stuff like how well I think they'll fit into a team, what their general approach to solving problems is, etc.
Syntax memorization is no longer a measure of a good developer. All it shows is basic competence in a language.
Of course the flip side is, you can probably get by with minimal outside work. Do some code katas, have an up to date best practices simple app in github, etc. I think as long as you have something in github that looks good it will do the job, even if you're not working extra hours daily on open source projects.
Having worked under such terms, I've never felt I could participate in open source without jeopardising the project.
They don't want "You were doing exactly the same thing over there, now you'll be doing exactly the same thing over here", because they don't have the money in the budget for a real applicant.
However, it's true that coding exercises that can be completed in an interview are arbitrary and abstract (by necessity, since you have no context.) In my opinion one should use live coding exercises to test fluency -- can you code at the level you are claiming? Debugging something might be a better test. Then pair that up with a short (short!) take home exam to test problem-solving.
The one thing that really has to go is whiteboard coding. Ask the candidate to bring a laptop, or provide one for them.
OP also thinks that high-pressure situations don't occur in programming. But if you work on the web, sooner or later you will be debugging a tricky issue that's costing the company $X per hour while all your managers, up to the CEO, hover behind your desk. Being able to communicate with non-technical people and inspire their confidence is also very important, often more important than coding ability.
I have had my fair share of false positives. And I've probably been the false positive once or twice myself. This is something everyone can relate to, I suppose.
Sometimes you just need luck, like the gotcha question being something you debugged recently in a language you are not that competent in.
If only this were true. There is a whole class of people out there who are stellar face to face and not great for your specific technical/business situation.
But past a certain level, it's your job to explain your own weaknesses too. I just had a technical interview the other week and they relied way too much on definitions and whether I'd heard of this and that. I happen to do great at those but I want to be sure I can be productive (the position uses Elixir, which I'm totally unfamiliar with) so I'm suggesting they give me a take-home exam.
The problem is that problem solving with fancy algorithms can indeed be part of the job, but it is normally just a small fraction of it.
I've never seen dumb people doing incredibly well in technical interviews, that's true, but I've seen a lot of bad programmers giving great answers to those abstract problems. And sometimes their knowledge wasn't actually a fit.
A smart candidate will bring a notebook computer, loaded with suitable IDEs etc ready to go, without being asked to. Impresses people when you tell them to continue the interview while you write the requested code (multitasking, competent enough to spare nontrivial brain cycles answering unrelated questions).
Will some of those with opinions of this suggestion describe those opinions, so that I can better evaluate whether or not it's actually worth considering? Thanks!
However, I don't think bringing your notebook to show off some of your work is a bad idea.
Downvotes? Meh. Taking a notebook and writing the solution with sensible tools - while fielding unrelated questions from 4 interviewers - got me an offer.
Why the downvotes?
For my current job I was provided with the specifications for a simple app to display images from a flickr rss feed and asked to code however I felt was appropriate. It was interesting and fun, and vastly less stressful than any whiteboard test.
It was also a great indicator of what working at the company was actually going to be like.
This is how I bombed a DevOps role interview at Twilio, but was picked up as a VP of Engineering elsewhere. +1.
Agree with OP. Coding under pressure is a horrible way to measure good coders, because it's an artificial construct.
For people looking for alternatives to whiteboarding or phone screens where the candidate writes code in the blind, check out https://coderpad.io
It lets you write and execute code with your interviewee in real-time, and provides a much more native programming experience easily and over the browser. Some rather large companies have started using us exclusively in the in-person technical interview, even buying Chromebooks especially for the application.
Disclaimer: I am the guy who makes CoderPad and am obviously biased.
I use a whiteboard at work to pseudocode a solution, and the writing looks like chicken scratch, but even my co-workers get what I'm expressing to them. Then I open up a console and start typing out my idea to see if I was right. That's how I normally solve some throwaway piece of code. I don't go into a manager's office, call in some other random people, and explain my idea for a solution to them in detail.
I also got the feeling they never saw my resume because I was asked if I had a github profile, and it's on my resume.
The proper reaction to such a request would be: "So, you guys do all your work here on whiteboards? That seems unusual; I used laptops or workstations at all my previous jobs."
I've done this -- had people talk to me about their anxiety or problems with it, and put them at ease. For some, I have recommended that they just go practice for an hour or so with a friend -- it's really not that hard to code "whiteboard"-level code if you code every day and practice it a little. It's such a common tech interview style that, if you are looking for a job, it's worth working at it a little.
Frankly, in a code editor my expectations are much higher. On a whiteboard I don't even require you to get any framework class or method name right (or even perfect syntax), but the compiler will, and that gets in the way of the essence of the question, which is more about collaboration.
This is for a screen to make sure you are a programmer. If you can't code up a four line function without an editor, there are going to be a lot of jobs you might like that you won't get. Ditto with calling strangers stupid.
I don't give a flying fuck about minor syntax errors, forgotten API, or anything a compiler can catch. If you forget an API, I either give it to you or ask you to make up something reasonable. (Chances are, I can't remember it either). But if you can't describe an algorithm to me in front of a whiteboard, I don't want you on my team. And if you can't translate it to pseudocode that resembles the language you're going to be working in, I also don't want you on my team.
When I interview candidates, I look for a good foundation in what they will be working with, such as: do they know the difference between a class and a struct, and the implications of using each (day-1 kind of stuff)?
Then I ask them how they would go about solving a problem they were unfamiliar with. Google-fu is a skill that must be learned and honed. I don't care if you can spit out all sorts of acronyms; I want to know if you are capable of using common sense to solve a problem.
Last but not least: how do they stay up to date and relevant? I'm not looking for the 8-to-5 developers, but I'm not interested in those cutting-edge guys either.
Must have a good understanding of the basics, the principles, and skills in problem solving. Telling a block of code that you have a master's in CS will NOT make the bug go away.
Source: Been interviewing supposedly senior level candidates for the last three weeks.
It is not the technical interview that is the problem; it is the interviewer. I sit quite often on both sides of the table, and I have noticed that some interviewers are more eager to show off their skills than to learn about the candidate. Or they are auditioning bros to go clubbing with or to invite to barbecues. I have never cleared a tech interview when I was interviewed by guys in their twenties or early thirties. I have a 100% success rate when I am interviewed by people, of all races and genders, over 40.
When I am conducting interviews, I place a high degree of importance on the candidate's aptitude and approach to problem solving, and I usually build up a pretty good team with candidates whom the bros wouldn't hire.
Some people suggest looking at candidate's github repo; this would not work for enterprise software developers and most large companies have restrictions on what code a developer can claim as his/her own and publish.
edit : grammar
In interviewing, a diversity of experience is especially crucial. Further, being experienced at interviewing is super important, and that sort of experience is even harder to gain than other software experience.
The hard part about the job is
1. Getting the actual, real requirements nailed down
2. Designing the system to run in the real world, accounting for all those edge cases and failure modes.
Wherein the programmer plays the role of social worker. Seriously though, the skill sets are very similar. The client is typically the ultimate source of the requirements, but it's never a simple case of asking the client "what should I build?" More often than not, the client doesn't know what s/he wants, wants something that will actually hurt him/her, is actually seven different people who want opposite things, wants the roses simultaneously painted white and red, etc. Which is why I find requirements collection the most challenging part of the job.
If on the other hand you are saying, "should I be evaluating product specification and requirements gathering when hiring business analysts or product managers," then I would say of course! What else would we be evaluating them on?
Similarly, there are environments where static requirements are cornerstone for success. Certain methodologies deal with this environment better than others and certain developers will be more successful than others in these environments.
A.) Are great in their dayjobs and don't wish to spend their free time writing Open Source. Maybe they have families. Maybe they have non-coding hobbies.
B.) Work on private repos all the time.
Whiteboarding lets me prove raw problem solving IQ without being negatively judged for lack of twitter followers or technical blog posts.
Portfolios allow devs to prove raw problem solving IQ without being negatively judged for lack of documentation or a debugger.
There are developers who aren't good at whiteboarding interviews because they don't think that way and work better when they're at a laptop and can refer to other source code and documentation and can run their code continuously as they write it.
With those devs, they may not look impressive in a whiteboard interview, but you can look at their GitHub portfolio and draw conclusions about their programming ability.
An ideal interview should be holistic and include both off-the-cuff code and polished projects.
A good judge at a whiteboard won't mark you down for uncertainty about interfaces, which partially ameliorates this, for those cases where you have a good judge...
I'm also tempted to say that needing a debugger to lay out a mostly correct solution to a small problem is weird - can you elaborate on how that might be applicable?
In any event, I agree with your ultimate conclusion - holistic is probably a better way to go.
So in each case we come up with proxies for this measure. Having been on the interviewing side of the fence a lot, my strong intuition is that nearly everything people try to measure in developer acquisition strategies either doesn't correlate or negatively correlates with hiring good developers.
At this point, the only advice I give people about this is figure out a repeatable, measurable process for your organization and measure every single filter you use during hiring. Test the measures on just hired applicants, applicants a few months out, and applicants a few years out. It is time consuming and fraught with problems but there is no magic bullet out of this morass.
So, what shall we do? How can we very strongly incentivize every qualified person to upload their profile, in a way that lets us curate people's actual abilities?
The suggestions this article makes, to demand GitHub profiles or complete projects done for demonstration purposes, have both been rejected. Sure, it may be a good strong signal. But then so is the signal of someone creating a complete application for your company (complete with branding) to actually A/B test on the front page, and then, if it does well, the candidate showing up for a 3-month unpaid trial during which he or she must contribute as a full member of the team and can be dismissed at any time for no reason. I guarantee anyone who passes that test would be a great candidate. The issue is that there are perhaps three people on the planet who would even consider applying to a company on those terms, and they're the original founders' siblings.
The problem is that it is not a reasonable burden on job applicants, and neither is a complete github profile, and neither is a complete programming project. It's just too much.
What is needed is a smaller burden that is a good, strong signal.
We think we've found one, but are not sure. (You can see it on our web site as soon as we announce.)
In the mean time, if anyone here has any breakthrough ideas in this space, we would be very interested.
Even if you don't have the right answer, you at least found the right question.
My goal is to answer "Are you honest about what you know, excited and curious about what you don't, willing and able to learn, and driven to excel at the role that we're discussing?" Sometimes answering this requires code, but generally not a whole lot, and mostly I just use it to tease out where someone actually is vs. where they say they are. I don't want to hire a candidate who is best suited to make my problems go away tomorrow. I want, a year or three down the line, to say I built the best team possible.
Maybe I can say this because I don't think the "technical" part of what we're doing is the major challenge, though I think that most engineering teams are actually in the same boat as us. Google has far different problems than we do, so their hiring process is rightfully different than ours. And maybe I'm just getting it all wrong. We'll see what happens.
In my technical interviews, I look to talk through specific technical problems, and they're usually real-world bugs or design issues I've worked through in my job. There's not usually a lot of code involved, and the code isn't that tricky. It doesn't usually even rely on too much specific domain knowledge. It's about being able to take a broad technical question, formalize it a little into something concrete enough to be discussed, identifying constraints on the solution, working towards a solution, reasoning about performance, and discussing tradeoffs of this approach vs. others. In short, it's about technical reasoning. I don't care if we get to the right answer by the end if we've had a fruitful technical discussion.
Code samples are a good idea, but they're as deeply problematic as interviews. Most of the code from most of the best engineers I know is closed-source, not on github. 48-hour programming projects can tell you what someone can cobble together, but they don't say anything about the person's attention to longer-term concerns around design or code quality. Moreover, being able to code up a technical solution to a 48-hour problem is like basic literacy: I expect at least that, but I really want to find people who can make forward progress on the uncertainties associated with a 3-month project, at least.
If it sounds like they are currently not full-time employed as a programmer (and not programming a lot), I explain that they will probably not do well on technical tests (mine or anyone else's) if they don't practice. I recommend they just find some sample questions online and practice them -- let me know when they think they are ready.
I explain it's just like playing an instrument -- you wouldn't audition without practicing beforehand, right? I also explain that it's very hard to tell the difference between someone who is rusty and someone who is not skilled -- I want them to be at their best.
To everyone, currently programming or not, I explain what the interview will be and everything about it that I can except for the questions. I am hoping they will self-select out if they know they can't program (or talk to me about it).
I also explain that I am not looking for people who know all of the answers, but I am trying to calibrate their resume and see what it's like to collaborate. All tech screens are conducted in the language of the applicant's choice.
The meta-question I am asking them: if I tell you a bunch of requirements and some guidelines for success, will you do the work necessary to succeed?
That said, I would say GitHub contributions, maybe paired with StackOverflow activity, may be a good substitute for a technical interview when available.
Our team "technical interviews" by having a technology discussion with the applicant. One of the first lines of question is figuring out what they are most familiar with so we can discuss that particular thing, area, library, or whatever. If a person can't discuss what they are most familiar with in the high-pressure interview, I'm not sure they can discuss something they just learned about in a team design meeting either. It's also a great way for the candidate to figure out if he wants to work with us - something that is just as important as the reverse.
Quit making technical interviews a quiz show. Quit checking off boxes on your form. Quit with BAD technical interviews. But don't remove them entirely - that's just as dumb.
*ps: github as a metric is also a bad metric.
I thought most people came to the conclusion that this is a horrible metric by which to gauge competence?
Now us Python developers, Ruby developers, Java developers, PHP developers, and (God help you) Perl developers, let us put aside our meaningless and largely annoying quibbles to unite under a single banner. For once, let's say who cares whether Django or Rails is the superior framework and just join our voices to call for an end to the modern technical interview.
The final paragraph shows what demographic the OP belongs to, so it's hardly a surprise they think that way.
Once you're dealing with someone who has verifiable experience, then repo activity should be weighed much less (unless it's something really awesome, then chances are you're dealing with an awesome developer).
If you're critical of someone lacking github repos but they have experience, then I think it becomes a symptom of "everyone trying to stand out"-itis. It's probably best to have different solutions for different skill/experience levels.
1) Bad problems - implementing string reverse is a silly question. It has little to do with 'real world' optimization, which is often more about architectural issues: caching, etc.
2) Markerboards - I never, ever write code on a markerboard. Why ask me to write code on a markerboard in an interview? Markerboards are great for high-level concept organization; they are terrible for writing code. In the real world I often write a crap implementation of something and make it work, add tests for it, then refactor till it's not crap. Demonstrating that workflow on a markerboard is nearly impossible.
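On point 1, string reverse really is a few lines in most languages, which is why it tests memorized trivia rather than the architectural optimization work the job actually involves; a Python sketch:

```python
def reverse_string(s):
    # The idiomatic one-liner: slice with a negative step.
    return s[::-1]

def reverse_string_manual(s):
    # The in-place swap loop an interviewer typically wants to see.
    chars = list(s)
    i, j = 0, len(chars) - 1
    while i < j:
        chars[i], chars[j] = chars[j], chars[i]
        i += 1
        j -= 1
    return "".join(chars)

print(reverse_string("interview"))         # weivretni
print(reverse_string_manual("interview"))  # weivretni
```

Either version takes a practiced candidate under a minute, so the question mostly measures whether someone has seen it before.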
THIS. ALL OF THIS.
I recently had a technical interview where I was asked to write some Java code that printed out every 7th number. I could not, for the life of me, figure out how to do it. And I've been a software engineer for 25 years.
I realized later that it was because I was nervous about how I would present myself in this one chance, and that made all the rest of it difficult.
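For what it's worth, one plausible reading of that question (sketched here in Python rather than the Java that was asked for) is a couple of lines, which is exactly the point: nerves can make even this evaporate.

```python
def every_seventh(limit):
    # One plausible reading: every 7th positive integer up to a limit.
    return list(range(7, limit + 1, 7))

print(every_seventh(50))  # [7, 14, 21, 28, 35, 42, 49]
```

The question as relayed is ambiguous (every 7th number of what?), so clarifying it aloud would itself have been a reasonable interview answer.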
For my money, I'd expect that most interviews in any field are technical to some degree. If you're hiring an HR person at any level of seniority at all, presumably you'll be asking questions that bear on employment/tax law, dispute resolution and so forth - in short, questions that bear on a body of specialised knowledge: technical questions. The problem is execution.
Having never been asked to do code on a whiteboard, I can't comment on whether that's any use. (I think I'd be OK provided it was acceptable to drop into pseudocode: surely we're not testing for encyclopaedic knowledge of every method in X language's standard library.) 'Homework' challenges have the advantage, if they're executed right, that whoever you're hiring won't be going in completely cold to whatever tech you're using - may as well get some of that newbie-googling done in advance. (Just did one for a job using Mongo/Mongoid - what I knew about that beforehand would have fit on the back of a cigarette packet.)
My experience and impression is its generally the opposite.
A whiteboard is a good place for pseudocode, data flow diagrams, schemas, flowcharts, block diagrams, data structures, and bug-related examples. It's a pretty bad spot for source code other than one-line notes. I usually have scratch paper with scribbles all over it when I'm coding, so that seems fair. I don't think I've ever written anything in Clojure without a REPL open; it would be a weird experience to write something out entirely on a whiteboard without running all the little parts first. Also, the test-driven experience is very weird if you can't actually run tests. It's like testing someone's teamwork skills by having them work alone, very strange.
I feel a professional obligation to begin all projects with a bit of google to at least prove I'm not wasting effort on NIH and find some pitfalls, so the suggestions to test a dev without any net access also feels extremely weird to me.
I think the primary failing of tech interviews is that they are nothing more than a stress test, selecting for candidates who don't really care, or at least are unnaturally calm, and that doesn't seem to correlate very well with ability.
I haven't had many, but every whiteboard interview I've had has been miserable. At best I've just about managed to convey what I mean, at worst I don't even understand the question.
One particularly memorable exercise had the interviewer write out HTML on the board and ask me to write out the CSS to turn it into a dropdown menu. I didn't even know where to start, because I couldn't construct a mental model of what was going on where. Because outside of the white board interview I never have to.
A much better technical interview, one that I do not wish death upon, was one that put me in front of a computer rigged up to a projector the interviewer could see. We talked through my solution to an exercise as I constructed it, and I was free to switch between the browser and editor as I wanted. You know, like I might in a normal, actual working day.
An even better interview was one that left me in a room for 45 minutes with a task. After that time I sat down with the interviewer and talked through what I did.
In my experience, being bad at either of these interview types does not correlate with being a bad developer.
Any programming challenge other than the most basic questions is going to have a long-tail distribution in the time it takes a person to answer them. It's easy to get hung up on something stupid, especially in a high-pressure situation like an interview. We've all had moments where we missed something "obvious" for weeks on the job.
In college they solved the time-pressure problem in tough classes by making the homework really hard and the tests really easy (like the take-home project OP suggested), but that relies on an honor code and the fact that getting a good grade on your homework is not nearly as important as landing a job. Also a lot of great programmers don't want to deal with your bullshit project because they're getting recruited by a million other companies.
If some new metrics end up in widespread use, most of the people that pass the test are going to be ones that gamed the system (e.g., added a bunch of GitHub projects), just like the majority of people that get really hard coding puzzles have seen the trick to solving them before.
No matter what you'll end up with low true positive/negative rates for any test you do. I think the right way to deal with this is to come up with a bunch of orthogonal tests and then choose candidates that pass a bunch of them. Have them debug a program, have them do web programming, have them talk about a project they did, give them a project to do if you can, etc.
The main thing I learned after doing hundreds of interviews is that I have a limited view on how to judge a programmer, and that view translates only so well to the question of "how much value will this person add to our company right now."
So yeah, TLDR, I don't think the status quo is great but I don't think any of the solutions proposed over the years are better.
This shows how much of a load doing technical interviews already places on the existing workforce. And the methods the author is proposing would take even more time.
Given the fact that technical interviews have to be done by engineers, there is a tradeoff between the quality of the signal, and the time spent.
In my opinion, the biggest problem with technical interviews is that they are an art in themselves, and you can get better at them simply by solving more questions (which can be found easily online). The solution, I think, is to level the playing field by pointing interview candidates to the websites that archive interview questions.
When I'm hiring a contractor, I need to know that they really know their stuff and that they can hit the ground running if they join my team. So far, I have found a quick technical interview to be the best "bang for buck" way of doing this. People can have great CVs, active GitHub profiles, etc, but it's only when you speak to them that you realise they aren't what you need.
Phone and Skype interviews can be good, but more often than not I'd like a candidate to write up some code for me. Maybe there are some good remote pairing tools that I'm missing out on?
No one likes working on a problem with someone watching them constantly.
The effect is not limited to physical presence, it can also happen over a phone interview with screensharing. Consider all these thoughts that a candidate is probably thinking while trying to solve the problem:
What's this person thinking? Am I working too slowly? Should I type code faster?
Is this solution what he/she is looking for?
Should I talk more? I'm not sure what to say, ok stop panicking, just focus on the problem. Should I say I'm focusing on the problem?
The problem is not the whiteboard. The problem is that there is someone constantly looking at the whiteboard.
The solution is simple and I'm surprised that I've never seen a company try it.
Give the candidate a big whiteboard or a laptop with no network connection. Give a problem and leave the room.
Come back later; how much later depends on the size and scope of the problem.
Put the code on a projector and get other engineers in the room.
Now discuss the code. Run the code through some sample data. Make some criticisms and see how the candidate defends their decisions. It's like a long code review session.
Digressions are ok. Maybe talking about data persistence will lead to a discussion about caching. Maybe asking why the candidate did something will lead to talking about concurrency.
A bad candidate can't BS through this. They wrote the code, they have to either explain or defend it. And the way they talk about things also gives a chance to show their level of experience. The way a junior dev talks and a senior dev talks is very different.
The key thing is, let the candidate have some time alone to work. Stop pressuring them with having an engineer there constantly. The engineer doesn't like it because they would rather be working on something. The candidate doesn't like it because it's stressful. And there's no benefit to having someone there constantly. You don't need to hear their thought process in real time. You don't need to see the iterations they went through before the final solution. If you really want to know, you can find out all these things in the discussion session afterwards.
It's a bit of an art to come up with tasks that let you accurately rank candidates.
It shouldn't take more than 30-45 mins for a competent programmer. There are also some instructions in the mini-design that are intentionally 'the wrong way' to do the task. This is to encourage the candidate to get in touch.
I think this shows their coding style and problem solving, but also how they work with others. Junior devs tend to get in touch about the complexity of the problem; senior devs get in touch about the methods in the design.
This process hasn't yet let me down.
I sent the manager an email telling him that I couldn't write a K-D tree on my own, but that I would use the implementations I found online to solve the problem. He replied that what he was looking for was exactly someone who would rely on an existing solution instead of trying to code it all from scratch. So I passed his test and was called in for an in-person interview, which was just to see if I would fit in with the team. It all worked out great and I was hired. I didn't have to do anything with a whiteboard.
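In that spirit, here's a minimal sketch (my own illustration, not the code from the story) of leaning on existing tools instead of hand-rolling a K-D tree. A brute-force search with Python's built-in `math.dist` is plenty for small inputs; for large point sets you'd swap in an off-the-shelf structure such as `scipy.spatial.cKDTree` rather than write one from scratch:

```python
import math

def nearest(points, query):
    """Brute-force nearest neighbor; fine for small point sets.

    For large inputs, an existing K-D tree implementation
    (e.g. scipy.spatial.cKDTree) does the same job in O(log n)
    per query without hand-rolling the data structure.
    """
    return min(points, key=lambda p: math.dist(p, query))

pts = [(0, 0), (3, 4), (1, 1)]
print(nearest(pts, (2, 2)))  # (1, 1)
```

The point of the manager's test was exactly this judgment call: knowing when a library beats a from-scratch implementation.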
At the end of the day I care about whether or not you can solve a problem.
I really don't care if you solved it because you can use google or because you have a network of people to ask questions.
Technology changes too fast to expect people to always know it all. I'd rather have people that are resourceful and can dig their way out of a hole.
The hard part is coming up with what programming problem to ask a candidate that's fair to the candidate yet provides insight into their capability. For us, I created a simple data structure problem that hits a smattering of things that are taught in a typical undergrad CS curriculum (trees, recursion, etc). The candidate has about 30min to write the code for this in their preferred language using a plain text editor. I care less about perfect syntax when working with libraries, etc. and more about whether the candidate can take a problem statement and turn it into working code.
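The source doesn't include the actual problem, but a representative 30-minute exercise in that vein (trees plus recursion; this example is hypothetical, not the commenter's) might look like:

```python
class Node:
    """A minimal binary tree node."""
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def height(node):
    # Classic undergrad-CS recursion: the height of an empty
    # tree is 0; otherwise it's 1 plus the taller subtree.
    if node is None:
        return 0
    return 1 + max(height(node.left), height(node.right))

tree = Node(1, Node(2, Node(4)), Node(3))
print(height(tree))  # 3
```

Something this size fits the 30-minute window and shows whether the candidate can turn a problem statement into working code, which is the stated goal.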
For an on-site interview, I send the candidate a description (along with screenshots) of a small web app project that hits some UI, some DB, some business layer. I encourage them to innovate around my description and surprise me. They get a couple of days to implement, then show up with their laptop and demo it. I want to see working software, then a deep dive into their code.
Typically by the end of the on-site interview, we have a really good idea whether this person knows their trade or not. It's helped me sniff out the pretenders pretty well over the years. The ones I hire are the ones who obviously take pride in their craft, know why they picked various implementation approaches, and can explain the technology they used.
The result being that before the usual technical interview I dust off the old CS textbooks and try to hit all the high points.
I like the idea of a small program - especially a business-relevant one like you mention. But if you limit it to 30 minutes, that 30 minutes could easily be eaten up by environment setup... which can be a huge timesink.
Personally, I don't buy the complaint that asking a candidate to write some code is unfair although I've had a few tell me so. For onsite interviews, I have them spend a couple of nights ahead of time working on a small web app with a few moving parts (UX, DB, validations) to see how they approach writing larger, more real-world code. They demo the app, then show me the code. Some people take real pains with what they do, fully understand what they used and why, and can talk about how they'd improve it if they had more time. Unfortunately, many don't do that for one reason or another. But this tells me what I/we need to know before thinking about hiring someone as a developer.
When they say, oh, you can write in pseudocode or you don't have to "really" create the hashmap or array, don't fall for it.
ALWAYS WRITE THE CODE FOR REAL.
That way, when you hit a roadblock, you can run the program and usually figure out exactly what is wrong very quickly.
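As an illustration (mine, not the commenter's) of why real code beats pseudocode: a real dict instead of a hand-waved "hashmap" means the function actually runs, so the moment something looks wrong you can feed it sample input and see exactly what happens:

```python
def word_counts(text):
    # A real dict, not a sketched "hashmap": because this runs,
    # you can test it against sample data on the spot instead of
    # arguing about what the pseudocode would have done.
    counts = {}
    for word in text.lower().split():
        counts[word] = counts.get(word, 0) + 1
    return counts

print(word_counts("the cat and the hat"))
# {'the': 2, 'cat': 1, 'and': 1, 'hat': 1}
```

Five extra minutes of real syntax buys you an executable artifact to debug with, which is the whole argument above.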
I guess that's great for people that have a sweet Github profile...
Take a problem that I recently solved or am trying to solve and turn it into an interview question.
Email them a coding problem that's very easy to understand.
If you get into one of these types of interviews, you might want to think twice about the employer. It’s important to remember that interviews go both ways: I’m interviewing you as much as you are interviewing me. To be clear, when I’m an employer I’m worried if the candidate isn’t interviewing me. I worry they’re sold on the Kool-Aid and not looking for a good fit that will reward them. People like this never make it on my teams because we can’t have someone who shows up for the wrong reasons. They will just fail, and if we hire them it’s not fair to them or our team.
That said, I interviewed at Valve and they had teams of two go after me, asking me to code on a whiteboard, and at one point the team interviewed me on Perl. I’ve been coding in Perl since it was released on UUNET in 1987. The people interviewing me gave me a problem and asked me to code it on a whiteboard. I did, and then they kept saying “You have a syntax error” and I kept saying “No, I don’t.” After a couple minutes of this banter, one of them got upset and typed it into his laptop to prove me wrong. Boy, was he embarrassed. I’d been writing Perl for so long that I was certain I was right, and this gentleman was certain he knew Perl until he met me.
To me this event is proof that the “technical interview” is bogus. It’s a nice to have and certainly will help weed out some people who won’t make it but it also can cause this strange disconnect between skills and raw coding styles and/or understanding of the underlying tech.
To me the “technical” interview needs to be real life problems, potentially something your team is struggling with or perhaps has solved recently. You provide the details and see how the person approaches the problem. Do they ask smart questions, are they thinking about how they’d collaborate, how they’ll look at what others have done to approach similar problems in the past. Again, this is about Attitude and Aptitude not about syntax errors and semicolon style.
One of the best systems guys I ever hired was a pool man, literally cleaning pools when I met him. And one of the best sales people I’ve ever hired was an English teacher. I don’t care about your background, I’m not hiring you for that, I’m hiring you for your future on my team and how you’ll grow to be a major contributor as we solve crazy hard problems.
Past performance does not guarantee future results…
To me the only things that make a difference are Attitude and Aptitude; the rest we learn together…