During an on-site, I'm very much evaluating abstract and interpersonal skills. I'm evaluating how well they communicate, how well our personalities mesh when collaborating, and how well they solve our engineering problems. I'm not evaluating how well they can code; they've already been screened for that before the on-site, typically on a Coderpad in the comfort of their own home with whatever language, tools, and resources they prefer.
Popular or not -- objectively speaking, this reflects a rather skewed sense of priorities.
I can think of a huge list of make-or-break attributes for a potential working environment - for example:
- Do they value code as craft (like really, at all?)
- Does their business make sense -- both from the money-making perspective and as something I can see myself contributing 70 percent of my waking hours to for many years on end?
- Are they... nice? Jerks? Ego cases? Is it possible to have a normal, adult conversation about whether X is a good idea (without it turning into arm wrestling or "it is because Y said so"-type discussions)?
- Down to finer points like actual working conditions, tooling, and a minimal degree of sanity with respect to time off + vacations.
In this context -- "Are they well-versed in the finer points of whiteboarding?" doesn't even remotely figure into that list.
Asking someone to come up with a recursive pathing algorithm for your custom tree implementation, in C, on a whiteboard, when they are applying for a Ruby job? Maybe not so much...
I have absolutely used whiteboards on the job at every software job I've had. Collaboratively brainstorming a system design, working through examples, and sharing conceptual context all work better at a whiteboard than in pretty much any other setting I know. I get that it's good to cater to introverts, but I don't think that means it's unreasonable to expect someone to present ideas and collaborate while on the job.
It's fine to criticize whiteboard coding as unrealistic, but the author is saying that whiteboard-based interviews as a whole are "social ignorance and intellectual dick-waving". (Technically they just used the term "whiteboarding" without defining it, but I think if they meant "whiteboard coding", the article needs to be more specific about that.)
Does this person ask good questions? Do they respond to my hints well? Do they give up easily?
Why would you replace a whiteboard with a pen and paper? Many programmers have hardly touched a pen and paper in years. They live and die by their keyboards and monitors. Would you take a master carpenter's tools away and then judge them on their ability to build something using nothing but a spoon? Maybe we should give people a real world problem (not a trick problem), and let them solve it using their tools of choice?
"Design your interview question to be missing important details. This will give candidates the opportunity to discover the need for these details when they get to them and ask for them at that time. If the candidate never asks questions, great! You just identified a serious red flag you would have been hard pressed to find otherwise."
When someone is in an interview under pressure, they are often very nervous. They are being judged on how quickly they can solve a problem, and you (as the test giver) are the authority in the room. Focus tends to narrow under stress. This means many people will simply take your questions at face value. But it sounds like you are essentially penalizing them for not being cynical enough, and you are penalizing them for not understanding that your trick questions are trick questions. That doesn't really seem fair. It sounds like a good way to weed out timid candidates (and potentially lose out on a very talented engineer), though.
Conversational Relationships: I agree with this
Open-Ended Discovery: I agree with this
I would not put introversion on that list. Great engineers can be either extroverted or introverted. It is irrelevant to quality of coding.
On the flip side, more extroverted programmers probably do better when it comes to working on a team and project planning.
Ideally you're hiring a balanced team so they can cover each other's weaknesses.
Stop using the term "whiteboarding". Judging from the confusion in the discussion here, some hate it because it means writing code at a whiteboard (which is obviously a horrible idea), while some love it because they think of discussing conceptual issues while drawing arrows and waving arms (which is great).
Just say what you mean. Whiteboards aren't a problem if you use them right.
This is just another way to ask trick questions. You are purposely leaving out important information. I don't think this is a bad thing, but it is stressful to answer these questions.
One way to source these questions is to watch for small product tickets that require some algorithmic or systems-design complexity. Strip away all of the proprietary data and IP and see if you can build a problem around the real-world thing you experienced. If you can constrain it to a solution of 10-25 lines that takes less than an hour to write, you can feel good knowing you've got a real problem with a real solution to give to your candidates.
Also: let them use Stack Overflow/MDN/Google. We use them every day at work; why hold that back from your candidates?
This article nails it precisely. After interviewing dozens and dozens of devs (as a peer), the sweet spot is still demonstrating coding ability, but not on a whiteboard.
Load a repl (like repl.it) in one browser tab, and in another have MDN or whatever official docs.
Give a unique/novel challenge, but keep it simple, without any trick questions or brain puzzles. Be forgiving if the candidate overlooks something obvious.
Once we settled on this, we saw far fewer post-hire surprises.
Usually with whiteboard code interviews I focus on concepts and ability to break down and solve a problem, and don't focus on syntax or other things that I know they could figure out in 2 seconds from Googling. For example in C or C++:
* I won't care if they forget which #include is needed for a particular function, if they invent a hypothetical string class with the usual plausible methods, or if they write "v.append" instead of the correct "v.push_back". It is important, though, that they understand when and why memory leaks happen.
* It's reasonable to assume certain components exist in order to piece them together to create a higher-level task. If the task is to multiplex data from 3 sockets and de-multiplex them on the other side, for instance, it's reasonable to assume that a good socket library exists and assume some methods; I'm more interested in the multiplexing logic.
That said, I'd love more feedback about how to structure an interview in a better way to test software engineering ability. A lot of people say "whiteboard interviews are bad" -- and I agree -- but they don't really offer a good alternative. I'd love to hear what the good alternatives are.
I totally disagree with this. In my experience I can tell if someone is familiar with a topic by asking a few relatively simple questions. You can't just make up shit about a framework/language you've never used. There are practical things developers learn that someone will never know unless they have done it.
For example, if someone claimed they had used Entity Framework with both the code-first and the db-first approach, I could simply ask them how they protect data annotations from being overwritten when updating the model in the db-first approach. They can do so by using partial classes; anyone with a year of db-first experience would know that.
Edit: Actually, I see your point too. Testing actual coding skills won't be solved by my example. It will only test particular knowledge, not any ability to actually solve a problem. However, in my area of work it's mostly creating CRUD applications that solve some business problem. I can't remember the last time I wrote anything resembling an algorithm.
Whiteboarding is a practice born of what I can only assume is equal parts social ignorance and intellectual dick-waving.
Another helpful approach to gauging candidates' thought processes is to stage your interview problems so there is a series of asks and check-ins. Breaking large problems into Ask, Check-in, Review, Repeat allows you to keep each info block confined.
We do the thing in the 2nd paragraph. We use the thing in the 1st paragraph to do it. So there's something that doesn't quite add up for me, reading this.
If you are going to ask them to code, give them a computer with Internet access. Since we look stuff up all the time in real life, let them do that during the interview. Then talk to them about their answer. If they have a GitHub account, talk to them about what they have in there. Check it out, and have them walk you through what their code does.
Interviewing is hard, being interviewed is harder, try to make it as easy and productive as possible.
As someone with pretty strong social anxiety, I can nod my head at some of the points in this (except the pen and paper bit...trying to write out something with a pen is torturous), but at the same time I realize my quirks shouldn't define the process. I do wish it was more of a "choose your own adventure" process where you could select the path that will best allow you to demonstrate your values.
I like being asked to solve a problem by email. I bang out a solution, send it in, and respond to queries. That's a process I love, and it has worked great for consulting gigs. The companies that do multiple phone screens and then a show-and-tell in front of a group at a whiteboard, however, I just pass on; they are selecting for sales qualities. If you're hiring for technical sales or an evangelist that might be perfectly suitable, but it filters out a lot of people who might be perfect for the role. On the flip side, gearing a process purely toward the socially anxious makes the same mistake in the other direction.
Leaving information out, or railroading apparently open-ended questions down to a single thing (like in the SQL example), could easily frustrate and confuse someone, especially if they're not comfortable challenging an interviewer. It can leave them thinking "they asked me about this thing, but then left out a bunch of important details" or "they asked me about performance overall but then didn't care about anything I knew about performance outside of SQL."
The issue with whiteboarding (or pen and paper) is that programmers use editors or IDEs. We had good luck just moving interviews into a clean slate editor - candidates felt more at ease with more normal tooling.
Once they can describe the abstract process, then you can dig down conversationally to find out what part of the process they have the most experience in, and what they enjoy the most (joinery, design, sanding, finishing, etc.) That should provide some useful insight into whether they can actually build chairs.
Trivia? More like just see if they are solid on the basics, like what a pointer actually is, and what a struct actually is. Pointers are what just about every language with object references uses to implement them under the covers. Really, I don't see a need to filter any harder than that. Plenty of high GPA recent grads from top schools are pretty hazy on something as basic as that.
Not saying it's unimportant, but I haven't used a language that directly exposed pointers since I last coded anything serious in C, over 20 years ago.
About the same proportion of them as drivers who deal with engines with pistons and lubrication systems. I would want a professional driver who can apply some basic understanding of what he's driving, as opposed to one who thinks of the car as a bunch of magic. (Back in the Ford Model A days, some people used to hang garlic bulbs under the hood to "cure" their cars.)
I haven't used a language that directly exposed pointers since I last coded anything serious in C, over 20 years ago.
Then I hope you at least have a working knowledge of what an object reference is underneath and how it works. The problem with programmers who just think of an object reference as magic, is that they will lack the background to know when something like cache coherence might be an issue, or when they should worry about such a thing, and when they shouldn't bother. A programmer who doesn't have the foggiest idea won't even have a chance.
One interviewee tried to tell me that having a certain variable in nearly every object be null would result in zero added memory. That could be true in Python. It would be false in C++ or Java. (We were using C++ at the time, at the interviewee's option.)
What whiteboards should be testing for is problem solving, and whether a candidate can use a whiteboard to illustrate, communicate and collaborate towards solving a problem.
So if the question becomes whether or not problem solving should be evaluated, i think the answer is obviously yes, and algorithms have a place here. However, algorithms should not be the ONLY way to evaluate this attribute.
I'm not looking for 100% correct code. I'm just looking for evidence that they've actually designed/specified something in enough detail to start implementing, then actually implemented that something, beyond the level of cookbooking assignments from coursework. There is a big difference between those two levels of experience.
You don't even need to make something that completely runs. I just want to see evidence that given a brand new problem, you can understand enough to see what the several implementation problems would be, and I'd like to see if you can completely specify the solution to just one or two of them.
algorithms have a place here. However, algorithms should not be the ONLY way to evaluate this attribute.
If they can do everything with reasonable efficiency with just hashes and vectors, that's fine. What I want to see is if they've actually built something, and effectively dealt with all of the details that come up, or whether they're blind to the concrete details and just want to handwave them.
First - the interview should probably be a lot like how work is done by the employer. The details of the requirements probably are murky, but they'll be clarified later. The deadline may or may not be arbitrary. The goals are different - you don't need a useful, profitable end product. You need a demonstration of the ability to make such a thing (within the limitations imposed by the employer.)
Second - to be kind to your fellow human taking the interview, you shouldn't place them under undue stress. I enjoy David Rock's work on the brain and the chemical processes in our bodies. (I won't attempt to speak on them nearly as eloquently as the original author.) In short, there are five major things that affect our brains in a very similar way to immediate physical danger; the S.C.A.R.F. acronym, for Status, Certainty, Autonomy, Relatedness and Fairness. In an ideal workplace, and an ideal interview, you wouldn't be threatening those things more than you need to.
Status is already threatened in a way - the interviewee likely desires the job to improve their status in life. But you shouldn't be forcing inferiority down their throat.
Certainty - well this is almost in direct opposition to the reality of doing work in many workplaces, but it shouldn't be. In this case, what you're asking the interviewee to do and to demonstrate should be laid out in clear, understandable terms. You should try to remove some of the barriers to getting them to focus on the core abilities you need them to demonstrate. Of course, understanding communication is a valid skill to test, and you might hope to simulate your real workplace environment by being vague and using insider terminology, but you also have to remember that they have 0 days on the job in your particular place of employ. So they have yet to learn shared terminology. Knowing esoteric insider information shouldn't be tested!
Autonomy - while the above means you should be giving hints as to what you want them to do, and perhaps some suggestion of methodology, they should still feel like they are in control of their choices. You're testing for an ability to reach a conclusion, and you will observe the methods used to get there, but you shouldn't remove all sense of self-direction.
Relatedness - this is controversial, because we've always relied on "people like us" in interviews, only too well. You should want to connect interviewee and interviewer on some shared values and interests, but de-emphasize any focus on likeness determined by birth lottery. That is, they should feel like they can fit in, regardless of superficial differences!
Fairness - this just makes me think about the potential pitfalls. Openly mentioning "better" candidates, your buddy, your niece that's applying, doing a group interview and showing favoritism. Maybe someone else can think about this one and offer up suggestions.
What do you think about applying the above mental framework to how you design and conduct interviews?