What do the best interviewers have in common? (interviewing.io)
237 points by leeny on Nov 30, 2017 | 142 comments

I feel this is somewhat industry dependent, so take this with a grain of salt (finance answer).

They cultivate a friendly banter type atmosphere. As to the actual test part:

The best ones ask questions that require a judgment call.

Bad interviewers try to ask difficult or complex questions (stupid monkey puzzles / Google-style "trick" questions).

The good ones assume you're technically competent and throw you questions where the question itself is crystal clear but the scenario is ambiguous and open to interpretation. I don't need people who can do monkey puzzles; I need people who can respond dynamically in the face of uncertainty.

e.g. "How would you go about detecting fraud in a scenario like XYZ?"

That has a million possible answers... but depending on which route you choose, it'll become painfully obvious to the interviewer whether you know what you're talking about. It doesn't even have to be the right one... the language and logic alone will betray you if you don't know what you're talking about.

I agree completely, and it applies equally to software engineering. It's how I interview. Now, if it becomes clear that there are gaps in their knowledge, I will spend some time on those areas so I (or the relevant hiring manager) can decide whether the trade-offs are something I (or they) can live with.

For engineers, that can sometimes mean an algorithm problem--but something they haven't likely seen before. The key here being that I don't want to see someone regurgitate a solution, but to understand how they think. If they can't solve a problem fully but can come up with a list of 10 extremely pertinent questions I intentionally failed to mention, that is good information to have.

It looks like their metric is: a "good" interviewer means the interviewee enjoyed the interview and would want to move forward if asked.

Likely a very useful metric for figuring out if some interviewers are turning people off about the company.

I (and probably everyone else) would really like to see if this at all correlates with future performance metrics.

> I (and probably everyone else) would really like to see if this at all correlates with future performance metrics.

I don't see how it could fail to correlate. If your manager likes you, s/he will give you good reviews.

That's one oddly insightful little comment. It would be interesting to see the relationship between perceived performance and actual performance (by some impartial metric) in teams/management made up of people that get along well versus teams that don't particularly get along, but also don't actively dislike each other.

The trends in 'emotional intelligence' are largely based upon one psychology paper that did not reasonably show what many think it did. The author broke people into groups and found that average aptitude test scores did not work as a predictor for tasks such as 'planning a shopping trip as a team' whereas the average scores of another test, "seeing in the mind's eye", did.

The conclusion was therefore that a team's performance is not determined by the aptitude of its individual members, but by their 'emotional intelligence'. That is quite absurd, since to draw that conclusion you'd need to take the people who are individually best at 'planning a shopping trip', put them in a team, and compare their results to another team that scored well on the "seeing in the mind's eye" test but individually not as well on 'planning a shopping trip'. I think the author did not do this because she was well aware of what the result would be, and it wouldn't be publishable.

The reason for this tangent is that 'emotional intelligence' is now being used as a cornerstone for many things, and this article/company is yet another group feeding off of it. Yet wouldn't it be quite remarkable if having groups that get along 'too well' turned out to be a contraindicator for performance? Most people find it difficult to be objective about those they're fond of. Create teams/management systems full of people fond of each other and everybody's going to say everybody's incredible and doing incredible things -- regardless of whether or not that's true.

Can you please share a link to the paper (if available)?


It's paywalled. There's an informative article on Wiki about scientific paywalls [1].

[1] - https://en.wikipedia.org/wiki/Sci-Hub

> If your manager likes you, s/he will give you good reviews.

That is interesting, because I do my best to work with my manager to define success criteria that are independent of how either one of us feels. The goal is that the criteria still hold when one or both of us has a lapse in "liking the other", which can happen for a number of unrelated reasons: depression, stress, illness, pressure, etc. It's hard, especially since I am often the only one that has ever asked my manager for that kind of understanding.

Oh, it's only about job interviews. Sigh. Maybe that should have been obvious to me from the title or URL? I'm fascinated by interviews, but job interviews, not so much. Except this, from my favourite book by a mile about philosophy of art:

"Two men meet; one is the applicant for a position, while the other has the disposition of the matter in his hands. The interview may be mechanical, consisting of set questions, the replies to which perfunctorily settle the matter. There is no experience in which the two men meet, nothing that is not a repetition, by way of acceptance or dismissal, of something which has happened a score of times. The situation is disposed of as if it were an exercise in bookkeeping. But an interplay may take place in which a new experience develops. Where should we look for an account of such an experience? Not to ledger-entries nor yet to a treatise on economics or sociology or personnel-psychology, but to drama or fiction. Its nature and import can be expressed only by art, because there is a unity of experience that can be expressed only as an experience. The experience is of material fraught with suspense and moving toward its own consummation through a connected series of varied incidents. The primary emotions on the part of the applicant may be at the beginning hope or despair, and elation or disappointment at the close. These emotions qualify the experience as a unity. But as the interview proceeds, secondary emotions are evolved as variations of the primary underlying one. It is even possible for each attitude and gesture, each sentence, almost every word, to produce more than a fluctuation in the intensity of the basic emotion; to produce, that is, a change of shade and tint in its quality. The employer sees by means of his own emotional reactions the character of the one applying. He projects him imaginatively into the work to be done and judges his fitness by the way in which the elements of the scene assemble and either clash or fit together. The presence and behavior of the applicant either harmonize with his own attitudes and desires or they conflict and jar. 
Such factors as these, inherently aesthetic in quality, are the forces that carry the varied elements of the interview to a decisive issue. They enter into the settlement of every situation, whatever its dominant nature, in which there are uncertainty and suspense." - John Dewey, Art as Experience

Interesting how many of us (myself included) assume "job interview" when we read "interview". Especially since German is my native language, where the word "interview" exists but is not used for job interviews.

Now, this is Hacker News, and chances are "interviews" refers to "job interviews", but that's not necessarily the case. The title doesn't indicate it, so why did I assume job interview?

I'm not sure I've ever gone through an interview process where it was just me and one interviewer. Well, there was one time, but that was an exceptionally weird one. Frequently I meet with the department head and someone from HR. Or the whole team. Or a bunch of department heads. Also, the last way I would ever describe an interview is as a "unity of experience". Every person has a different perspective (Rashomon comes to mind - maybe someone should make a movie like that about an interview) and while the interviewers may reconcile theirs among each other to some extent, the interviewee frequently does not see things their way at all.

He probably means "unity of experience" in the sense that, one's experience is a unity. Look at the whole sentence:

>Its nature and import can be expressed only by art, because there is a unity of experience that can be expressed only as an experience.

This sounds like more of a phenomenological point, that there's a richness and a completeness to experience that isn't captured in data or theory, and that only something with comparable richness can begin to approach it.

Ah sorry, it was a bit random of me pasting that quote. His prose is fairly repellent. To explain a little, he uses "an experience" in a particular way in that book, not in the usual sense.

The book was a revelation to me, having read a lot of books about analytic philosophy of music and art, each analysing their own little corner of art/music, me feeling like I understood less after I finished each one. Then suddenly came this book, relating everything to life - he talks about poking a fire, firefighters, a job interview etc and how all the stuff going on in these are the building blocks of art. A section of the book is "The Live Creature" - it's remarkable how many aesthetics books neglect the human/life/planet context of art. "An experience" means something like an aesthetic whole, like the experience of reading a story, watching a movie, hearing a piece of music etc. The chapter's called I believe "Having an Experience". Anyway..after that I stopped feeling the need to read books of philosophy of art/music, as all my questions had been answered. (I'm a musician + artist)

No, I don't see it as a unity - rather as I leave an interview, there is a superposition of possibilities.

Edit: And if I don't get called when I expected to, my experience resolves into something different, perhaps, opposed to how I felt coming out of the interview.

this is hackernews, not cooljournalistnews

"Don't be snarky."

If people didn't routinely fail simple "can you think and write it in code" problems like fizz buzz I wouldn't ask them. But they do so I do.
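For readers who haven't seen it, FizzBuzz is small enough to show in full. A minimal sketch (the function name and returning a list of strings are my choices, not something the comment specifies):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as strings."""
    out = []
    for i in range(1, n + 1):
        if i % 15 == 0:            # divisible by both 3 and 5
            out.append("FizzBuzz")
        elif i % 3 == 0:
            out.append("Fizz")
        elif i % 5 == 0:
            out.append("Buzz")
        else:
            out.append(str(i))
    return out
```

The usual stumble is check ordering: testing `i % 3` before `i % 15` makes "FizzBuzz" unreachable.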

Jon Bentley observed (in Programming Pearls) that the majority of working programmers can't even write a correct binary search. I tried using it as an interview question and gave up because nobody got it right.

The majority of working programmers rarely have to write a correct binary search from scratch. Why should you ask it? A better question would be to ask when it's appropriate to use the algorithm, and possibly provide a set of data and a scenario (i.e. context) that the candidate might encounter in which using the algorithm is an appropriate solution but not necessarily obviously so, then see if his process of thinking about the problem leads him there.

The issue is not being able to write binary search per se; it's whether, given a set of constraints, a very well defined set of steps, and ample time, the candidate can write out in code what's in their head (a pretty straightforward algorithm).

I expect candidates to write in code what they can explain verbally, given that the code is do-able in the confines of an interview.

That said, I also get what you're trying to say: Capably implementing Binary Search may not be the best indicator of skill, since Software Engineering is so much more than just converting thought to code.

Right. There are two ways you get to implementing binary search in an interview: (1) figuring it out from some basics and possibly hints, or (2) reciting it from memory. 1 is useful, 2 is useless. Having used interviewing.io, it facilitates #1 much more than #2, as a good process should. Interviewer ratings help with that.

It's not a complicated algorithm. It's one that people might even use in their every day life (the example I used to use is finding a name in a phone book, I guess that's pretty dated now). I still see candidates who seem to have a conceptual understanding of it fail to translate it into code. That's why I ask it, it's a very simple "here's a thing that intuitively makes sense, now formalize it" question. Maybe this question rules out people who can only copy existing patterns and change values within them, but that's fine as far as I'm concerned. I don't want to hire engineers who can only mimic.
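For concreteness, the kind of formalization being asked for: an iterative sketch over a sorted list, returning -1 when the target is absent (these conventions are mine, not the commenter's):

```python
def binary_search(items, target):
    """Index of target in the sorted sequence items, or -1 if absent."""
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2      # safe in Python, where ints don't overflow
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            lo = mid + 1          # discard the left half, including mid
        else:
            hi = mid - 1          # discard the right half, including mid
    return -1
```

The "intuitively makes sense, now formalize it" step is exactly the loop invariant: the target, if present, always lies within `items[lo..hi]`.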

> I don't want to hire engineers who can only mimic.

How is a binary search not mimicry? Probably only 1 in a million people could figure out how to do it if they didn't already know how it is done elsewhere. The only reason most people know how binary search works is because they were taught it somewhere along the line.

Ask an interviewee how to Voronoi diagram the surface of a sphere or something. Then you'll see how things look when people don't do mimicry. You don't get original ideas by asking first-semester programming puzzles.

It's actually just how you would find a number in a paper phone book.

No it's not. Some parts of a phone book search resemble a binary search, but choosing your pivot and processing the last few pages do not. I've never seen someone not do a linear scan at some point.

But all that's irrelevant to my point. Asking for a bog-standard solution to an industry-standard question is in no way a demonstration of someone's originality. You don't ask it to prove that they aren't doing "mimicry" you ask it to prove that they can, and thus are capable of learning the basic tools of their trade from others.

I'm pretty sure I independently invented binary search when I was a child (as have many others).

It's a pretty darn obvious thing to do when you're physically handling an ordered collection of records.

It's the recursion (or the parameters to the recursive calls) that kills most candidates. It's pretty intuitive in the head.

But I agree, plain vanilla binary search is easy.

People are free to implement an iterative version.


> Probably only 1 in a million people could figure out how to do it if they didn't already know how it is done elsewhere.

They already know how it is done elsewhere; it's how one finds a word in a (paper) dictionary or a book index.

Does every language include a generic binary search method? Where?

Or, similarly: if I asked you to find a specific value in a list of Comparable objects in my language of choice, how would you do it?

Read the pseudocode definition of the algorithm and implement it in your language of choice; alternatively, port it from one language to another.

In which case I'm curious why it's an unreasonable interview question.

The argument is always "use the stdlib one", but if there isn't one, and you need to implement it, why isn't it fair game?

It's fair game, but the best shot at implementing the algorithm correctly is to look up its definition in a reputable algorithms book and implement directly from that. Doing so also gives the developer additional knowledge and context about the algorithm's use and performance characteristics. My 2 cents.

Replace "algo book" with "person asking you the question", and I see relatively little difference.

One difference is that this penalizes people with social anxiety.

Don't all in-person interviews do this?

Some more than others. It can be mitigated, and probably should be for certain roles.

I guess it depends on what you're interviewing for. If you want a junior to implement specs he's given your point might apply. It's not an appropriate question for someone with basically any real experience, though. Especially not someone who you expect to do anything more complex, like engineering.

So now I ask this:

Given that Google probably does more software interviews than you do, why do you think they ask such questions, even of experienced candidates?

Rarely have to? For a professional programmer, writing their own binary search for actual use at work is nearly a firing offense (with a very few exceptions).

Use the library. Don't write your own.

(Yes, I exaggerate. Train such programmers, don't fire them... the first time. But they're wasting your time, and probably introducing bugs in the process. It's deeply unprofessional.)

> Rarely have to? For a professional programmer, writing their own binary search for actual use at work is nearly a firing offense (with a very few exceptions).

You really don't sound like you do much coding. I've lost count of the number of times I've had to write my own binary search simply because the existing ones were painfully inadequate.

Pretty much every standard library implementation, for example, expects the data to be in an array in memory. There's no provision for the data to be dynamically obtained (e.g. from disk or generated on the fly) or in any other form in memory (e.g. unsafe/native int pointers in C#). And have fun running your standard library's binary search, whether Python or C++ or Java or whatever, on something more abstract like the numeric interval [0.5, 1.0]. There's just no way to specify alternate termination conditions like tolerances.

Oh, and this is completely ignoring more mundane shortcomings in a significant fraction of implementations, like how in C# there is (or at least was, last I checked) no way to directly obtain both the lower and upper bounds of an equal range via binary search. At least C++ has equal_range!

I could say the same for practically any classical CS 101 algorithm like depth-first search or Dijkstra or whatever you want, too, except those don't even exist in most standard libraries in the first place... and I suspect their lack of existence is not unrelated to their likely practical inadequacy.
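The [0.5, 1.0] case above amounts to bisection over a monotone predicate with a tolerance, which is exactly the shape that array-bound library routines don't expose. A sketch (the names and the example predicate are illustrative, not from the comment):

```python
def bisect_interval(pred, lo, hi, tol=1e-9):
    """Boundary x in [lo, hi] where a monotone predicate flips from
    False to True, found to within tol. Assumes pred(lo) is False
    and pred(hi) is True."""
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if pred(mid):
            hi = mid    # boundary is at or left of mid
        else:
            lo = mid    # boundary is right of mid
    return hi

# e.g. locate sqrt(0.5) inside [0.5, 1.0]:
root = bisect_interval(lambda x: x * x >= 0.5, 0.5, 1.0)
```

Same skeleton as an array binary search, but the "index" is a real number and the termination condition is a tolerance rather than an exhausted range.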

When I had to implement a binary search, it wasn't exactly the version you'd find in a textbook or library. I'm not sure I recall the details now, but I think it needed to return the two closest values if the search key wasn't present. And maybe there was something special about handling duplicate keys, but you get the idea. Sure, it's always good not to re-implement the wheel, but as a professional there are a million mundane reasons you get stuck doing it anyway.
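The "two closest values" variant described here is a small twist on the standard library rather than a from-scratch rewrite. A sketch using Python's bisect module (the exact semantics for misses and present keys are my guess at what was needed):

```python
import bisect

def closest_pair(sorted_vals, key):
    """(predecessor, successor) of key in sorted_vals; None at the ends.
    A key that is present is returned as both neighbors."""
    i = bisect.bisect_left(sorted_vals, key)
    if i < len(sorted_vals) and sorted_vals[i] == key:
        return sorted_vals[i], sorted_vals[i]
    lo = sorted_vals[i - 1] if i > 0 else None
    hi = sorted_vals[i] if i < len(sorted_vals) else None
    return lo, hi
```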

Ex-Amazon here. We started asking algorithm optimization questions after realizing how important they are.

You don't get to tweak binary searches every day, but if you optimize an algorithm to speed up your product by 1% just once a month, and your teammates do the same, the difference becomes huge year after year.

Either I've only been in mediocre-to-average roles (for any number of reasons, including my own inability), or you're in a very small minority unaware of this fact. I'd like to believe the latter, for peace of mind, but it's better to believe the former for driving personal growth. Hmm.

I also forgot to mention another case: where the needle you're searching for is of a different type than the items in your haystack. If you're searching for an integer in an array of strings that you know are integers... good luck convincing the type system to let you even call the binary search method, let alone figuring out which argument to the comparator is what type (if they can even be of different types).

> You really don't sound like you do much coding.

Perhaps not. I've been a professional software engineer for 32 years, though...

> Use the library. Don't write your own.

Things are not so simple.

Dependencies on 3rd-party libs also imply maintenance costs (generally related to integrating the lib into your build system). They are moderate; however, they don't go away with time.

Binary search is a small algorithm, which is easy to understand, and whose correctness is easy to check. It's not a good example of a "good" dependency on a 3rd party lib.

> correctness is easy to check

This binary search bug took two decades to be discovered: https://research.googleblog.com/2006/06/extra-extra-read-all...
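The bug in that post is the midpoint computation `mid = (low + high) / 2`, which wraps around in fixed-width integer languages once `low + high` exceeds the type's maximum. Python's integers are arbitrary-precision, so the sketch below emulates 32-bit signed arithmetic to reproduce it, alongside the standard fix:

```python
def int32(x):
    """Wrap x to a signed 32-bit integer, as a Java/C int would."""
    x &= 0xFFFFFFFF
    return x - 0x100000000 if x >= 0x80000000 else x

lo, hi = 2**30, 2**31 - 2        # plausible bounds into a huge array
buggy_mid = int32(lo + hi) // 2  # lo + hi overflows int32: goes negative
safe_mid = lo + (hi - lo) // 2   # the fix: the difference can't overflow
```

`buggy_mid` comes out negative, so the subsequent array access throws in Java (or is undefined behavior in C); `safe_mid` stays in bounds.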

Seems like a good a place as any to note that I actually discovered this bug in high school programming class circa 1999. If only blogging was popular back then I could have been internet famous! (Surely this bug has been rediscovered many times, just predates the self-aggrandizing blog-post-for-everything era)

I agree with your sentiment, but you're being overdramatic here.

If 100% of applicants are failing a particular question at interview, either the question is inappropriate or the recruitment process has attracted inappropriate applicants.

Either way, this is a failing of the business, not the applicants.

The recruitment process must be sensitive to the needs of the business. These are never entirely bound up in the response given to a single technical question. A good interviewer draws the best from the interviewee, and understands that technical excellence may be necessary but is never sufficient for employees.

> Either way, this is a failing of the business, not the applicants.

I've worked at places that couldn't get anyone who passed FizzBuzz-level tests and out of desperation hired people anyway. I've not seen people fail it but turn out to be decent (or even half-decent) coders on real-world projects either; at best they can slap something semi-functional together.

It's a failing of the business in that they aren't getting the caliber of applicants they want, but it's also a failure of the candidates when they can't program at all. It's a failure of the industry that those candidates manage to be employed at all.

All of this is true of FizzBuzz, but at the same time, I've seen official interview question sheets with questions that had wrong answers, questions that were too ambitious for 45 minutes, and others that were way too open-ended and asked without any hints, making them completely useless.

Those questions aren't successfully answered because the interviewer failed.

Perhaps you shouldn't be expecting candidates to whiteboard a perfect algorithm library function in 1 hr?

"Getting it (exactly) right" isn't the goal of a good interview, unless you are in the business of launching software in 1hr sprints.

Most standard library implementers couldn't write a correct binary search.


> Most standard library implementers couldn't write a correct binary search.

I think that issue with binary search is not really relevant to interviews. Having a bug with integer overflow is not what an algorithmic question is trying to test.

If you want to test knowledge of undefined behavior show them code with a bug in it and ask them to find it.

Well, if the intent of the interview is just to test algorithms, maybe you don't want to test null pointer checks and off-by-1 errors too?

So much time is wasted in trying to accommodate off-by-one errors (which are a sin in production...just like integer overflows)

> So much time is wasted in trying to accommodate off-by-one errors (which are a sin in production...just like integer overflows)

Off-by-one is (generally) a much bigger issue. Take, for example, a buggy binary search that runs forever on even-length arrays because it gets stuck on the length-2 case. That's a much more likely bug to hit (in terms of runtime) than integer overflow.

Though I generally agree that it isn't too big a deal. I would just mention that there was a bug and only if they couldn't find it would it possibly be an issue. I wouldn't penalize anyone for missing null-pointer checks unless given a valid input (non-null pointers) their algorithm failed because it hit a null pointer.
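The even-length hang described above usually comes from the wrong half-update in the half-open "lower bound" form of the loop. A sketch of the convergent version, with the fatal variant noted in comments (the function name is mine):

```python
def lowest_ge(a, x):
    """Smallest index i with a[i] >= x, or len(a) if no such element."""
    lo, hi = 0, len(a)
    while lo < hi:
        mid = (lo + hi) // 2   # mid < hi, so 'hi = mid' always shrinks
        if a[mid] < x:
            lo = mid + 1       # writing 'lo = mid' here loops forever
        else:                  #   once hi - lo == 1 (e.g. 2-element input)
            hi = mid
    return lo
```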

I think your measure of what is OK and what isn't is just based on what you are comfortable with. That doesn't mean every other programmer is comfortable with the same.

To tell you the truth, I am OK with coding questions. In fact, I demand them. What I'm not OK with is demanding inch-perfect code in 20 minutes. The only people who can be perfect are the ones doing the exact same problems every day, not the ones doing production work.

I think it depends on the language(s) your team(s) work with or would like to use.

Have all of the reference docs and a dev environment setup on an offline laptop.

Have an obviously toy problem inspired by real problems, yet clearly unrelated to anything your company plausibly needs solved. If it is common public knowledge that you already have a solution, it might be fine to use a simplified version of that.

Define solution criteria.

Providing an interface and expecting an object that implements it correctly might be a good test for a Go programmer.

When people say/quote things like “can’t even write a correct binary search”, I’m always reminded of this old java bug


Apparently it is not that easy :)

Anyway, my personal takeaway from that is that it's really easy to assume that things you know are easy. You should be careful about that when interviewing.

GP probably used "correct" incorrectly. From my experience interviewing, they probably meant "works for any non-trivial input at all".

Perhaps we need new vocabulary. It seems there exist people who can effectively program computers, but for whom binary search and fizz buzz are considered too difficult to understand and get reasonably right. Plenty of HN comments seem to insist on it whenever the topic of interviewing comes up. Yet I am absolutely certain that every single person I've ever worked with in programming would be able to "correctly" implement both binary search and fizz buzz, despite many of them not having been interviewed on them specifically. Indeed, the insistence on those questions comes from an overwhelming consensus in the companies that ask them that the ability to answer them is a reasonable lower bound for the skill these companies understand programming to consist of.

So, perhaps this group of para-fizzbuzz programmers should be called something else, to avoid embarrassing and time-wasting confusion? (Or we should call fizzbuzz-able-programmers something else? I don't want to appropriate the term, I don't have strong feelings about it.). What are typical tasks that such programmers typically undertake, what sort of companies do they work for?

Note: in the above, by binary search and fizz buzz I don't mean those literal problems, but general problems of similar complexity, i.e. ones that can be fully stated in a few sentences of simple language and have straightforward implementations in 5-10 lines, using only simple built-in features and data types available in any mainstream programming language: if, loop, arithmetic, array, integer.

As someone with much more experience developing software than experience interviewing for positions, I find the whole fizz buzz interview style perplexing. Internally I am thinking 'why the hell are you asking me these questions when you could be asking me far more interesting ones?' The answer is likely that they didn't read my resume, so maybe what the best interviewers have in common is that they read candidates' resumes before the interview.

I think the implication of the OP is that all resumes are presumed to be composed entirely of lies, unless proven otherwise.

To be frank, if an interviewer began asking me questions with that assumption, I would not be offended and would be able to back up any part of my resume with deep answers (or else I wouldn't put it on there), but I would think less of the company and think less of the interviewer.

If no benefit of the doubt is offered to me, I will most likely reciprocate and would assume the company doesn't trust employees to do their job. I'd probably be correct too.

I bet a higher percentage of people would have been able to do things like that in the days of batch processing. I'm getting middle aged, and I grew up with interactive microcomputers. So from the very beginning I was used to typing something into the computer and getting instant feedback. I'm even a bit superstitious when a program compiles/executes with no syntax errors, because I feel like it bodes ill for deeper bugs.

So while it seems like very simple coding problems should be a good way to weed out useless programmers in interviews, the fact is, people like me are not very good at programming on a whiteboard, and don't have to do it, because there are employers who don't use such tests.

I will say that implementing a binary search is something I have found useful in "real life" - in a SQL database I dealt with a table that had an index on a synthetic key while I wanted to look it up by a non-indexed date. For a simple query, the optimizer couldn't do that efficiently because it didn't know the columns were in the same order. But an explicit binary search could exploit the index to find a date. I didn't do it correctly the first time, nor could I do it quickly enough for an interview. But I got the job done.

I find it's even better to bring things into a practical example that excludes terminology, even if in reality it maps directly back to it. "Imagine I give you an alphabetized list of n book titles. n could be realllly big. I have another book to add to the collection. How should I do this?" By removing the terminology, I think it lets people look at the problem as a problem instead of a terminology test.

This also helps when developers are self-taught and have a professional background rather than an academic one.
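In Python, the alphabetized-shelf version of the question maps directly onto the standard bisect module. A sketch (the titles are invented):

```python
import bisect

shelf = ["A Tale of Two Cities", "Dune", "Moby-Dick"]

# Find the insertion slot in O(log n) comparisons, then insert.
# (The list insert itself still shifts elements, so the overall
# operation is O(n); only the search is logarithmic.)
i = bisect.bisect_left(shelf, "Emma")
shelf.insert(i, "Emma")    # bisect.insort(shelf, "Emma") does both steps
```

A candidate reasoning their way to "halve the search range each time" has effectively derived binary search without ever hearing the term.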

How hard is it to get right, seriously?

I mean you just check a bunch of edge cases. Recursion is not even necessary!

If you're just sorting ASCII English it's pretty trivial, as long as you assume ASCII ordering throughout.

But for all possible data types, in all (okay - most) written languages, with full unicode support, dealing successfully with all possible localised edge cases defining postal/delivery address and date/time formats - very hard indeed.

Well, that's up to the writer of that class: implement a comparator function that defines a total order correctly. I am not even sure there can be a meaningful total order on postal addresses.

The binary search algorithm shouldn't care how that's implemented.

Separation of concerns baby :)

I wonder if this quote is out of context. Consider the Java bug that was discovered around 2006.

But surely this just means that binary search is not relevant?

As an interviewer, I've found the ability to talk through the problem at multiple layers of abstraction to be one of the strongest indicators of mastery of a subject. This is independent of the topic as well, whether you're interviewing for coding ability, sales skill, or good management practices. Glad to see it show up in these findings as a great way to interview people.

One of the best questions I've ever been asked in an interview is: "Explain what a RESTful API is, pretending I know nothing about programming but am familiar with web surfing." For the simple reason that it asks the interviewee to show mastery by using clear, simple language to explain something that can quickly turn into a bunch of rabbit trails.

so if the candidate had never heard of 'RESTful' but the hiring manager hired them anyway, how long do you think it would take for someone on the team to teach the new hire what they needed to know about 'RESTful API'ness ? I seriously wonder if the person asking you that question is hoping you can explain REST to all of their coworkers :)

>What do the best interviewers have in common?

Not sticking to some rigid formula or format. Being able to adjust to the particular candidate in a way that works for both sides.

For example, some people aren't great at answering questions rapid fire. Giving them a chance to mull on something, and come back to it later, might reveal something you would have missed.

Or, if they miss a question you feel was obvious, ask them what area they feel strong in...then ask a question in that space.

Interviewing is stressful, and so you're not necessarily getting a clear view of their skills if you aren't flexible.

>> If you’re asking a classic algorithmic question, that’s ok, but you ought to bring some nuance and depth to the table, and if you can teach the interviewee something interesting in the process, even better!

This! This! This!

If you're going to ask a textbook dynamic programming problem, tell me why that is important.

If you're checking to see they can think top-down/bottom up, that's fine. But penalizing (aka no hire) for not knowing "the trick" of that dp problem is futile.

In the end, you want a guy/girl who can do the job well. What signal does a candidate solving max subarray give you? Does it tell you they'll write your mission critical service well?
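For what it's worth, the max-subarray question mentioned here is usually answered with Kadane's algorithm; a minimal C sketch (assuming n >= 1):

```c
/* Kadane's algorithm: for each position, track the best sum of a
   subarray ending there; either extend the previous run or start
   fresh. Assumes n >= 1. */
int max_subarray(const int *a, int n) {
    int best = a[0], ending_here = a[0];
    for (int i = 1; i < n; i++) {
        /* A positive running sum is worth extending; otherwise restart. */
        ending_here = (ending_here > 0) ? ending_here + a[i] : a[i];
        if (ending_here > best)
            best = ending_here;
    }
    return best;
}
```

Which is exactly the commenter's point: it's a neat five-line trick, but it says little about whether someone can write a mission-critical service.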

Also, LOL at "How to be human". It's sad that smart people have to be taught this.

"Better interviewing through data" (quote from http://blog.interviewing.io/)

"Writing Good Code is like Writing a Novel" (quote from https://hackernoon.com/writing-good-code-is-like-writing-a-n...)

When I put the above two ideas together, I realized the following.

If you were running a publishing house and needed a writer for your next book, how would you go about picking the writer?

You usually look at the body of work from the writer. You read the books the author wrote in the past to decide if the writer is any good.

You don't decide if the writer is good or bad from just

1) reading a page of resume and

2) asking him to write out a clever sentence on a whiteboard in the middle of the interview.

That tiny slice of data is useless for deciding on anything. If you are all about deciding based on data, why would you make an employment decision based on the most tiny fraction of the actual data (resume and 30 min interview)?

Programming/IT/DevOps workers now have ample ways of showing past work. Often it's personal projects in github or a blog with tutorials on how to secure servers. These are far better tools to measure a candidate. Just like reading books by an author is a far better way to predict how that author's works would turn out.

And my personal experience has shown me that the majority of interviewers don't read my blog or my github. I find out in interviews that some don't even know it's there. It's at the top of my resume for a reason.

And no one has ever asked a question about anything I've written about in my blog or in my github repo. And these are topics directly related to the position.

If you enjoyed this article, you might like HN'er tptacek's https://sockpuppet.org/blog/2015/03/06/the-hiring-post/

Isn't this the streetfighter.io guy? Pass.

Isn't this the streetfighter.io guy? Pass.

And, ladies and gentlemen, that's how we can get subtle insight into a person's attention to detail.

We are transitioning from a knowledge-based economy to a decision-based one; it's useless to ask questions that test skill competency if those questions could be answered by a few search engine queries.

Your goal as a person hiring (or firing) is to assess the candidate's ability to consistently make efficient decisions given complex scenarios with many moving pieces. Can this person handle their responsibilities and grow with them, or will they be difficult?

Some of the most technical and capable people also have terrible interpersonal skills and a fickle work ethic, which means they follow their interests and not yours. Do you want to herd cats or work as a team? Teams need leaders; leaders need respect and to be followed. A working environment is a benevolent dictatorship, not a democracy. If flat structures worked, they would be applied broadly across Fortune 500 companies. Are you a wheel inventor?

It takes about 6 months to train almost anyone to do almost anything in the modern digital/information world. It takes a lifetime to be agreeable, polite, conscientious, patient, dedicated, unbiased. You'd learn a lot more about someone watching them interact with strangers and friends for 5 minutes than from hours of interviews (which they have ostensibly crammed for, hiding as much of their weaknesses as possible). I assume this is why so many companies want private information on their employees: social media, public profiles. Hiring at the moment works as a great filter; there is no best candidate for any job. A better strategy would be to filter out personality traits that would clash with company direction and assume some level of skill training will be required. Pretty sure this transition is already happening.

The real sticking point will come when hiring agencies realise basic fitness and hand-eye coordination tests are better predictors than technical ability; then you enter into some genetic discrimination areas, which will only accelerate the relevant bio-technologies, a fissile cavitation.

> leaders need respect and to be followed.

This works two ways. Part of it has to be earned. I've worked with founders whom I would gladly follow when it comes to s/w engineering, but disagree with violently when it comes to product/business calls and which experiments to run. The reverse has been hard to detect, but my first instinct is that it's a lower percentage.

How good are even good interviewers at picking the best candidate for the job? IIRC what research there is shows mixed results at best.

Or is the only point of an interview to motivate the new employee?

Here is a crazy interview trick that I've seen work well. Group interviews! It isn't as crazy as it sounds.

Suppose that you're hiring for multiple related roles. You have a bunch of people who are ready for their in-person interview come in as a group. You go to lunch with some select employees, and explain that they are not in competition with each other because you're hoping to make multiple hires. Then you come back and are asked to do a basic hackathon over the afternoon. (The available projects for which are all small but real stand-alone things that you could use.) Through the time, you take each person aside, ask them for a basic demonstration of ability, then return them to the group.

Hire all who seemed competent and worked well in the group.

This saves a lot of time over interviewing each one, and shows you something important about how they will be in the workplace that you otherwise wouldn't see in the normal interview structure. And it is more fun for the candidates!

This will go as well as all the "Group Discussions" that many companies conduct go (aka not well). You are expected to treat other interview candidates as your friends and peers while you all know you are competing with each other. But you can't show that, so you resort to fake over-niceties and all to look friendly and sociable.

At the same time, everyone feels pressure to prove themselves over the others, so they will want to say something so that they appear intelligent and don't get lost in the background. Not everyone can do it equally well, so the more extroverted types will dominate and make the others feel like shit.

> And it is more fun for the candidates!

Really? It would be difficult to shake the thought that you're being directly compared and in competition.

I participated in one of these group interviews recently and it felt like I was in the Hunger Games.

That can be a challenge for some. But it is more common to see people swapping phone numbers, leads, and tips.

Looking for a job is an isolating experience. And you don't often get the chance to connect with people who are in the same boat as you.

And for the people who really can't get past trying to compete against each other... would you want someone like that on your dev team? That's one of the things this is intended to weed out.

Personally, I would hate preparing for a normal interviewing experience and be thrown into something completely nonstandard like this.

Why would they be swapping tips and numbers if they're in competition with each other?

I've never done a group interview, so I'm speculating, but the concept sounds nice, so long as there is also sufficient 1-on-1 time.

> I would hate preparing for a normal interviewing experience and be thrown into something completely nonstandard like this.

Would it be a problem if you told candidates they're doing group interviews? Might be a flaw in this particular variant, not the concept itself.

> Why would they be swapping tips and numbers if they're in competition with each other?

From the top level comment:

"You [...] explain that they are not in competition with each other because you're hoping to make multiple hires."

That explanation is critical. If people can get into that mindset, it all works. If not... it is a disaster.

You'll filter for one or two extroverts that quickly dominate a group and the rest will just tag along.

You'll also notice the introvert that quietly builds a critical piece.

That's pretty optimistic. I most likely would just check out.

Yeah, but you wouldn't hire them, because they're not working as a teammate.

Just hire the one that spews out the most random ideas without thinking for a second first.

What makes you think the introvert will get a critical piece, instead of the least critical scraps?

It is the job of the interviewer to ask each person what they are working on, and what they think that they could do instead.

If you're hiring multiple positions, you generally have the room to hire different roles. So it is in everyone's interest for people to do the kind of work that they think they are best at.

If setting up this kind of group interview sounds too risky, you can do it more easily. Go to a startup weekend, look for people you'd want to hire. You get a similar environment and similar opportunity.

This is highly dependent on group dynamics.

I'm not at all convinced that the sexiest piece won't get grabbed by the most assertive person in the group, assuming of course that multiple people are attracted to the same piece.

If they grab it, get buy-in, do well, and make sure it integrates with others, then that's a person that you want. The others can show competence with other people.

If they grab it and there is conflict over how they did it, then that isn't a hire.

Getting "the sexiest piece" isn't important. You need to convince the interviewers that they want to have you on their team. Being willing to take necessary scutwork that others don't want is a pretty positive characteristic.


I would like to see some metrics.

When we do interviews, at least 6 people evaluate the candidate. That would certainly smooth out the pros and cons of individual interviewers and provide a fairly honest picture of the company's culture...

The best interviewers have 1 big thing in common : they are the actual team the candidate will be working with.

For short term employment, sure. Teams change, though, so you need to be sure you're hiring for overall ability and not just current fit. Same goes for "culture" as well. There's an argument that "who you'll be working with" as a metric leads to homogenous teams.

I'm so, so tired of doing interviews filled with "did you memorize this particular algorithm" whiteboard coding questions. Especially given I'm not some fresh-out-of-college hire. The industry's research on this is pretty unequivocal that it's a terrible way to evaluate candidates for anything other than "isn't a complete idiot" which would also be accomplished by more reasonable questions.

Why can't the industry finally just drop stupid whiteboard coding and move on to more practically focused questions?

Honestly, I find that most whiteboard coding questions don't veer into "you'll only get this if you've seen it before" territory. Not to say that it can't happen, but in some of my interviews recently, I got the following whiteboard problems (all asking for optimal solutions):

Find the largest palindrome substring of a string

Shortest path solver (Dijkstra's)

Compute a power if the base is any double and the exponent is an integer

Take an unsorted list and put all the odd numbers on one side, all the evens on the other

Given an array/list of numbers that first increases then decreases and a number, determine if that number is in the array/list.

Out of those five, only one requires you to have something somewhat memorized (path-finding), but even that is pretty basic, and I think they would have accepted non-optimal solutions that worked anyway, so really all you needed to know was how to write a BFS that maintains the cost of the shortest path found so far to each expanded vertex. The rest I think anybody, given enough time, could solve if they were a good programmer. Of course sometimes there are problems where you don't figure it out in time, and definitely (a minority of) problems that are too hard for a whiteboard interview, but you're not going to get every technical question right anyway.
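To illustrate how mechanical most of these are, here is one possible take on the odds-left/evens-right problem from the list above (a two-pointer, in-place partition; ordering within each side is not preserved, which an interviewer may or may not require):

```c
#include <stddef.h>

/* Partition an int array in place: odd numbers end up on the left,
   even numbers on the right. Single pass, constant extra space. */
void partition_odd_even(int *a, size_t n) {
    size_t i = 0, j = n;     /* a[0..i) is odd; a[j..n) is even */
    while (i < j) {
        if (a[i] % 2 != 0) {
            i++;             /* odd: already on the correct side */
        } else {
            int tmp = a[i];  /* even: swap it to the right end */
            a[i] = a[--j];
            a[j] = tmp;
        }
    }
}
```

Nothing here needs to be memorized; it falls out of keeping the two invariants in the comments true.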

It increases the stochasticity of interviewing: you can fail the technical challenges for jobs you would otherwise qualify for on another interviewing day, or with another set of questions. But this is logical from a hiring perspective. It costs less to interview 20 people and accept one qualified candidate than to interview 5 people and accept an unqualified candidate. It's not that you need to know how to solve these problems to be a good programmer, or even that being able to solve them is indicative of a good programmer; it's that a bullshitter/unqualified candidate will fail them every time, while at least a few qualified candidates won't.

Finding the optimal longest palindrome substring algorithm surely is in the "you'll only get this if you've seen it before" territory.

Implementing a binary search algorithm is anything but trivial, you've got overflows, off-by-one errors, passing arguments by value, etc. Sure, the idea of finding the point in the array that starts decreasing with a binary search is simple, but the subtle ways you can write it wrong are practically infinite.

That's true about the binary search. I didn't complete that one in time (although I did relay the algorithm structure). Although I think whiteboarding gets a lot of undeserved hate, one gripe I have is with its occasional focus on compilable code. Obviously you want to hire a developer who is actually capable of implementing solutions to problems, and one who can code in a timely manner.

However, it really stings to get the algorithm right and know what the implementation concerns (overflow/OBOB) are, but simply not have enough time to get them written in a correct/compilable manner. I've failed a few online interviews where I knew the solution to a problem but simply didn't have the time to write it. For example, I got a question that required building a heap, but at some points you needed to remove items from the middle of the heap. This would require being able to call sift-down/sift-up on the underlying data structure. I didn't know how/if I could call sift-down/sift-up on the priority queue implementation of the language they made me use. So I essentially ran out of time because I couldn't look up an algorithm I knew how/why to use.

I have to disagree about the palindrome question only because I got it in the last ~15 minutes of a 30 minute first round interview. With 10 minutes passed and 5 remaining, I still only had the naive solution on the board with less than five minutes to go. After a few more minutes of thinking I eventually realized that I had only been thinking of palindromes from outside-in. With about 1-2 minutes left I hurriedly explained to my interviewer that you should check each character and find the largest palindrome that character is the center (or half of a paired center) of, by expanding outwardly. He asked a few questions, then assured me that it was ok to not write it on the board and that he could tell I understood it. It really gave me a good impression of the company that their hiring practices were so rational, and I now work there.

I really think anybody would have figured out the problem if they looked at it long enough, it's just that sometimes things like this take 4 minutes and sometimes they take 4 hours. It depends on the person but not strictly an "intelligence" way, I just think there's a lot of variance in how long it takes people, even pretty smart people, to solve sufficiently hard problems.

That's a good solution, but it's not the optimal one. In the worst case it's still O(n^2). Nobody expects candidates to come up with the linear-time solution for that question, though.
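For reference, the expand-around-center idea described above might look like this in C (names are mine; this is the O(n^2) approach, not the linear-time Manacher's algorithm):

```c
#include <stddef.h>
#include <string.h>

/* Grow a palindrome outward from a center (lo == hi for a
   single-char center, hi == lo + 1 for a paired center) and
   return its length. */
static size_t expand(const char *s, size_t n, long lo, long hi) {
    while (lo >= 0 && (size_t)hi < n && s[lo] == s[hi]) {
        lo--;
        hi++;
    }
    return (size_t)(hi - lo - 1);   /* length of the match found */
}

/* Length of the longest palindromic substring of s. */
size_t longest_palindrome_len(const char *s) {
    size_t n = strlen(s), best = 0;
    for (size_t i = 0; i < n; i++) {
        size_t odd  = expand(s, n, (long)i, (long)i);     /* "aba" style */
        size_t even = expand(s, n, (long)i, (long)i + 1); /* "abba" style */
        if (odd > best)  best = odd;
        if (even > best) best = even;
    }
    return best;
}
```

Each of the 2n - 1 centers is tried once, and each expansion is linear in the palindrome it finds, hence the quadratic worst case on inputs like "aaaa...".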

Bit of a necropost, but -

I agree with most of your points. The thing is, that palindrome question worked out because

1) You had a "realization", which still feels a little "there's a trick to it" to me, which can be a sign of a bad whiteboard Q, I think. Obviously, this is debatable.

2) More importantly, your interviewer didn't suck. They understood the process is what matters, and not the code on the whiteboard.

That #2 is the reason I think whiteboard coding is bad. Most interviewers simply aren't trained well enough to evaluate based on process rather than results. That's why I advocate for only doing very simple ones, or not at all, simply because it's less dependent on how good the interviewer is.

> Why can't the industry finally just drop stupid whiteboard coding and move on to more practically focused questions?

Cynically, I wonder if it's sometimes because it's a crutch for the interviewers. Like anything else, interviewing is a skill that you learn through interest, study and practice. I've been carrying out technical interviews for years, and still wouldn't say that I get it right every time. If the interviewers have not made an effort to learn how to do interviews, or don't really want to do them, or don't actually know what the job will involve, then they can just fall back on asking algorithm questions.

They're sort of like how the military judges physical fitness by the number of push-ups one can do. It correlates with ability in general, but it biases towards people who practice it.

Which is sorta the point of the pft.

Can you give an example of a good practically focused question?

At my Google screening interview, I was asked: "How would you teach recursion to a 7-year-old child?"

Apparently, the correct answer was Romanesco broccoli.

I failed the interview and briefly sank into a depression questioning everything I knew about raising young children, self-similarity and recursion.

Well clearly the answer was, "By teaching him recursion at age 6 and passing the age incremented by 1 as an argument because what kind of monster would let kids play with mutable state" :D

How about that old joke: "Pete and Repeat sat in a boat. Pete fell out, who's left?" Whether or not it works for the interview, it definitely works at annoying 7-year-olds.

Interesting, I AM teaching 7 year olds about recursion, but I doubt any of them have seen Romanesco broccoli. Anyway, the example I use to demonstrate the fibonacci sequence is sunflower seeds.
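If you wanted the program-shaped version of that self-similarity, the naive recursive Fibonacci is the usual teaching sketch (exponential-time, so strictly for illustration):

```c
/* Naive recursive Fibonacci: each value is built from two smaller
   copies of the same problem, like florets on the broccoli. */
long fib(int n) {
    if (n < 2)
        return n;                    /* base cases: fib(0)=0, fib(1)=1 */
    return fib(n - 1) + fib(n - 2);  /* the self-similar step */
}
```

The base case is the part that actually seems to land with kids: the florets have to stop getting smaller somewhere.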

I hope you got another job that you enjoy.

I personally think the more design focused ones are good. EX: "Here's a system were building. How would you build it?" Then you build on that with "Ah, I see you want to use a cache between these two servers/regions, what kind of cache would it be, what are some limitations there, etc?"

Or something like in the article where you start with a little code, and build on it to solve some problem that isn't a totally arbitrary tree traversal like most of the whiteboarding stuff seems to be.

I base the design focus on the candidate's own background... something they have worked on from their resume. This also helps me confirm they didn't make things up on their resume.

For the coding part, I try to pick things that don't depend on someone having heard about them previously. I try to solve the problem myself in 1/3 the time I allot to the candidate, since they are under pressure. Most candidates seem to struggle even with simple problems, though.

A simple one I have asked in C is to reverse the order of numbers in an array in place.

> A simple one I have asked in C is to reverse the order of numbers in an array in place.


    int a[] = {1,2,3,4,5};
So, some kind of swap?

    for (x = 0, y = 4; x < 5; x++, y--) {
      a[x] ^= a[y]; a[y] ^= a[x]; a[x] ^= a[y];
    }
See, I wouldn't mind that on a whiteboard. I had similar question for a startup, and passed easily (I think!).

But then the last question I got was 'find all word substrings in a string'. I gave a brute force answer (for each position in the string, check against the dictionary and expand by one character until the end of the string), but struggled to improve on it. I didn't know about tries at the time, but even still it would be difficult for me to even try to code this on a whiteboard. Turns out it was a "Google" question aimed at optimizing runtime, and I can't analyze like that on a whiteboard, unless it's obvious (nested for loops, etc.).

Anyway, after struggling and giving up with that question the next guy came in and berated me and my current employer (defense contractor). Not sure if it was because they thought I wasted their time or what. I left after that--not someone I would want to work for.

Should have mentioned that temporaries are allowed. Also you swapped the array elements twice... The terminating condition should be x == y.

Also, if you changed this to an int *a (and a count), some people would get confused about how to do this all with pointers. Asking how this would be tested is also a good question.

>The terminating condition should be x == y

That assumes an odd number of elements (true in this case but not in general); I would recommend terminating when x >= y.

ah, whoops!
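Putting the thread's fixes together (temporaries are allowed, and stopping when the indices meet handles both even lengths and the self-swap case), a corrected version might look like:

```c
#include <stddef.h>

/* Reverse an int array in place. Stopping at x >= y means the
   middle element of an odd-length array is left untouched, so no
   element is ever swapped with itself, and each pair is swapped
   exactly once. */
void reverse_in_place(int *a, size_t n) {
    if (n == 0)
        return;
    size_t x = 0, y = n - 1;
    while (x < y) {
        int tmp = a[x];          /* plain swap; no XOR trick needed */
        a[x++] = a[y];
        a[y--] = tmp;
    }
}
```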

It's a nice XOR swap, but I think you reversed your list twice. And XOR-swapping with x == y == 2 is likely not ideal. Seems like exactly what I'd do in an interview too :p

The xor swap failure when working on the same variable has been nicely exploited in the underhanded C contest: http://www.underhanded-c.org/_page_id_16.html

> The XOR-swap trick is an example of coders being too clever for their own good. Here, it gradually poisons the RC4 permutation with zeroes, until eventually the plaintext is just passed along in the clear.

Ha! Those are always good reads. Thanks.

See, I'd need to run and debug it to figure all that out.

But apparently some people can just write it all out on a whiteboard in one go.

If you are trying a new coding or design question, have a few co-workers try it first to help flesh out the wording and get data points on how long it takes to solve.

Isn't this just as bad as the "if you've seen this before" whiteboard questions? A lot of specific problem domains don't make for good general purpose questions. If I used my daily work as an interview question I'd insta-fail the large majority of candidates because they don't have relevant PhDs.

There are good balances. The question I ask is a real problem for a real product but quickly reduces to something small and manageable rather than getting bogged down in incredibly specific requirements.

See, I hate questions like that. They're completely face-level and easy to bullshit. I don't want to work at a company full of bullshitters.

Best interviewer is aware of what the actual position really requires and will adjust interview to find match for that exact position instead of some other generic one.

The best interviewers are the ones that see your dribbble/github/youtube/etc.. matches their company needs and reach out directly.

What sort of companies need people like me with empty YouTube/dribbble/github accounts?

You've built nothing others can see that would make them want to hire you?

Nope. I've built tons of stuff for companies though. Stuff you've almost certainly used before, in fact. I do have a github account, but the publicly available stuff is just a bunch of garbage from me tinkering with new technologies or whatever.

The idea that I need to have some sort of public portfolio to land a job is ludicrous. I have a life outside of programming.

"I do have a github account, but the publicly available stuff is just a bunch of garbage from me tinkering with new technologies or whatever."

Speaking as an interviewer, that's actually still pretty valuable: it tells me a lot.

I have really bad anxiety during interviews and it clouds my mind. I'm considering taking a beta blocker for my next interview.

Do it. It's an open secret in musician circles that many rely on beta blockers for high pressure performances.

"When can you start?" question :)

I saw this was posted yesterday. For reference, don’t delete and resubmit things to HN if they do not get traction.

It's true that submitters aren't supposed to delete and repost (https://news.ycombinator.com/newsfaq.html) but as the FAQ also says, a small number of reposts are ok when an article hasn't gotten traction. And in this case we invited it, which we do a fair bit of.
