Dark Motives and Elective Use of Brainteaser Interview Questions (wiley.com)
290 points by 33degrees 9 months ago | hide | past | web | favorite | 291 comments



As someone who does both a lot of interviewing (at Google) and gives a lot of advice on interviewing to people just starting out, this rhymes with my intuition about why some people act this way in interviews: even without the use of brainteasers (which, for the record, are not used at Google), the interview process is hard. People rarely come out of it without at least a twinge of frustration or humiliation. I'm not immune to this: I got hired at Google despite having a few interviews where I thought "God damn that was horrible," and I didn't get offers at places I know I was good enough to work at because I wasn't performing at my best for whatever reason and nuked the interviews.

I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them. It's conceivable that people with latent meanness, or, as this abstract calls it, tendencies toward "narcissism and sadism", who aren't aware enough of their biases to try to counteract them, will use these sorts of questions to make themselves feel better.

Also, and this is probably more specific, there's a bit of a mythos around companies like Google and friends that have tough interviews and high standards: they hire the best people, they're doing the best work, they're the smartest, etc. I can tell you from experience: that mythos doesn't hold up once you're doing your day-to-day job. I've seen people with stunning credentials doing what most would consider exciting work become genuinely bored. Perhaps for some of them interviewing is a way of reminding themselves of how desirable and high status their workplace is, to the detriment of the poor candidate.

And of course, as a disclaimer, this isn't even anecdote; these are my own personal musings on psychology. Don't confuse this with insight into the Google hiring process.


> I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them.

I think it's possible--but there was an experiment someone ran a few years ago with a guy applying to jobs in both software development and management (while qualified for both and resumes optimized for each). One of the key differences that was found is that software developers essentially started out at zero and every wrong answer / misstep essentially reduced their score. For the management interviews, it was tougher to answer something wrong, but each good answer earned points.

It naturally results in software developers feeling like they blew it, but managers just wondering if they wowed them more than the other candidates. To me that suggests it's something more inherent to software developer interviewing than a frustration bias. If I had to guess, it's just that aspects of software development are more measurable than in other roles, and there's more genuine fear of hiring someone who can't fizzbuzz.


This is one of those moments where citing a source (that experiment) would be incredibly useful...


I wish my Googling skills were stronger, but alas, they are not.

I remember one of the other key takeaways was that the manager role was treated more favorably--specifically that they apologized they could only offer some (reasonable) amount of relocation assistance while none was even offered in the developer role. It was theorized that managers were seen as an "in group" among other managers, but that was obviously just speculation.

I wasn't on HN at the time, so I can't confirm it was here. It was one of those blog posts / write-ups that made the rounds.


I think it was a blog post by Michael O Church, though I don't have the reference to hand.



That's the one! Thank you so much.


I would think it isn't that managers are an in-group, but that they are peers: even if a manager helps hire another manager, that hire can be a potential problem for them if they feel slighted. But if the developer feels slighted - well, the developer is subservient, not a peer.


> and there's more genuine fear of hiring someone who can't fizzbuzz.

I always hear this fear proclaimed but have never heard an actual story of a candidate failing to fizzbuzz, would love to hear an anecdote or two from anyone who has one.


I have both once failed fizzbuzz in an interview and gotten offers from FAANG companies (different interviews, obviously). At the time I failed the fizzbuzz, I even knew it was a fizzbuzz question i.e. I'd read Jeff Atwood or Joel Spolsky's blogpost introducing the concept. I just got very very nervous. I work at Google now.

There's not much to say really. I think for people like me who get very nervous at live coding interviews, the variation between best and worst performance can be massive.


I've never asked anyone fizzbuzz, but I have seen people badly flub very easy algorithm questions that are basically at the same level.

In every case it was because they over-complicated things. They came up with elaborate logic that they didn't need and then got lost and didn't quite get to a working implementation. Picture, say, starting to implement fizzbuzz using two tries and an associative array.

I would guess that a lot of the people I've seen bomb these kind of interview questions are actually good, capable programmers, but they just got overwhelmed by interview nerves and got lost.
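For context on how small the bar being discussed is, here's a minimal Python sketch of the canonical FizzBuzz (the classic 3/5/15 formulation; exact wording varies by interviewer):

```python
def fizzbuzz(n: int) -> str:
    """Return 'Fizz' for multiples of 3, 'Buzz' for multiples of 5,
    'FizzBuzz' for multiples of both, else the number as a string."""
    if n % 15 == 0:
        return "FizzBuzz"
    if n % 3 == 0:
        return "Fizz"
    if n % 5 == 0:
        return "Buzz"
    return str(n)

# The usual interview task is printing the lines for 1..100:
for i in range(1, 101):
    print(fizzbuzz(i))
```

No tries or associative arrays required, which is exactly the point: the failure mode is usually nerves or over-engineering, not missing knowledge.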


Not a candidate, but someone we had already hired through an agency. Two someones, actually. Never again.

They were great people, but couldn’t distinguish a string from an int.


That's pretty crazy, how do these people exist and get jobs? What kind of development were they supposed to be doing?


And what's the reason they were asked to do the fizzbuzz test?


They weren’t asked until we had some internal hackathon where we were trying to implement tested fizz buzz in new languages we were working with.


Full-stack web development. They had 2 years of experience.


Yes it happens. I’ve interviewed people who couldn’t pseudocode an isOdd function after making sure they knew what an odd number is.
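For what it's worth, the isOdd question really is a one-liner; a minimal Python sketch:

```python
def is_odd(n: int) -> bool:
    """True if n is odd. Python's % yields a non-negative result for a
    positive modulus, so negative inputs are handled correctly too."""
    return n % 2 != 0
```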


It's like military academy (or fraternity) hazing. You've been abused, therefore you want to pass the load of that abuse on. Maybe you delude yourself into calling it "tradition" or "high standards" or maybe you're just honest and enjoy the sadism of it all.


> It's like military academy (or fraternity) hazing. You've been abused, therefore you want to pass the load of that abuse on.

I really cannot imagine why people have this behavior. I was often treated unfairly in life, so I try to architect systems that attempt to avoid the unfairness I perceived, so that the next generation does not have to suffer similarly (ideally in a way for which correctness can be shown in a mathematically rigorous manner).


I went to a military academy (and now I interview a lot at Google, too). Doing a lot of soul searching now...

But seriously, I agree with the top comment about interviewing at Google, and as far as the military academy "hazing" it's generally misunderstood. It's very, very rare for an upperclassman at my particular school to be cruel in the movie-sense, or what I think is the stereotypical fraternity-sense. More common is an institutionalized set of standards for the freshmen that may seem overly tough or even arbitrary but serve several purposes: a) they give the freshman a crash course in being accountable for details, b) they give the upperclassman a crash course in enforcing accountability and holding others to high standards, and c) they create a sense of shared hardship and camaraderie. These are actually all very important lessons before a second lieutenant takes on responsibility for decisions that could mean life or death for his or her soldiers (which, in the case of my graduating class, was a reality for us as quickly as 4 months after graduation). You have to force the upperclassmen to come out of their shells and make the freshmen wear their uniform correctly, or make their bed correctly, etc. so that when they graduate they will have the courage to tell their soldiers that they really do need to inspect the vehicles, or maintain the weapons, or wear their safety gear. It takes practice to be accountable for standards on a team. It's not supposed to be about making people suffer for the fun of it, but of course there are jerks everywhere.


Note that you can easily do both, unconsciously. Architect systems that are fair, but behave poorly in person. As a poor attempt to illustrate: imagine someone who is not mean spirited but has anger management issues - they would not necessarily build systems that favor aggression, but they might behave like that's their preference.


I think the answer is that most people are not like that, and the ones that are don't think about it from that perspective or that explicitly. All disagreeable human behavior is easy to point out and criticize in its most straightforward and uncharitable representation.


I think the answer is that most people are not like that, and the ones that are don't think about it from that perspective or that explicitly.

That, in a nutshell, is why a lot of abnormal psychology is so very sticky. It's even worse than that, however. It's more that most people are just a tiny bit "like that," and no one thinks about it from that perspective.


I can't speak on how solid this research was, but I remember reading a book (one of Cialdini's, I think), that explained hazing behaviour as group cohesion enhancers. Basically, what you suffer during initiation will make your bond with the group much stronger. I suppose there might be something to it, given how prevalent such practices are all around the world, in all cultures.


There are enough people out there who take this more as a hint that this is an inherently hostile environment through and through, and who just close themselves off forever or become quietly hostile towards all members of the group.


The wheel of pain and suffering must roll on, otherwise people might get a clue and unite against their oppressors.

Or in the words of the freemasons, order through chaos.

Once hurt, few seem to have enough integrity to break the loop, so it snowballs nicely all by itself.


I have a pet theory that it's related to the difficulty of accepting a painful truth.

In that moment, you face a choice. Either you do the same thing that was done to you, or you don't.

If you break with tradition and avoid doing whatever it is, you will end up asking yourself why, and you'll have to answer "because it was abusive". In some cases, that thought may be too horrible to even admit into your mind, or at least it's unpleasant enough that you prefer not to deal with it.

If you go the other way and stick with tradition, then you can just do whatever it is, and you can say this happened to me, it's normal, it's part of the natural cycle of things, etc.

Also, sometimes you are blindsided by this and don't realize you're making the choice until the moment it happens, so you don't have time to mentally prepare to do the harder thing.

Point being, it's not always sadism. Sometimes it's denial.

TLDR: The corollary of "that would be abusive" is "I've been abused". It hurts to confront that possibility, so people look for ways to deny it. And the organization offers them one.


> You've been abused, therefore you want to pass the load of that abuse on. Maybe you delude yourself into calling it "tradition" or "high standards" or maybe you're just honest and enjoy the sadism of it all.

What did (or will) my successors do to deserve having an easier time than I did?

(Disclaimer: have not been to a military academy or a frat; but only to an undergrad school.)


> (which, for the record, are not used at Google)

I interviewed maybe 3 or 4 years ago at Google. Definitely got some questions that were not remotely programming related, but were just hard problems that probably were awesome if you already knew the answer.

I recall one was something along the lines of "You're making a bet in the final round of Jeopardy. Presume that your opponent is going to bet optimally given your bet. How much will you bet?". No matter what I bet, he bets such that if he gets it right and I get it wrong, he wins by $1, etc. There were different cases to consider, but all the hard ones boiled down to what felt like a differential equations problem.

It wasn't a programming question; it was a logic brain teaser question. And the interviewer seemed to really enjoy himself.

Didn't get the job. I now ignore Google recruiter emails.


I had a similar teaser question in an interview for a senior technical role which went something like: given a function that has a uniform distribution over N, how to create a new uniform distribution over N+k. I recalled the general idea/theory of building new distributions from old and felt confident I could find the answer with general resources (textbooks I had at my desk or a quick search). The (fairly junior) interviewer didn't recognize the idea and kept trying to push me towards a specific answer. I looked it up after the interview -- the problem is apparently well known in CS circles (it was not for a CS role) and was something I could look up in 3 minutes. I did not get the role.
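For anyone curious, the standard answer is rejection sampling. Here is a sketch under my reading of the question (all names are mine: `rand_n()` is the given source, uniform over {0..n-1}, and we want a uniform sample over {0..n+k-1}; assumes n*n >= n+k):

```python
import random

def extend_uniform(rand_n, n: int, k: int) -> int:
    """Given rand_n() uniform over {0..n-1}, return one sample uniform
    over {0..n+k-1}. Two independent draws combine into a uniform value
    over {0..n*n-1}; rejection sampling trims that range to an exact
    multiple of n+k so the final modulo introduces no bias."""
    m = n + k
    limit = (n * n // m) * m  # largest multiple of m not exceeding n*n
    while True:
        x = rand_n() * n + rand_n()  # uniform over {0 .. n*n - 1}
        if x < limit:
            return x % m

# Example: stretch a uniform over {0..4} into a uniform over {0..7}.
n, k = 5, 3
sample = extend_uniform(lambda: random.randrange(n), n, k)
assert 0 <= sample < n + k
```

The rejection step is the part interviewers tend to fish for: skipping it (just taking `x % m` directly) would slightly over-weight the small residues.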

Arrogant? Maybe I am, a little, sure. But I don't spend my days remembering every minute detail of every piece of technical documentation I've come across -- I can't remember everything! What I _do_ instead is remember _how to recall_ technical items in specific contexts to quickly refresh and remind myself when the time comes.

I don't come across _too_ many arrogant or clueless interviewers, but when I do I am genuinely surprised.


That doesn't sound like differential equations, it sounds like discrete math.


Google has a lot of good people, but in my experience the interviews are specifically looking for a particular bias -- not just knowledgeable and personable people but specifically people who think a certain way.

I had an interview there recently in NYC. I solved the guy's puzzle too quickly and I think that he thought that I'd prepped (I hadn't). Then we got started on a different problem and I tried to explain how it was a good example of "boolean blindness" and that using a specific form of dependent type was an ideal solution. He looked at me like I had two heads and smugly told me that it's impractical to solve problems this way (but it's a bog-standard type checking problem).

So yes, smart people, but you should know the bias that they have before you decide that you want to work there. It could very well be bad for your career if you _do_ get in.


I interviewed at Google (back in 2005, mind you) and I remember being asked a couple brain teaser questions like how to estimate the number of windows in NYC.

I will say that out of my loop, there were only 2 interviewers that came across as intentionally mean, the rest were genuinely wanting to see how I solved problems and understand what my thought process was, which is what most interviewers should be attempting to assess.

The intentionally mean ones need to be culled from your interviewer pool though. Does it really do a service to Google or to the candidate to stump them, and also humiliate them for not knowing the answer you were hoping to get from them? There are ways to make a candidate feel good about failing an interview despite not making the hiring bar.

And remember, these candidates are still your customers, and human beings. They will remember how they were treated.


Aren't the examples of brainteasers actually Fermi problems? Maybe they sound like brainteasers if asked without the appropriate context, but when you're doing a system design interview and you have to estimate your throughput or storage requirements, you're effectively solving a Fermi problem just like the one that asks you to estimate the number of windows in NYC.


Most of the "estimate foo" problems are Fermi problems. E.g. The classic one I've heard is "estimate the number of piano tuners in the world". The route I took to do so is:

      number of people in the world
    / average number people per piano
    * average number of times a piano is tuned per year
    * average amount of hours to tune a piano
    / average number of hours a piano tuner works per day
    / average number of days a piano tuner works per year
So in terms of units, the flow is:

       people
    -> pianos
    -> piano tunings per year
    -> hours spent tuning pianos per year
    -> workdays spent tuning pianos per year
    -> piano tuners
This is all very similar to the dimensional analysis most people will learn in a high-school science class.

Just to plug in some random numbers, let's pick 7 billion people, 100 people per piano, 0.3 tunings per year, 1 hour to tune a piano, 3 hours of tuning per day and 100 days per year (I imagine it's part-time for most piano tuners...). We arrive at:

    7,000,000,000 / 100 * 0.3 * 1 / 3 / 100 == 70,000
And, for what it's worth, at least in my field, we do these sorts of ballpark estimations all the time to determine feasibility of systems.

I've never had one of these in an interview setting, but I imagine the important thing is to demonstrate the line of reasoning rather than actually arriving at a concrete number.
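Spelling the same chain out step by step (same assumed inputs as the calculation above, so every number here is a guess, not data):

```python
world_population = 7_000_000_000       # assumption from the estimate
people_per_piano = 100                 # assumption
tunings_per_piano_per_year = 0.3       # assumption
hours_per_tuning = 1                   # assumption
tuning_hours_per_day = 3               # assumption: mostly part-time
workdays_per_year = 100                # assumption

pianos = world_population / people_per_piano              # 70,000,000
tunings_per_year = pianos * tunings_per_piano_per_year    # 21,000,000
tuning_hours = tunings_per_year * hours_per_tuning        # 21,000,000
workdays_needed = tuning_hours / tuning_hours_per_day     # 7,000,000
tuners = workdays_needed / workdays_per_year
print(round(tuners))  # 70000
```

Each intermediate line keeps its units explicit, which is the dimensional-analysis habit that makes these estimates easy to sanity-check.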


I love the term "Fermi problem", because the origin is, of course, the Fermi paradox. The idea that multiplying a series of uncertainties together does something other than blow your error range out is such a weird idea.


> The idea that multiplying a series of uncertainties together does something other than blow your error range out is such a weird idea

Might seem weird but it is massively useful. Having an educated estimate that is _only_ out by an order of magnitude is a massive improvement over wild-ass guesses that are 4-5 orders of magnitude off -- especially in systems design where over- or under-provisioning leads to millions wasted in effort.



This was 13 years ago, and Google has since had many "revelations" about its misuse of useless tools like brainteasers. The current process and interviewer training/selection process is nothing like the process in 2005.


Perhaps that's true, but I think it's a recent change. I vaguely remember getting a brain teaser question as recently as 6 years ago, but I concede it was a long time ago and I could be remembering wrong.

BI had an article about it as recently as 2015. https://www.businessinsider.com/google-brain-teaser-intervie...


BI... is an interesting publication. With an interesting relationship to facts.


That question doesn't seem great. People who've spent more time in NYC will likely perform better, since they'll immediately start describing classes of buildings where they've spent time and generalizing, but how familiar you are with NYC is irrelevant to the job.

Depending on your applicant pool, that might be correlated with a protected class, like race, gender, or national origin.


I don't think classes of buildings are that important. Just multiply the resident population by the number of windows per person. Just throwing a number out there, I'd estimate something like 10 windows per resident, counting residential and commercial.


At risk of proposing a generalization, I think that this comes from the same place that other gatekeeping behaviors come from: you see yourself as the in-group's bouncer.

That puts you in a mindset to run the interviewee through a gauntlet, partly as a ritual to show their desire to get in, and partly through a sense of duty to hire "only the best".

But hiring is not a filtering problem. It's a matching problem. And the more interviewers embrace that they're there to find the right people and to recruit them, not to filter people out, I think the more humane the interviews will be.


I've been mentally drafting a blog post on exactly this topic, so while that percolates in my mind I'll answer with: it totally is a filtering problem.

For a small company whose culture is in flux and needs the right person, it's absolutely a matching problem. You need to find the right person for the job, period.

At the scale of a company like Google, however, it's absolutely a filtering problem. First off, we're large enough that most (perhaps not all) people talented enough to clear our interviews will (given enough persistence) find a place where they can be happy and contribute. We can get away with not worrying about the subtleties of matching and instead only focus on technical skills.

Second off, the number of applications rejected for every hire is staggering, simply because of how many people apply. These numbers are fantasy, but imagine if you had to choose one person out of a hundred applicants, and you actually had the means to interview every single one. You're going to pay this person a quarter of a million dollars a year and aim to keep them for half a decade or more. The economically rational thing to do is be picky as all hell, and this is exactly how Google and friends do it.


> Second off, the number of applications rejected for every hire is staggering, simply because of how many people apply. These numbers are fantasy, but imagine if you had to choose one person out of a hundred applicants, and you actually had the means to interview every single one. You're going to pay this person a quarter of a million dollars a year and aim to keep them for half a decade or more. The economically rational thing to do is be picky as all hell, and this is exactly how Google and friends do it.

That's a good point. It's hard to ignore just how many applicants you filter out in the hiring process. However, Google and Facebook are exceptions in terms of having a non-differentiated hiring test. They are very big exceptions, but still exceptions. They have a standard interview bar and matching with teams comes later. On the other hand, Microsoft, Apple, Amazon, Samsung, Adobe, etc. all hire for specific roles.

I think the way to reconcile the nuance I was getting at with "matching problem" is this: "picky as all hell" must still be fair and backed up by a clear idea of why your requirements are as tough as they are. When you see yourself as a gatekeeper you're more likely to lose sight of that.


Also keep in mind that inhumane interview processes necessarily filter out the candidates who refuse to put up with inhumane treatment.


> ...even without the use of brainteasers (which, for the record, are not used at Google), the interview process is hard. People rarely come out of it without at least a twinge of frustration or humiliation.

Which is unfortunate. To measure something, it's necessary to test to the boundaries of its ability, which basically means crossing eventually into the realm of failure.

Too much of testing both in education and in job interviews is oriented to get a pass/fail result. Interviewers should understand that a good interview is not like this and should communicate this to the prospect: the goal is to measure your abilities and compare it to the needs of the position. Answering well for the most part, but not remembering some obscure academic topic or missing a particularly difficult part of a logic problem means that part of the interview helped find that boundary.

Honestly, if you answer 100% of the questions in the interview, it means they don't know how good you are, what they should pay you, or what you can expect to be challenged by in the job. They would want to know how much more they should offer, and whether you could take on much more difficult projects, but they have no way to tell. That's an interview failure by the interviewers.


> I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them.

That's a very rare individual. I've been on both sides of the desk a lot, and I don't think I've ever run into anybody like that.

The much simpler explanation is that being an interviewer is hard and takes work and practice like any other skill. The problem is that nobody ever applies actual work to interviewing.

Did somebody let you watch their interviews before letting you interview somebody else? Did somebody watch your interview to give you advice for what you did wrong and right? Did you ever role play being an interviewer? Do your interviewers occasionally try to calibrate themselves to other interviewers?

I can count the number of people who can answer "yes" to even one of those questions on one hand without using a majority of the fingers--and that's over my entire career.


My FANG corp matches on points 1 & 2. Occasionally you'll find people who request 3. However, in my experience, that is by far the exception.

After doing my fair share of interviews and sitting through a ton of debriefs, I'm pretty convinced that getting hired boils down entirely to the loop you get. Some poor bastards get the wrong people -- people who either want to prove their smartness, or simply have outdated ideas -- and that's that. Didn't use inheritance in your system design? Well you got Bob instead of Joe. Bob loves old school OOP and thinks composition is a character flaw. If you got Joe, you'd be in, but... alas.

No amount of "calibration" solves bias.


> the interview process is hard

Nah. I've interviewed at a lot of places where I sort of enjoy the opportunity to demonstrate my soft skills. I can't say I'm looking forward to my FB & Google interviews (quite the opposite). My Amazon interview experience absolutely sucked (not a single question relating to the job, stupid algorithm gotcha questions).

Most companies, IME, are much easier. For instance, there is a large media company which pays competitively with FANG companies, and the only thing I need to do to prep for the interview is memorize the algorithmic cost and backing algorithms of the common C++ STL container classes. For my current job (arguably not the greatest gig), I had to demonstrate my knowledge of Python, C++, kernel programming and networking. I know those things, and enjoyed demonstrating that. I enjoyed talking about my past work. My biggest worry was choosing the button up shirt I was going to wear. That's how interviews should work.


Many interviews are hard, but there's a distinct difference between good hard and bad hard.

My current gig interviews were all deep storage strategy conversations and hard in the sense of needing to show a lot of knowledge articulately. I left there feeling really good if exhausted.

My Google interviews were hard in the sense of embarrassing regurgitation and thinking nonsense on one's feet. Hard was being made to feel inferior.


Hard was being made to feel inferior.

I had an interview experience where the interviewer was looking for a particular answer. I gave him 3 answers to his design issue. Then the interviewer said he was looking for the 2nd answer I gave him, as if I hadn't just said it.

It's one thing to reach and fail. It's another thing to be in a context where the truth of what you know doesn't even matter.


The pattern of "stay silent and look dubious to make the candidate feel dumb" is a really common interview hazing strategy in my experience.

I think the next time I see it, I'm just going to question whether this is how they behave in their job too, even if it means ending the interview right there.


The pattern of "stay silent and look dubious to make the candidate feel dumb" is a really common interview hazing strategy in my experience.

How is this distinguishable from the interviewer hearing something from the candidate where an understandable reaction is to stay silent and look dubious? I've had candidates try to tell me something like: a null pointer member of a struct in C++ takes up no data. The first part of what you described was my natural reaction, not a hazing strategy. (I then went on to ask if they were sure about that.)


I had a Google interview where the interviewer started by saying we will start with a warm up question and then we would go into a real question.

The warm up question was difficult and I was completely embarrassed that I didn't solve it right away. As I walked out I knew for certain that was not a warm up question and was pissed that he had consciously shaken my confidence right off the bat.


The warm up question was difficult and I was completely embarrassed that I didn't solve it right away. As I walked out I knew for certain that was not a warm up question and was pissed that he had consciously shaken my confidence right off the bat.

How do you know it wasn't just a badly written question? Interviewing is hard in the same way that teaching and writing are. It's hard to put yourself into another person's shoes. Never ascribe malice where incompetence is a sufficient explanation.


You are right. It is possible. I did have to write a bit of code to get the work done correctly and he seemed satisfied with the progress, but I may have done it inefficiently and he was encouraging.


I failed a Google interview and it was frustrating. Emotionally speaking, the rejection was hard, as were other rejections.

At least at Google I got feedback; Audi did not. No feedback after a coding project and 2 interviews sucks hard...


This - As an interviewer I hate giving feedback to failed interviewees. It always feels like an insult as well as an invitation for them to defend themselves after the decision has been made (which means I have to say no again, doubling my unhappiness and presumably theirs).

As an interviewEE, though, feedback is essential to improving and definitely reduces the sting of a failed interview. (My sample size of both successes and failures is small but feels huge anyway.)


I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them.

I've been flat-out accused of cheating and bald-faced lying, on scant evidence, when I was clearly doing neither. I very strongly suspect that the interviewers were motivated by their ageist prejudices in that case. I also have a very strong feeling that your intuition about the interview process is correct. I've done a bit of introspection and realized that there is a bit of dark side in me which can come out in situations where I wield power.

Also, and this is probably more specific, there's a bit of a mythos around companies like Google and friends that have tough interviews and high standards: they hire the best people, they're doing the best work, they're the smartest, etc. I can tell you from experience: that mythos doesn't hold up once you're doing your day-to-day job. I've seen people with stunning credentials doing what most would consider exciting work become genuinely bored.

There's a class of real world activities of substance and significant consequence where one encounters, "long stretches of boredom, punctuated by moments of terror." Lots of development also falls into this category. You spend a half year, doing basically nothing but "fixing bugs in a CRUD app." Then, there's a high priority bug where error dialogs pop up for a whole bunch of high profile enraged users who quickly and definitively figure out they're being lied to by the system. The logs and HTTPS proxy captures tell you contradictory information, and you discover that some vital logging information is getting swallowed by lazy coding. Then the logging system goes down, staying down for weeks. On top of that, someone breaks the development debug build. All the while, director level execs are breathing down your neck. I'm not making this up. This actually happened to me.

Most of the time, the job can involve almost nothing, then all of a sudden, you need more than your wits about you. It's for times like that, that one seeks to hire people who can actually apply knowledge, as opposed to those who regurgitated it on an exam then promptly forgot it.


I have been interviewed at Google and I was given a brainteaser.

"Can you estimate Google AdWords revenue for Italy - B2B?"


Get population estimates for all countries with AdWords revenue, get AdWords revenue for each of those countries, fit a linear model, and predict Italy. Or calculate AdWords revenue per capita and apply it.
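A minimal sketch of the per-capita variant, with completely made-up placeholder figures (none of these populations or revenue numbers should be taken as real data):

```python
# Hypothetical per-country data: (population, AdWords revenue in billions).
# All numbers are made-up placeholders for illustration only.
known = {
    "Germany": (83_000_000, 4.0),
    "France":  (67_000_000, 3.1),
    "Spain":   (47_000_000, 1.9),
}

# Average revenue per capita across comparable countries...
per_capita = sum(rev * 1e9 / pop for pop, rev in known.values()) / len(known)

# ...applied to Italy's population gives the estimate.
italy_pop = 60_000_000
estimate = per_capita * italy_pop  # with these placeholders, roughly 2.7e9
```

The linear-model version is the same idea with a regression fit instead of a simple average.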


Er, if you can do that, then why not "get AdWords revenue for each of these countries, read the value for Italy"?


No, you had to give out a number.


How about $5B with a 99% confidence interval between $0 and $1 trillion. That's a number.


Sorry, that's not close enough. Can you do better?

(For the record, the person you originally responded to did not share that question with the intention of you actually trying to answer it. The point was to illustrate how ludicrous some of those questions are. In the interview, you don't have access to any of the information you mentioned in your response.)


“Why would I need to estimate that if I worked at Google?”


“I’d like to understand how you structure problems, we can do a different topic if you like?”

“With ~50M people capable of clicking ads, and assuming they click one ad a day at 20cnt CPC, it would be around 3.6bn. I think my assumptions are on the high side, but order of magnitude is probably correct”

If done well, solving a problem (not a brain teaser) together can yield insight in how people approach new challenges.

If I’m hiring someone to do what they already know, this is indeed not a useful question. On the other hand, if I expect the role to tackle various business problems as well, I’m not sure what would be a much better way to get a sense of this. (Specifically because I’d want to discuss a not-too-complex problem that is outside your usual area of expertise.)
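For what it's worth, the back-of-envelope arithmetic quoted above is easy to make explicit; every input here is the commenter's assumed round number, not a real figure:

```python
# Fermi estimate of Italian AdWords revenue, using the commenter's
# assumed round numbers (not real data).
clickers = 50_000_000    # people in Italy assumed able to click ads
clicks_per_day = 1       # assumed average ad clicks per person per day
cpc_eur = 0.20           # assumed average cost-per-click in euros

annual_revenue = clickers * clicks_per_day * cpc_eur * 365  # ~3.65e9 euros
```

The value of the exercise, if any, is in the decomposition and in knowing which of the three inputs the answer is most sensitive to, not in the final number.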


What I find weird about these questions is that they're expected to be answered without any research. How often do you expect your people to solve a problem in a domain they know nothing about without doing any research? And the given example is odd: a lot of useful information that would help answer the question is public, and the obvious way to answer it would be to look that up first, then estimate the stuff that isn't public (but you won't know what information is publicly available until you look for it).


In the context that I've seen them used, the interviewer usually provides information if asked (and says so up front). e.g. if you don't know the # of inhabitants of Italy it should be completely okay to ask.

In any case I don't think it is about the calculation or the final number. It's really the approach that you develop that's interesting.

I've seen this mostly used in (management)consulting though, where you actually are regularly expected to solve problems that you might not have deep understanding of. When I was interviewing developers I discussed how they would architect an application to solve a specific problem. It gives similar insight and is closer to the actual job.


The problem is that the interviewer asks for a number. If the question was phrased as "How would you go about finding out the GoogleAds B2B revenue in Italy for 2017, which tools and methods would you use", I'd totally go for that question.

The reason is that to many people (myself included) your answer above is nonsensical.

  “With ~50M people capable of clicking ads, and assuming they click one ad a day at 20cnt CPC, it would be around 3.6bn. I think my assumptions are on the high side, but order of magnitude is probably correct”  
Where did the "people capable of clicking ads" 50M estimate come from? Why one ad a day on average? Why not 0.2? Or 15? Why 20c CPC and not 0.001 euro cent? Why not 1USD?

I'm all for "first-principles-based structured thinking," but asking someone to demonstrate that by generating a rudimentary formula that won't survive any contact with reality, and populating it with almost completely random values, seems like a weird way to go about it.


It's typical of people who think that a tiny amount of knowledge makes them experts in anything related to reasoning and data, as a good number of practitioners in this field do.


It's about knowing how to structure problems, whether the components are public or whether the problem space is completely unfamiliar.

E.g. someone who guesses "$3 billion" on only a gut feeling likely wouldn't be a strong hire even if the number is correct, while someone who can correctly decompose the final answer into its components and make reasonable estimates would be a much better candidate.

For estimation interview questions, and problem-solving questions more generally, a "structural" approach to breaking down problems is important because on the job the interviewee will need to be able to identify what levers are available to them.


Oh, that is how they do it. /s


This isn’t a brain teaser. It’s market size estimation, I’m guessing this was not an engineering role?


How is that question different from "please estimate how many golf balls fit in a school bus"?

The only thing changing are parameters.

EDIT: it was for the Search team, search spam prevention, can't remember role it's been 4-5 years, sorry.


If you were interviewing for a job at a company whose main revenue stream involved transporting golf balls in school buses, that might be a perfectly reasonable interview question.


If it was an engineering role I’d definitely call it a brain teaser. But for, say, a marketing role, an interviewer wants to see that you understand the way money flows through the business and what the parameters therein are. Maybe it’s still a brain teaser, but I would say it’s relevant to the job.


If someone is sending emails about golf balls and school busses, it's probably spam. ;)


"Estimating it from the taxes paid to the state, I would say 0€." (Yes, that’s not the kind of expected answer)


Satisfaction from saying this during an interview at Google would be worth not getting the job :)


TRUE STORY:

I interviewed at Google earlier in my career. In one of the interviews, the interviewer asked me to recite/reconstruct off the top of my head the convex hull algorithm. I remembered the lectures in undergrad algo class where the professor talked about it. I remembered where in CLRS it was covered. I remembered the general outline (efficient algorithms are O(n log n), because you have to sort the points as one of the steps). But I was struggling to remember the details, because, you know, I hadn't worked on that sort of thing in several _years_.

So, the interviewer started pacing around the room, impatient that I couldn't remember the details (or recite in 10 minutes what it took a professor with notes 2 lectures to go through). He told me finally "We expect Google engineers to be able to solve problems. What if you were a Google engineer, and you had to solve this problem, what would you do?"

There is, of course, only one right answer to that question, which I provided: "I would look it up in a book. I know exactly where to find it." I wouldn't spend time trying to reinvent an algorithm that someone else had already thought of.

The interviewer did not like that answer. I did not get a job offer. Now that I interview candidates at a different large tech company, I make a point to not get hung up on things that are easily researchable.
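(For the curious, here is the kind of thing the interviewer apparently wanted recited from memory. Andrew's monotone chain is one standard O(n log n) convex hull algorithm: sort the points, then build the lower and upper hulls in two linear scans. A sketch, which is exactly the sort of detail that's easier to look up than to reconstruct under pressure:)

```python
def cross(o, a, b):
    # Cross product of vectors OA and OB; positive for a counter-clockwise turn.
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    # Andrew's monotone chain. Sorting dominates: O(n log n) overall.
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower = []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    upper = []
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # Omit each chain's last point: it's the first point of the other chain.
    return lower[:-1] + upper[:-1]
```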


> "What if you were a Google engineer, and you had to solve this problem, what would you do?"

Well... I might ask one of the other people on the team (or in the company) for some help. Because I'm probably going to ask one of them for a review (or someone else will review it anyway). No man is an island, etc.

What did Sergey and Larry do to solve these problems? They hired more people!


They consulted existing literature, they asked in internet forums, etc

Expecting people to know and remember beyond the basics is simply ridiculous, especially in the age of Google.


> We expect Google engineers to be able to solve problems.

Sounds more like he expects Google engineers to have perfect recall, rather than to be able to solve problems.

That's just a trivia question, nothing to do with problem solving.


If I wanted to have a job that required perfect recall, I would have become a doctor. Maybe I should have if that is what this industry is going to require.


As an interviewer I always advocate for coding problems to be done at the computer and to allow internet searches. On the rare occasions someone does a cut-and-paste solution, I then modify the problem to see if they understand the solution enough to make the necessary change.

It's not perfect, but it's about as close as you can get to the actual work experience they're interviewing for.

I interviewed at Facebook recently and was shocked when they used Coderpad but did not allow code execution. Threw me for a loop, because I like to decompose problems and test that everything is still working as I add to it, something this did not allow me to do. I consider that a GOOD quality, why prevent me from demonstrating it?


I do that; I ask candidates to code in notepad instead of an environment where you can compile, and it's intentional.

Failures come in three types: bad understanding (failure to understand the problem), bad planning (a plan that doesn't actually solve the problem), and bad implementation (failure to implement the plan correctly). The third type includes typos and off-by-one errors that I don't want the candidate to get hung up on. I might ask about an issue if I spot it, but if not I'd rather keep going on the more relevant parts of the solution.

Also, you can leave part of the solution in pseudocode without getting distracted by a bunch of squiggly underlines, and handwave away parts of the problem that we don't care about.


> "What if you were a Google engineer, and you had to solve this problem, what would you do?"

"I would Google it", and this time the answer is very corporate! Still not what the interviewer expected, but it would be funny to watch him argue that his employer's main product is not qualified for the task.

Besides, I find it silly to ask for details that are easily found in printed material, with less chance of error than trying to remember them.


He told me finally "We expect Google engineers to be able to solve problems. What if you were a Google engineer, and you had to solve this problem, what would you do?"

At that point - he's talking to you as if he thinks you have the understanding of a child, basically.

There really is no positive recourse available other than to politely end the interview, and chalk up another day of your life (plus N days spent preparing) that you'll never get back.


>I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them. It's conceivable that people with latent meanness, or as this abstract calls it tendencies for "narcissism and sadism" who aren't aware enough of their biases to try to counteract them, will use these sorts of questions to make themselves feel better.

Also add another factor - Google's approach is (or at least was), in the words of Google's big HR guy: "A good rule of thumb is to hire only people who are better than you." (https://www.businessinsider.com/how-google-hires-exceptional...) Such an approach naturally brings out the worst in the people conducting the interview - their high opinion of themselves. Mix it in with the stuff you mentioned - buried frustration, latent meanness/"narcissism and sadism" - and you have the perfect "Google interview" :)


> I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them.

I think it has more to do with impostor syndrome and a fear that a new person will expose them as the frauds they perceive themselves as.

In my experience, sometimes that fear is justified. The people whom I've run across who are the most condescending to engineers who "haven't proved themselves" often have the most lax coding standards on the team. They're often the ones who build Rube Goldberg programs and who never document their work - you can't be replaced if nobody knows how your core programs work.


"I have a pet theory that when people interview they bring this buried frustration into the room with them and use the interviewing process to play the part of the people they feel humiliated them."

This also explains when people chew out service workers for petty things.


I think that has more to do with the massive power asymmetry, because a service worker is generally expected to smile and take it. They don’t represent someone who humiliated the angry customer, they represent a punching bag.


Completely off topic, but 100% this is why people are awful to service workers.

I worked retail management while finishing post-grad work to pay bills. People who were loud and awful always had something else in their lives that was making them mad - broken down car, not much money, shit job, etc - and they used the retail employee as a punching bag.

I never saw one person who was loud have a reason to be loud other than that. In 10 years.


> brainteasers (which, for the record, are not used at Google)

Out of curiosity, how is this controlled?


Interviewers write up questions asked, candidate solutions, and their commentary and hiring recommendation. Those writeups are sent to a hiring committee which reviews all the candidate's feedback and makes a hire/no hire recommendation.

If the committee sees a brainteaser, disagrees with a question, spots inconsistencies in the feedback, etc., they moderate their use of the feedback or ignore it altogether. There's also a mechanism to send feedback to the interviewer, which they can use to let them know a question is inappropriate, banned due to having leaked, etc.


Thanks, that's interesting. It must be a little bizarre doing interviews in a place where lots of candidates specifically prepare for your interview process, and some presumably try to game it.


I recently interviewed at Apple and one interviewer caused me to decide to never work there. I was put into a conference room and had 8 people over 6 hours come in and interview me one after another. Each person asked me about my background and then jumped into a whiteboard coding problem. Most of the interviewers were understanding of the fact that writing code on a whiteboard is nothing like writing code on a computer, and were helpful in pointing out simple mistakes that a compiler would have caught.

While writing code for the last interviewer I had to do some simple division. I came up with the wrong answer and told the interviewer that I wasn't sure if that was correct, and to let me know if it was incorrect. He blankly stared at me and said he wouldn't help me. Usually I would be able to do the division in my head, but after 6 hours of constant interviewing my brain was starting to turn into mush. I went on and finished the solution and he said that there was an issue but wouldn't tell me where. I asked if it was an issue with the division and he said yes. He still wouldn't tell me the answer, so I floundered for a bit while trying to remember how to do long division.

At one point I turned to him and said "I'm sorry this is taking me so long, it's been a very long day" and he replied "sometimes you will have to fix your code on the spot". It seemed like he was trying to see if I would crack under the pressure. Eventually I came up with the correct answer and was able to finish the problem, but it completely drained me.

I decided then that I never wanted to work with that person, or with other people like them.

Some jobs require performance under pressure, but programming is not one of them. The only thing that the interviewer was evaluating was my ability to do math problems in my head after being interviewed for 6 hours. Don't they have calculators at Apple? Or phones, perhaps?


> Some jobs require performance under pressure, but programming is not one of them.

In fact, programming under pressure is the leading cause of programming disasters. It's the exact opposite of what a rational employer should want.


Not to defend that interviewer, but sometimes it's inevitable that you do have to code under pressure. I've been in situations where the whole system is collapsing, thousands of dollars are being lost every second, and I have to push a patch to production as quickly as possible. Not that I would interview for such a situation.


You should probably roll back the commit that broke production though, instead of hastily pushing something out hoping you'll fix it.

And ideally you should've been doing a gradual rollout, deploying on a few machines first then slowly ramping it up.
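One common way to implement that kind of gradual rollout is deterministic hash bucketing, so each user stays consistently in or out of the new code path as you ramp the percentage up. A sketch (the function name and scheme here are illustrative, not any particular platform's API):

```python
import hashlib

def in_rollout(user_id: str, percent: int) -> bool:
    # Deterministically assign each user to a bucket 0-99 by hashing
    # their id; the same user stays in (or out) as `percent` ramps up,
    # since a user in bucket b is included for every percent > b.
    bucket = int(hashlib.sha256(user_id.encode()).hexdigest(), 16) % 100
    return bucket < percent
```

Ramping from 1% to 10% to 100% then only ever adds users, which makes a bad deploy visible on a small blast radius before it hits everyone.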


And if you are in that position, it is because you have the most (or enough) context to be able to solve the problem.

In an interview situation, it is like being asked to fix a problem with a skyscraper having never seen the interior before. Interviewing is low-context. You cannot test whether someone will fail under high-context pressure in an interview.


"I've been in situations where the whole system is collapsing, thousands of dollars are being lost every second"

That's the situation you get into when you code under pressure.


No jobs are performed best under pressure, or at least not past the body's capacity for a brief surge (adrenaline, etc.). But it happens in every occupation, because overall more will be produced by pushing people than by letting them pace themselves...


> programming under pressure is the leading cause of programming disasters

I don't want to straw man you, but it seems like hiring someone who is demonstrably better at programming under pressure would mitigate the inevitable situations that force employees to program under pressure.


> mitigate the inevitable situations

Conversely - the person who feels like he's good at programming under pressure might consider it no big deal and not bother to mitigate any potential "programming under pressure later" possibilities.


I will defend about 20% of the interviewer's behavior here. I've had candidates try to negotiate away requirements or get me to give them part of the answer. They'll say something equivalent to, "Hey, could this method I'm supposed to implement have an extra parameter that makes everything easy?" Sure, any problem can be made easy by taking away the hard part, but we're not doing that. Or they'll say, "So what data structure would you use?" That's for you to solve.

From time to time, I ran into candidates with a crazy attitude like this, so I learned to stand my ground. It was an unfortunate and surprising necessity. The interviewer might have been doing this but going overboard with it. (Or maybe they were just a jerk.)

Now the 80% of the behavior I won't defend:

1. An interviewer should have the soft skills to stand their ground without being harsh and implying that you suck.

2. An interviewer should know the difference between minutiae and important stuff. Partly for time management reasons during limited interview time.

3. Even if an interviewer is rough around the edges sometimes, they should understand that in this context they are the face of the company. And it is stressful and tiring for the candidate. So they should rein it in and be understanding and nice.

4. The company should be in control of their own interviewing process, at least enough that failures like this aren't common. Maybe an interviewer needs some guidance to improve, or maybe some people aren't cut out to do interviews, but they shouldn't be just left to their own devices.


So you met 8 people, and one of them was a jerk? I'd say that's a pretty acceptable jerk coefficient for a workplace. I've been in places north of 0.5.


The right number is zero and a UBI until they figure out how to stop being jerks. (My primary personal reason for supporting a UBI is I want to be able to say "No rational employer should employ you" without that implying "so you should be homeless and starve.")

But short of that, the right number is zero on the interview circuit. You can employ jerks if you have to, but don't put them in more positions where they have to interact with others than necessary. Give them the legacy products where you can't staff a team of more than one qualified person anyway. Give them an office. And keep the door closed.

Remember that a jerk is basically a 0.1x developer - an employee who causes other employees to lose productivity by spending time working around the jerkiness, taking a morale hit, etc. Even a remarkably productive jerk at best only compensates for their negative effect on people they work with, and most jerks are not consistently remarkably productive. https://medium.freecodecamp.org/we-fired-our-top-talent-best... is a good example of a company learning that lesson a little too late. A company that signals that they trust their jerks enough to make hiring decisions and to represent the company's culture to potential new hires is a company that signals they're actively cool with jerks (or incapable of noticing, which has the same effect).

If a jerk makes a good employee unhappy and they leave, the jerk is a net negative. And if you keep employing the jerks, you'll find yourself with a 0.5 jerk coefficient on your hands very quickly.


Just hiding how many jerks a company has until after the interview process is complete is not effectively any better - it could even be worse for people, as they could have had signal to avoid the jerk. The absence of evidence of some behavior or signal is not equivalent to it not being present at all.

In my experience, most companies have jerks - I’m not convinced it’s even avoidable reliably, as people come from many walks of life, and sometimes the walks form people to be jerks.

I work at Apple, where I’d say that jerks are a very low percentage of the people I’ve interacted with (maybe a couple hundred employees so far). I’ve seldom seen a company with fewer jerks percentage-wise, much less one with processes that help de-escalate arguments and deal with them quickly.


Your jerk tolerance is higher than mine. I really really don't like to work with people like that.

There were other reasons why I decided that Apple was not for me, but the jerk interviewer definitely tilted the scale from "yes, I could see myself working there" to "no, I will never work there".


If 12.5% of people in an office were jerks, I'd consider that a very toxic culture.


In my opinion the entire process was a jerk. An entire day of whiteboard problems? What a waste of time for everyone involved.


They were a jerk in a situation where both parties are supposed to be on their best behavior. Can you imagine how it must be dealing with them on a day-to-day basis?


That they allowed that person to be a representative of the company when someone is interviewing the company to decide if they'd want to work there... that's a bad PR move. Yes, the company extends an offer on their end, but really, both parties are making decisions. If they can't put their best foot forward.... that's a problem. If they think this was their best foot forward, that's another problem.


Apple interviews are team-specific, so I'm not sure that the existence of a single jerk on a single team's interview loop should reflect on a company with hundreds if not thousands of teams and tens of thousands of engineers.


Division. Damn. That's what a calculator is for. I would just whip out my phone to do it.

Humans have evolved to leverage tools to augment intelligence. If he can't understand the advantage of tool use, he's not a good engineer.


Humans attach prestige to solving problems the old school way. We have ladders but watch pole vaulting at the Olympics.


> Don't they have calculators at Apple? Or phones, perhaps?

He must have been from the iPad dev team.


> I decided then that I never wanted to work with that person, or with other people like them.

sounds like the interview was a complete success.

As a candidate, keep in mind the most important thing you are going to get out of an interview is to assess the job and the company. They need to make a quick decision (employ you for years based on a few hours interview), and so do you.


> The only thing that the interviewer was evaluating was my ability to do math problems in my head after being interviewed for 6 hours.

Although I agree with you about the (ir)relevance of the task, I disagree with this particular move you're making when other people do it.

Typically when people want to trivialize test results they don't like, they say it's over-fitting to the immediate situation (like IQ tests "don't measure anything intrinsic, they just measure how well you do IQ tests"). Unfortunately this criticism can be logically generalized across any statistical or empirical measure whose job is not to let people into the job/tell the truth, but minimize the impact of something bad happening if the test returns a false positive, or the opportunity cost if it's a false negative. This means two things:

(1) Because the chance of error always exists, there will always be people who feel cheated by the process; yet on the other hand, it is supremely difficult to come up with good measures of anything that is too costly to observe directly, no matter what the domain. Hence the sour grapes that claim over-fitting.

(2) For software companies like Apple, they can probably afford to have bullshit tests to weed out applicants, because the alternative of having to place each applicant through a probationary period for three months is quite costly, if we're willing to refute all tests as being narrow or irrelevant indicators in the manner above.

With Apple, it's still possible that they have an internal model for hiring that claims to have predictive power that relies on this particular task as a proxy in conjunction with all the other ways that they've tested you, before they move you over to a probationary period. But as the article pointed out this is also unlikely since the effectiveness of interview practices is understudied in a formal setting.


I had a similar experience with a Google phone interview once.

The interviewer asked me about a technology I knew almost nothing about, and I told him right off the bat that I didn't know much about it but I would explain what little I did know.

But then he spent the next 20 minutes asking me more questions about it, how I'd use it, how it could be implemented, etc.. It was so frustrating I should have just ended the call early.


I just become unable to do math during interviews. You know how people hate public speaking, because they feel super nervous, sweat a lot, and have trouble saying what they want? That, but with calculus you haven't touched in 12 years.

If they gave me a piece of paper and let me do it without interruption, I could do it, but generally interviewers will sit there and demand you explain what you're doing.


> and he replied "sometimes you will have to fix your code on the spot"

You should have told him that "every time, you get to use a calculator". If I managed interviews, I'd have 1 person interview 1 person in a full-technical interview, without the personal questions, with 3 additional interviewers watching behind a one-way mirror. When the interview was over, I'd go in myself, and ask the candidate how well they thought they did, and how well they thought the interviewer did, with the 3 other interviewers still watching.


I came up with the wrong answer and told the interviewer that I wasn't sure if that was correct, and to let me know if it was incorrect. He blankly stared at me and said he wouldn't help me.

Meaning: presumably he's trying to be helpful, and give you a perfectly accurate representation of how people at his company interact when trying to solve problems together.


Long division under pressure? This sounds as dumb as Password Swordfish.


[flagged]


I don't believe your nitpick is valid here. True, sometimes one has to code under pressure, in the sense that money is lost (or, at least not made) while things are down, but there's almost never a situation where you can't stop and take an extra 20 or 30 minutes to check to see that your code would do what you expect it to do. In fact, if the worst consequence of waiting 30 minutes to deploy a temporary fix is that some more money gets lost, I would argue you should almost always do that rather than deploying code that might have even worse consequences.


> but there's almost never a situation where you can't stop and take an extra 20 or 30 minutes to check to see that your code would do what you expect it to do

The implication in these examples is that you're doing it on your own, always, and... that's generally bad, especially with a known bug. Most of the bugs I have created have been ones where I've specifically not talked through the logic with someone else (usually a client/stakeholder, or domain expert, or whatever) to determine that my understanding is correct (might need more tests, or to adjust existing tests to reflect a better understanding of the issue, etc.). These "emergency bug fixes that must be dealt with on the spot in minutes!" scenarios are more likely the cause of the issues in the first place.


I suppose we've been exposed to different parts of the software elephant.

When I say money is lost... it can potentially be a lot of money. Can you imagine what 10 minutes of downtime could cost Amazon, for instance? What about streaming the superbowl? Credit card payment processors? Even something simple like the test harness for a production line may mean that an entire shift of assembly workers now has to stand idle or sweep the floor. When your company is losing thousands of dollars per minute you will understand what I'm talking about.


I understand the concept of multiplication. In such a scenario, it is even more imperative that the fix you make (temporary though it may be) actually fix things. Take the time, get all the help you need, and get it done; don’t just commit code and pray.


There are times when you need to think about the fix, and times when the problem is more obvious. If possible, I'd hack in a fix and, once the pressure is off, go back and validate it more thoroughly.

For instance, my company had a very important demo of our new system on Monday. Friday night my friend was tearing his hair out in the lab trying to find the cause of some network corruption. We had a race condition in the kernel between our network driver and inter-processor communication driver. We're a uniprocessor system, so I suggested just using an if-check to see if we were in the IPC code when the network bottom-half handler hit. It was a disgusting hack but we made the demo and no one had to work the weekend. The following week, my friend crafted a correct solution which was robust even if we enabled more cores. It's not the high-pressure straw man I mentioned earlier, but it's fresh in my mind and represents the sort of on-the-fly hacks that often need to be made to allow businesses to make money.


Next time just tell him you are a Millennial and you never heard of long division before.


So you will never work at Apple because you had 1 bad interviewer out of a sample size of 8? That seems foolish.


As a programmer I have the ability to be selective with the jobs that I take. The first 7 interviewers did not convince me that Apple was a good place to work. The last interviewer convinced me that I didn't want to work with them or people like them.

Of course, things change. It would be difficult for Apple to change my mind about working there, but it is not impossible. Ask me again in 10 years :)


My point was that Apple is a large company with many teams. I didn’t understand why OP would write off an entire company because they didn’t like 1 team.


Stay hungry, stay foolish. Someone said that.


I interviewed a full day at a mathematical hedge fund. Not exactly brainteasers, rather a barrage of good, solid math and stats questions. Such a day is exhilarating (like skiing the Olympics) but truly exhausting, and only partially related to whether one gets an offer.

My advice to anyone doing this would be to give up the last 10% of performance in order to focus some on contextual awareness. These puzzles are a form of pre-surgical anesthesia, before they ask the real questions. You need the spare cycles at the end of a long day to imagine what casual conversation might really be exploring, how other applicants might misread the subtext, what mistakes in previous hiring they're trying not to repeat. One could be a genius trained seal, by buying into the brainteaser game, and miss the broader question of how you will fit into their firm. One who sees broader contexts ends up running places; don't make it clear that's not you.


Would you say the questions that fund asked you are relevant to the work you do (or would be doing) there? I think that's the core criticism most people levy against the "high tech" interview questions.


People that don't get the offer tend to complain about such things. They are asking questions to measure how you think, to measure your potential, to see how you handle stress. If you are stressing over an interview, what will you do when you lose $1 million in 10 seconds?


Yes, people who get the job are less likely to complain about the process. I don't see that that vindicates the way it was performed.


If you are stressing over an interview, what will you do when you lose $1 million in 10 seconds?

Beats me - but the "skills" demonstrated in the interview process you're defending -- "suck up; just keep going online until you memorize enough canned answers to most of these dang questions; suck up, suck up, suck up" -- certainly aren't going to help.

If anything, they're basically orthogonal to the set of competencies needed to deal with that kind of a situation.


Beats me - but the "skills" demonstrated in the interview process you're defending -- "suck up; just keep going online until you memorize enough canned answers to most of these dang questions; suck up, suck up, suck up" -- certainly aren't going to help.

Right. People who don't think of Comp Sci knowledge as a set of substantive first principles, just as meaningless tokens they need to memorize and regurgitate to get the job -- those are precisely the people who should be weeded out. Each one of those questions is not merely a chance to "collect" an arbitrary thing. It's also an opportunity to practice applying basic knowledge and techniques.

If anything, they're basically orthogonal to the set of competencies needed to deal with that kind of a situation.

You may well need to spot a deadlock or a race condition in code. You may well need to spot code that is inefficient by construction, in precisely that kind of situation.


People who don't think of Comp Sci knowledge as a set of substantive first principles, just as meaningless tokens they need to memorize and regurgitate to get the job -- those are precisely the people who should be weeded out.

And yet - those are the people who are not weeded out, but selected for by the default interview process.


Certainly, there are lots of problems with interviews.

That said, the interview should be designed to specifically weed out the ones who hope to pass by regurgitation. The answer can't be just a pattern match for a particular algorithm. The emphasis shouldn't be on dazzling recall of something obscure. The purpose should be to look for actual experience implementing something.


You may well need to spot a deadlock or a race condition in code.

The thing is - if your production system really ends up going haywire and losing $10 million every second or whatever, then the root cause isn't some race condition. It's the whole human and organizational process (or lack thereof) that allowed that defect to slip in undetected (and in a position to cause so much damage without proper safeguards) in the first place.


> You may well need to spot a deadlock or a race condition in code. You may well need to spot code that is inefficient by construction, in precisely that kind of situation.

How is that relevant to the practice of "memorizing enough canned answers to most of these dang questions"?


How is that relevant to the practice of "memorizing enough canned answers to most of these dang questions"?

It's not. Instead, ask something not particularly tricky, a bit open ended, and just a little bit involved (at least 3 things interacting) with the purpose of seeing if basic skills can be applied. Instead of looking for an answer that can be memorized, you're looking for evidence that people have actually designed, implemented, and debugged something at a level beyond gluing libraries together.


Sure I get the gist, I've heard this reasoning quite a lot and I'm not condoning or condemning it. I was specifically interested in hearing if the OP thought the math they asked him about was relevant to the math actually done on the job. In my experience, it isn't really (except insofar as it might be the same field, from a 30,000 foot view).


I think it's more about measuring enthusiasm for problem solving. If you sulk and roll your eyes at a difficult math problem, what will your demeanor be like every day in the office when your job involves quant analysis?


I'm suspicious of any practice whose proponents offer multiple, incompatible rationales.


That answer doesn't really hold up because the people coding the system aren't the ones doing the trading, and most jobs that use similar "stressful" interviews don't have anything like that anyways.


that doesn't justify asking questions that are extremely poor predictors of work performance


> They are asking questions [...] to see how you handle stress.

I've got a time-saving suggestion - instead of asking questions, measure how long interviewee holds up while being waterboarded. If they can handle that stress, they can handle losing $100 million in 10 seconds, easy.


What was the contextual awareness here? Are you just talking about personality and cultural fit or is there something more? You’ve had a go at this but I’m looking for a concrete example.


Where did you find examples to practice? I'm horrible at these types of questions but would like to get better.


Depending on the hedge fund (I'm thinking of a few in particular), there isn't really a route to practicing these problems just to get better at the interview. You'll be asked to talk about your own research in the domain and you'll work on re-deriving (very small parts of) the research of other members of the team with guidance and from first principles. Regardless of applicability to the job, if you're asked to prove a theorem on the board you either know it (and have known it for a while, and in depth), or you don't. In that sense they're not brainteasers to practice a particular formula for, like solving an algorithms question or probability estimation.

That being said if you're just talking about mathematics and statistics problems the more mainstream hedge funds like to ask their quant candidates, read Crack's Heard on the Street, Joshi-Denson-Downes' Quant Job Interview Questions and Answers, and Zhou's A Practical Guide to Quantitative Finance Interviews.


thanks


This reminds me of Jewish Problems[0], designed to "prevent Jews and other undesirables from getting a passing grade" at the entrance exams to the math department of Moscow State University.

[0] https://arxiv.org/abs/1110.1556


Wow, I never knew they actually published a paper on it. Coming from a Moscow math school that traditionally had a lot of Jewish students, most of whom went on to MSU to study math, I remember studying these exact problems - although in my time, this filter was supposed to have long been gone.


As a self-taught programmer that has had to go back to get "cs fundamentals", half of interview questions feel like that - if you just know about x or y data structure for this particular edge case, everything becomes simple/easy.


This was discussed in Edward Frenkel's book Love and Math.


For a second there I thought you were talking about the interview process for becoming a Jew when they weed out undesirables.


My issue is more with tricky algorithmic questions. I have faced interview questions where you either have to know the trick beforehand or there is NO way you can come up with a solution in an interview. I never understood the point of these questions, because these esoteric hypothetical problems will almost never come up in real life. They don't test someone's skills, because they don't follow any standard algorithmic problem-solving procedure. They mostly test whether you know the trick or not, which is not a valid metric of skill.

One famous example of this type of question is finding a loop in a singly linked list. Now almost everyone knows the trick, and thanks to that, interviewers have stopped asking questions like it.
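For readers who don't know the trick being referenced: the question is usually solved with Floyd's tortoise-and-hare algorithm. A minimal Python sketch (the `Node` class and names are illustrative, not from any particular interview):

```python
class Node:
    """A singly linked list node."""
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head):
    """Floyd's tortoise-and-hare: advance one pointer by 1 step and
    another by 2; the two meet if and only if the list has a cycle.
    Runs in O(n) time with O(1) extra space."""
    slow = fast = head
    while fast is not None and fast.next is not None:
        slow = slow.next
        fast = fast.next.next
        if slow is fast:
            return True
    return False
```

The trick is exactly what the parent describes: obvious once you know it, nearly impossible to invent under interview pressure if you don't.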


Most algorithms questions only prove one thing: whether a candidate has brushed up on algorithms beforehand. That has some usefulness, being a proxy measure for conscientiousness, but offers little beyond that. Maybe that they’re able to follow the reasoning, but it could just as easily be learned by rote and parroted back. The fact is that no one comes up with a from-scratch efficient algorithm for doing anything in 5 minutes, the candidate either knows it going in, or they’re screwed.


I've been a C++ and Java programmer for 12 years and not once needed to use algorithmic complexity analysis on the job. The fact that so many places focus on this baffles me. Big-O questions should be reserved for system architect positions only. Out of all the arbitrary topics, I think they should focus on pedantic syntax questions for whatever language they use. Show snippets of code and ask "what's wrong with this picture?". That would at least select for people with better language mastery and, in theory, faster development times.


I don't really believe this. You use algorithmic complexity every time you choose between a HashMap and an Array. I'm a javascript developer, and I make that kind of choice several times a day.

Having said that, I've never once needed to know, off the top of my head, the kinds of questions that they ask in these interviews. And I highly doubt Google engineers often need to either.
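The point about choosing structures by complexity can be made concrete with a toy example. A hedged Python sketch (with a plain scan and a set standing in for the Array-vs-HashMap choice), showing the same task at O(n^2) versus expected O(n):

```python
def has_duplicates_quadratic(values):
    """Array-only approach: compare every pair, O(n^2)."""
    for i, a in enumerate(values):
        for b in values[i + 1:]:
            if a == b:
                return True
    return False

def has_duplicates_hashed(values):
    """Hash-based approach: one pass with a set, O(n) expected."""
    seen = set()
    for v in values:
        if v in seen:
            return True
        seen.add(v)
    return False
```

Both return the same answers; on a list of a few million items only the second finishes in reasonable time, which is the kind of everyday choice being described.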


Since Google is such a well-spring of computational resources and has incredible scalability requirements, I'd question your claim that its engineers don't use algorithmic thinking regularly.


He is saying they do need to care about algorithmic complexity in choosing data structures or implementations, but rarely need to figure out really clever algorithmic tricks (of the sort which tend to be tested by harder algorithmic questions, like "find whether there is a loop in a singly linked list", which relies on the running-pointer technique you never need to use in real life).


If you didn't know the answer to the question, would you have been able to come up with it on your own, from scratch?

Merely practicing interview questions without pondering them produces the false impression that they're useless trivia, when actually deciding to come up with the answers to these questions and struggle with the clues inside them which produce the answers show a much richer conceptual tapestry for data structures that goes beyond memorizing tricks.


The test lost its validity over time, but how many people care about their craft enough to know how to write software without using a library, and are willing to reason about it from first principles, or with a systematic method? They would be capable of doing original work. The rest I'd give CRUD tasks to.


An algorithms question proves today what it proved 15 years ago: that the candidate learned the algorithm. It does not and never did prove that the candidate spontaneously generated the solution in the interview. In other words, it isn’t an intelligence test. It isn’t even a programming aptitude test, but it is a conscientiousness test because the candidate went to the trouble of learning many algorithms successfully. That does count for something, but not necessarily what the interviewers expect.

If you want to test for programming aptitude, give the candidate a realistic but difficult real world problem and let them solve it in an IDE.


Do substantive enough work, give it enough time, and first principles will still bite the butt of the CRUD app maintainer.


If I was asked to do some comp-geom problem involving convex hull algorithms or intersecting lines, I would be boned. I haven't touched any of that kind of thing in years. Even something simple like a quicksort or Dijkstra's algorithm would be a challenge for me to code off the top of my head. I shouldn't have to study for a job interview; I should be interviewed on things that I do on a daily basis.

The thing with a lot of these algorithms is that they seem so obvious and simple when you know the answer, but they still took a very clever person to discover them in the first place.

A good interview question should be a question where the algorithm to implement is trivial (e.g. binary search), but the application is novel.

A good algorithm question I was asked a while ago was to reverse a string in place ("hello world" => "dlrow olleh"). Then after completing this task, I was asked to reverse each word, but keep the words in order ("hello world" => "olleh dlrow").

Neither of them is a difficult algorithm. There's no specialist knowledge required to implement them, but there is a degree of logic and problem solving required.

The problem itself wasn't meant to be hard (although apparently a lot of candidates completely failed at it). It was a vehicle for me to discuss how I went about solving problems. It also ended up being a good discussion about why unit testing makes life a lot easier. My code itself was incorrect at a couple of points due to off-by-one errors, but the interviewer wasn't worried at all, because he was aware that it was a whiteboard, and these kinds of errors would be picked up in a real coding environment with testing very quickly.
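For reference, one plausible take on the two questions as posed. Python strings are immutable, so the "in place" part operates on a list of characters; this is a sketch, not the interviewer's expected answer:

```python
def reverse_in_place(chars):
    """Reverse a list of characters in place with two converging indices."""
    lo, hi = 0, len(chars) - 1
    while lo < hi:
        chars[lo], chars[hi] = chars[hi], chars[lo]
        lo, hi = lo + 1, hi - 1
    return chars

def reverse_each_word(s):
    """Follow-up: reverse the letters of each word, keeping word order."""
    return " ".join("".join(reverse_in_place(list(w))) for w in s.split(" "))
```

Note how easy it is to pick up an off-by-one in the index handling, which is exactly the class of error the interviewer above sensibly discounted on a whiteboard.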


Exactly. An on-the-job developer who hand-rolls some tricky but well known algorithm from scratch is being foolish. It would be much more productive to see how good their google-fu is.

Do they know which algorithm families to look up? Do they know which libraries in the given language will likely have the algorithms already available? That's what a real developer needs to know so why not interview for that.


I don't know the loop question. Do you have a formulation where it's not obvious but realistically possible to solve?



I could not find it, but there was an old blog post complaining about this question because it was an open topic in computer science for more than ten years, and if one is not familiar with it, they are essentially asking the interviewee to be smarter than all the people working on it in the 1960s, or to be an actor and pretend to figure it out in 30 minutes or less.


I had an interview today and was asked a simple problem, which I immediately answered with a linear-time solution. The interviewer then went on to say "no, I'm looking for a specific answer which involves a trick".

Literally said those words.

Eventually figured it out but I came away with a bad impression of the company. Not the kind of stuff I judge people on.


“Are most of your day to day problems solved by rote memorization of esoterica?”


I had a similar thing happen when interviewing for Shutterstock 2-3 years ago. They wanted me to implement a Fibonacci sequencer. I had, of course, forgotten the mathematical definition of the Fibonacci sequence, so I went to Wikipedia to look it up. I was told I couldn't use outside resources, then was told to think through what that sequence does, and struggled a lot. Finally got through it with a really terrible recursive implementation, because the interviewer kept interrupting and correcting me over the course of the interview, which took about an hour. I asked the interviewer where they put my skill level and, I shit you not, they said entry level to junior. I'd been programming for 10 years at that point, and was significantly qualified for the position.

After the interview was over, I looked up a constant-time formula based on the golden mean (from Wikipedia) and implemented a constant-time algorithm to solve it. It took 4 lines of code and 10 minutes to do (including a test suite). I sent it in via email, and basically told them why their hiring process is broken and insulting to candidates.
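The closed-form approach described here is Binet's formula. A sketch for the curious (a "constant time" label assumes exponentiation counts as O(1), and floating-point rounding is only exact up to roughly n = 70):

```python
import math

def fib_closed_form(n):
    """Binet's formula: F(n) = (phi^n - psi^n) / sqrt(5), where phi is
    the golden ratio. Exact only while float precision holds (n <= ~70)."""
    sqrt5 = math.sqrt(5)
    phi = (1 + sqrt5) / 2
    psi = (1 - sqrt5) / 2
    return round((phi ** n - psi ** n) / sqrt5)
```

Whether this is what the commenter actually sent is unknowable, but it matches the "4 lines of code" description.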

I didn’t get the job, thank god.


would you be able to share the question?


"Well, you're not hiring a trickster."


I don't work in tech, I'm an engine mechanic by trade, but I'm fascinated to know why and how often this happens in a tech interview. Do programmers/sysadmins really endure this kind of aggravation in their interviews?

The hardest question I've ever asked one of my shop mechanics during an interview was how does Exhaust Gas Recirculation (EGR) in a Diesel engine affect thermal efficiency, smoke output and emissions of the engine...and that was for a full-time diesel tech at our cross town location.

I'd imagine if I ever started throwing out manhole-cover questions or underwater-airplane bullshit, I'd catch some hands before I managed to hire anyone.


As a long time tech person, it wasn't always like this at all. The brainteaser crap spread out from the likes of Microsoft, followed by the whiteboarding a little later.


Think of the type of person that would suffer through this type of indignity. And the type of person that gleefully puts people through this kind of hazing.

Sounds like perfect formula for building toxic workplaces full of people with personality disorders.


That's why I'm not a software developer any more, even though I greatly enjoy the creative aspects of the work.


No, that's not entirely right. Many of us that hate it and don't want anything to do with it still endure it, because the alternative is not getting a job.


It happens in part because large, very successful organizations adopt it and then it spreads as a cultural practice. Microsoft (way back) and Google (back, but less way) were infamous for this. It'd be like if you heard that at F1 teams, mechanics have to disassemble and reassemble a vintage carburetor blindfolded, and you decided to try it with your applicants.

There are also lots and lots of places, large and small, who never adopted that sort of nonsense.


IME, it happens more at 'pop-tech' companies where engineers are lining up for jobs. At normal tech companies where they're just trying to solve problems and make money, it seems to happen a lot less. The latter companies generally just want to see if you're not an idiot, that your resume is correct, and that you're not an asshole.


It happens because the types of problems software engineers face are random and require active engagement and creativity to solve.

There's no book that you can read that says "if the program crashes, try these things". Yes, there are basics, but there's no real formula that always leads to the solution. Each program is unique enough that you have to study the internals of how it works and then test hypotheses about the root cause.

In engines, they all follow the same predefined models. Software does have "models", but most people don't use them, or they blend models together and don't document how they work. So we end up having to solve many problems on the fly. Not everyone can do this (crucially, even some people with CS degrees), so to avoid hiring people who can't, companies try to test for the ability during the interview. This is fraught with an entire slew of issues, some of which are non-technical people asking the questions, applicants sharing the answers and gaming the system, or, as the article says, sadistic interviewers who enjoy the feeling of power. To date, nobody has found a universally effective system for hiring good programmers.


There's no book that you can read that says "if the program crashes, try these things".

Here you go: http://www.network-theory.co.uk/valgrind/manual/


'There's no book that you can read that says "if the program crashes, try these things".'

You are talking to a mechanic. This makes up a huge fraction of their job. On the flip side, cars usually come with design documents, testing tools, and standards. Software only wishes it could be so organized.


Hunter and Schmidt did a meta-study of 85 (now 100, with the follow up) years of research on hiring criteria. [1] There are three attributes you need to select for to identify performing employees in intellectual fields.

    - General mental ability (Are they generally smart?)
    Use WAIS, or if there are artifacts of GMA (complex work they've done themselves) available, use them as proxies.
    Using IQ is effectively illegal [2] in the US, so you'll have to find a test that acts as a good proxy.


    - Work sample test. NOT HAZING! As close as possible to the actual work they'd be doing. Try to make it an apples-to-apples comparison across candidates. Also, try to make accommodations for candidates not knowing your company shibboleths/tools/infra.


    - Integrity. The first two won't matter if you hire dishonest people or politicians. There are existing tests available for this, you can purchase for < $50 per use.

This alone will get you a > 65% hit rate [1], and can be done inside of three hours. There's no need for day-long (or multi-day) gladiator-style gauntlets. Apply this process to EVERYONE, including that elite cool kid from the "right" school/company. You don't want to exclude part of your sample population!

[1] http://mavweb.mnsu.edu/howard/Schmidt%20and%20Hunter%201998%....

[2] Technically IQ tests are not "illegal", but the bar courts have decided companies have to climb is so high it effectively means they are. There is existing case law directly covering IQ tests in hiring. You should speak with your lawyer before you decide to try IQ tests. [3] 2016 update of Hunter Schmidt: https://home.ubalt.edu/tmitch/645/articles/2016-100%20Yrs%20...


Link [1] is borked. Presumably, this one works: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.172...

Also, you're missing the structured interview component.


What sort of meaningful work sample test can you do in 3 hours?


A year or so ago, one of our devs replaced some questions on our test with the kinds of things the maintenance team occasionally gets asked, instead of making it solely about development and bugfixing, though twisted a bit to be feasible in an interview test (for example, the source data they'd work on was something like a 3 MB JSON file instead of a database).

Questions involved figuring out how to manipulate that data, and since not a single one could be done solely with a library call it said a lot about their ability to think through a problem, as well as their knowledge of things like data structures and efficiency. The latter questions even built off of the earlier ones, as if the stakeholder came back with more questions.

With a reasonable knowledge of the language you were using, those questions would only take around 15-30 minutes total, and each answer (as long as you did correctly build on the previous ones) was only ~5-10 lines of code.
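To illustrate the flavor of test being described, here is a hypothetical Python sketch; the data shape, field names, and questions are invented for illustration and are not the actual test:

```python
import json

# Hypothetical stand-in for the JSON source data: a list of order records.
orders = json.loads("""[
  {"customer": "a", "item": "widget", "qty": 2, "price": 5.0},
  {"customer": "b", "item": "gizmo",  "qty": 1, "price": 12.0},
  {"customer": "a", "item": "gizmo",  "qty": 3, "price": 12.0}
]""")

def spend_by_customer(records):
    """First 'stakeholder question': total spend per customer."""
    totals = {}
    for r in records:
        totals[r["customer"]] = totals.get(r["customer"], 0.0) + r["qty"] * r["price"]
    return totals

def top_customer(records):
    """Follow-up building on the first: which customer spends the most?"""
    totals = spend_by_customer(records)
    return max(totals, key=totals.get)
```

Each answer is a handful of lines, cannot be done with a single library call, and builds on the previous one, matching the structure the commenter describes.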


"Here is the almost complete skeleton of a project in your preferred language (presumably, the one you're applying to use full time), finish it by implementing this vital piece of business logic."


65% seems quite bad.


The single most effective interview question I ask is "What do you want to do?". I drill down on that to a granular level as I try to make sure the tasks, duties, and overall position match what the candidate really wants for themselves. It's followed up with a "How would you do it?", again persisting in the discussion (not interrogation) until we are both comfortable with the sensibility and symmetry of the answers.


That may be a good conversational start but I'm less certain that anyone really knows at any deep granularity what they want to do (professionally). There are a lot of roads to autonomy, mastery and purpose, and a lot of speed traps/bad exits. My current job, which I've just resigned because it became so awful, was on the surface what I wanted to do, until it changed. It would be difficult to discuss the before and after in this context, for example.


Wow, I'm sorry but I think that's terrible for the candidate. You have a job description and you shared that with them. You're basically hiring the most talented bullshitter.

Very few people will know exactly what they want to do in the next few years. There is only what you as the employer need to get done and what they as an employee are willing and able to do.


It's only bullshitting if people don't want to work there. Presumably they read the description and said "that looks cool, I want to do that".

You should be able to look your employer in the eye and honestly say what you want from the job and discuss it like adults. Not dance around with a corporate-speak ritual telling them what you think they want to hear.


Well, let's be honest. The vast majority of jobs are just that: jobs. They're not something that anyone particularly wants to do; they're just willing to do it because it's better than the alternative.


I'm going to try this. The questions my team comes up with haven't been very satisfying and this seems like a good way to find out what they do know vs what they don't.


I learned this from one of the best recruiters in finance. I can't take credit for it.


If you think anyone 'really' wants to build your shitty CRUD app then you are deluding yourself.

Do you 'really' love your job? Would you do it for free?


I'd build his/her CRUD app, and make sure it wasn't shitty, if I felt it had potential to be a useful product and also felt like the people I met would be good to work with/for.

I've learned the hard way that I much prefer quality of life over "sexy dev work." (...and obviously take both if I can find it.)


Ouch. So, I'm not in the software business, but either way the point of my Q&A is to measure the distance between what an employee wants to do during the day versus what probably needs to be done to accomplish the desired result. When the gap is big, the hire/job never works out.


> what an employee wants

What I 'want' to do during the day is be a sunscreen boy for the Reef bikini team.

What I am willing to tolerate in exchange for money is considerably wider ranging.

No one 'wants' to be trapped in some corporate soul sucking nightmare for five days a week.


I might not want your shitty CRUD app to exist per se, but the building of it might not be half bad, and I might learn something useful on the way and maybe even make some new friends.

It's not unreasonable to look for a job where some of it is what you'd do for free, if you didn't need the money.


There is an enormous gap between being able to tolerate working on something vs 'really' wanting to work on something.


Yes, I would. Probably not for 40 hours a week, but I would.


So, there's a hidden value to these sorts of questions, but it is useful to the interviewee, not the interviewer. If they use lots of these questions, you probably don't want to work there, or at least not for the person who's asking those questions.


I don't think your second sentence is a good conclusion to draw. If nothing else, it's very difficult to find companies which will offer the highest compensation packages (or better) without also asking these questions. You're going to come across these questions no matter which of FAANG you apply to; the same applies to hedge funds.

We're not talking about tens of thousands here, either. The top companies have packages which (including liquid RSUs) involve low hundreds of thousands more in compensation. Not everyone is optimizing their career for income maximization, but if you are it's hard to do so without being at one of these companies.

On another note I've found that, at least in my experience, there is basically no correlation between whether or not people ask brainteasers/trivia in interviews and how competent they are or nice to work with. There probably aren't any interesting conclusions you can draw about personality based on interview formats without being biased in one direction or the other.


But your experience, combined with the article in question, would suggest that "narcissism", "sadism", and "callousness" would be rampant at FAANG and hedge funds; surely that can't be right. Or...?

I would not actually use this one criterion to disqualify a prospective employer, but for the record I would gladly give up a good chunk of my salary to work for one without those traits. It's not like you would have to work for pennies. In Austin, anyway, I have held contracts at a dozen different employers and not one used these kinds of questions in the interview.


I’ve been toying with the idea of stopping these sorts of interviews in the middle, but I’m not sure how to phrase it constructively.

I want to suggest that while they may learn more about me if we continue, if I don’t learn more about them then I don’t want to work here. And they will learn more about me by letting me ask my questions.

Long ago nobody knew how to interview. If you derailed it in the middle to talk about more interesting things they were almost grateful. And I would frequently get an offer.

Now everyone thinks they know how to interview even though we agree as a group that we don’t. So they show up to an hour meeting with an hour agenda and I spend half the time wondering if this question means they’re code cowboys or insane. So then I’m wondering if I’m throwing the interview to get out of laughing at them when they make an offer.

I’d like to go back to grilling them about their development practices and their roadmap and how I might be able to help or fit in. Please.


I sympathize with the temptation, but I try to keep a good attitude towards the interviewer(s) until it's over, and draw conclusions afterwards. My comment above might suggest otherwise, but in reality I mostly analyze later, after I've left.


Most places advertise pretty loudly in the careers section or in the interview prep material they send to you beforehand that you are going to be hazed. Amazon and Google both send out preparation packets after a phone screen that made me laugh out loud.


I disagree, as long as getting the right answer isn't important. I flat out tell people I don't care if they get the right answer, that I just want to hear their thought process.

Such questions used in this way are extremely valuable.


I do the same thing. It's how the thought process works that's interesting, not the result. There is no problem or teaser whose answer is both short enough to be devised during an interview AND representative of how a candidate would perform in real projects. Our field deals with complex systems in many dimensions.

Instead, look at how a candidate approaches a problem: do they just go for the obvious solutions, how creative will they get, what experience can they draw on, how well do they overcome unknowns, etc.


It's how the thought process works that's interesting, not the result.

I wonder how many interviewers have the necessary background in clinical psychiatry to analyze a candidate's thought process, or if that's just more handwaving away hazing.


They're valuable for what? Showing off a thought process in a high pressure one hour meeting with a judge watching you?


If the candidate is applying for a job which will require the ability to come up with solutions to new and/or difficult problems while under pressure, sure. Why wouldn't the prospective employer want to explore that at interview?


I think a good compromise is to modify the questions so it's actually applicable to the context of the job.


I had one of these brainteaser questions in an interview once, and I liked my mathematician friend’s answer to it the best.

Q: Given infinite setup time and infinite money, how would you fill the inside of a 747 with ping pong balls as quickly as possible?

A: Define the inside of the plane as the exterior. Now, by definition, all ping pong balls are inside the plane. QED.


My answer:

I use my infinite resources to become benevolent dictator of Earth.

With my endless wealth I prevent climate change, provide the basic needs of every person on the planet, educate the world, and set up a global system of sustainable reproduction. I have infinite resources so I assume it's not eugenics or something dystopian, but more a collective human understanding of responsibility. Finally, with our future ensured, I dedicate as many of the world's top minds as possible to preserving my own life at any cost.

Once I have achieved immortality, likely by transplanting my mind into a robot body that can be updated over time so I remain the most advanced being on the planet, I steer the course of human history far into the future. All human suffering is ended. We all work as one to better ourselves. We spend decades, centuries, millennia improving not only our society, but our technology for inserting ping pong balls into 747s.

Under my kind leadership as an immortal god-king of increasingly sophisticated artificial intelligence, humanity becomes the dominant species in a surprisingly crowded universe. With each new species we meet, we find new allies in our advancement. Entire galaxies unite under my rule and never-ending resources, each furthering my transcendence of life and the speed at which we can put small plastic balls into extremely archaic public transportation.

Finally, at the far end of time, when there is nothing left to learn, no secret of the universe left unknown to me—I am now a single bio-mechanical entity composed of all intelligent life in existence—my setup is complete. Using technology so advanced that it cannot be described by words we have so far invented, technology so unimaginable it exceeds the power we ascribe to magic, I fill the plane. It is instantaneous. There is no time during which the plane is being filled. There is only before and after it was filled.

The creator of the universe descends on its final moments. The being sees the plane. It weeps for the beauty its children have created.


> when there is nothing left to learn,

I only logged in to comment on this :-)

This won't happen, ever. That's because new things to learn are being created all the time!

For example, when there was no horse there was no horse riding. When there was no computer there was no programming. Just two infinitesimal examples to demonstrate what I mean.

Knowledge of "basics of the universe" such as quantum theory are of no use to a "higher order" being such as ourselves when almost all of us apart from a few physicists work on much higher planes. The matter and energy of the universe is constantly being rearranged to create more and more and higher higher order problems, you will _never_ run out of new ones. First we create the problem, than we solve it, creating new ones (and the whole universe does it too, and biological stuff especially) :-)


Who am I going to believe, a human in 2018, or a transcendent being composed of the unity of life in all its forms across the entire universe who has dedicated an unfathomable amount of time to uncovering every mystery related to ping pong balls and 747s?


> how would you fill the inside of a 747 with ping pong balls

Funny you should ask, because that was actually one of my primary responsibilities at my last job!


Infinite setup time: In the vacuum of space, as t -> infinity, eventually random quantum fluctuations will spontaneously create a 747 full of ping pong balls.


If there's a specific number of balls, that's a great answer; if you need to fill up the interior, you're gonna need a lot more balls!!


All ping pong balls may now be "inside" the plane, given this approach, but he's done a horrendously poor job of filling the plane.


Not really. It is as full as possible. :)


So, FAANG is full of carefully selected sadistic narcissists thinking they can "save the world"? Sounds like fun :-D

Seriously, I thought brain teasers were a funny game and an intellectual version of an "xyz" measuring contest, showcasing the equivalent of "street smarts" in higher-level brain functionality, plus a signal of how boring one would be for an interviewer to work with. This secret dark-triad dimension completely bypassed my attention, as I actually enjoyed them a lot.


The "brainteaser" question type in the survey is just another type of question. If you've seen/practiced Fermi estimates, it's pretty rote. If you've never seen/practiced Fermi estimates, it will sound like a horrible trick question. I wonder how well it would work to begin with "are you comfortable with Fermi estimates?". If so, go ahead and ask. If not, _teach_, do a couple together, then have the candidate do one on their own. The technique isn't hard, it's just surprising that it can work and work pretty well if you haven't already been exposed.


"Fermi estimate" questions are stupid most of the time. Why? Because they leave out the educated part of educated guess.

The idea is that inaccuracies in a series of educated guesses would tend to even out (though they're just as likely to compound). There is no such tendency in blind guesses, which is what most of these questions require.


This might be a bit less sadistic, but I still think it's pointless.

You have 30 minutes or an hour with a candidate. Why is this question the one that will tell you the most about whether they'd be a good hire?


I realize that the study was specifically talking about brainteasers, but I wonder if irrelevant algorithm questions have a similar motivation? I was once asked at an interview at a FAANG company to come up with an algo for finding the longest palindromic subsequence (!). At another I was asked to get all partitions of an integer (https://www.whitman.edu/mathematics/cgt_online/book/section0...).

I'm pretty certain I'd never be working on problems of that nature as part of the role.


> an algo for finding the longest palindromic subsequence (!)

Not irrelevant to my professional career, to be honest. There have been a few situations where being able to see a tricky, unfamiliar problem and after some thought say, "oh, we can use a DP algo for that" has saved a large amount of future headache.

Only a small fraction of people on the team need to have that skill (more in some teams than others, too) so I don't think it's a great thing to test for (much less require), but it's actually a pretty valuable skill in my experience.
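For readers who haven't seen it, the DP mentioned above is the textbook O(n²) table for longest palindromic subsequence. A minimal Python sketch (not the code from any actual interview):

```python
def longest_palindromic_subsequence(s: str) -> int:
    # dp[i][j] = length of the longest palindromic subsequence of s[i..j].
    n = len(s)
    if n == 0:
        return 0
    dp = [[0] * n for _ in range(n)]
    for i in range(n - 1, -1, -1):
        dp[i][i] = 1                      # a single character is a palindrome
        for j in range(i + 1, n):
            if s[i] == s[j]:
                # Matching ends extend the best inner subsequence by 2.
                dp[i][j] = dp[i + 1][j - 1] + 2
            else:
                # Otherwise drop one end or the other.
                dp[i][j] = max(dp[i + 1][j], dp[i][j - 1])
    return dp[0][n - 1]

print(longest_palindromic_subsequence("bbbab"))  # 4 ("bbbb")
```

The table is filled bottom-up so each cell only depends on already-computed subranges.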


A Fermi problem is not a brain teaser in the colloquial sense, it's more of an approximation game. If you looked up an answer a week ago and memorized it, then regurgitated it, you'd still get the question "wrong". No one expects you to come up with the right number of windows, they expect you to get somewhere plausible from scratch.

Asking someone how many windows are in NYC is not fundamentally different from asking someone how many blades of grass are on a soccer field. You can know almost nothing and still arrive at the correct order of magnitude -- all it tests is someone's ability to do napkin math and break open-ended problems into manageable parts.

Is this the right thing to test? That depends on what you're hiring someone for.
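To make the decomposition concrete, here is a sketch of the NYC windows question in a few lines of Python. Every input below is a made-up round number; only the order of magnitude of the result is meant to be taken seriously:

```python
import math

# Hypothetical Fermi estimate of the number of windows in NYC.
population = 8_000_000               # ~8M residents
people_per_household = 2             # rough average
windows_per_household = 8            # apartments skew small
residential = population / people_per_household * windows_per_household
commercial = residential * 0.5       # guess: offices and shops add about half again
estimate = residential + commercial  # 48,000,000
print(f"~10^{round(math.log10(estimate))} windows")  # ~10^8 windows
```

The point of the exercise is the breakdown into factors you can actually guess, not the final number.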


When interviewing I would often ask one (and only one) question of this type. The rest of the interview was your standard set of questions to see what type of person you were sitting across from.

The one "brainteaser" question was "Estimate how fast human hair grows in miles per hour. Order of magnitude is fine."

I just wanted to see that they not only knew how to estimate something but that they understood this particular estimation wasn't important given the job they were interviewing for and could thus be answered quickly rather than accurately.

The one person who answered immediately was off by 3 orders of magnitude but he was the best hire we had. He could spot BS a mile away.
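The hair question decomposes the same way. A quick back-of-the-envelope version, assuming growth of roughly 1 cm per month (an assumed figure, not one from the original comment):

```python
# Rough estimate of hair growth speed in miles per hour.
cm_per_month = 1.0             # assumed growth rate
hours_per_month = 30 * 24      # ~720 hours
cm_per_mile = 160_934          # 1 mile = 1.609 km
mph = cm_per_month / hours_per_month / cm_per_mile
print(f"{mph:.0e} mph")        # on the order of 10^-8 mph
```

So an answer anywhere near 10^-8 mph would have been on target; 10^-5 or 10^-11 would be the "off by 3 orders of magnitude" case.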


We ask people to pair-program with an interviewer on a realistic task. No tricks, no brainteasers. We just get to see how people do the job we want them to do.


Is pair-programming actually used in your company?

Just curious, as I do something similar but solely to see how the person writes code. Folks are free to pair-program or not once hired.

I've found almost everyone needs help on some simple part of the task, and how they respond to the help is pretty useful information.


I've interviewed at places with a pair programming part of the interview, that didn't use pairing during actual development.

It is an enormous help toward glimpsing the mind of the programmer as they program.


Yep, we pair at least 80% of the time.


Interviewing others can certainly expose your own faults. In our last round of interviews, I certainly could have done better by our candidates. I learned, and will do better next time.

But judging a company based on their questions may be expecting too much. Nobody is perfect, and if you expect perfection from someone who will end up being a member of your team, you might be setting everyone up for disappointment. Or you might be walking away from a good team or product just because they are not good at interviewing.

Instead, if they ask questions that imply cultural problems or personality conflicts, ask about it. Find out if they are just bad interviewers, or if they do have deeper problems. Because bad interviewers are not always a bad sign -- it could mean they don't have to do it often. It could also mean they are trying to expand the skills of people new to interviewing by giving them opportunities to do something new. Then again, a company that has mastered the interview process does it often, which could be a sign of high turnover.

In short, it is good to notice problems, ask about them, then think through all the implications of the answers before making decisions.


I was asked a series of brainteaser questions at a university interview.

It was immediately clear that the purpose of the questions wasn't really to see if I knew the answers, but to see how I responded when asked about something I didn't (immediately) know the answer to.

Isn't this a perfectly legitimate way to try to see how a candidate might deal with solving problems they don't know the answers to?

Ironically one of the interviewers (a somewhat crusty old astrophysics professor) later became my physics tutor, and continued to ask us brainteaser questions during tutorial sessions.

For instance, I dimly recall us being asked to "estimate the change in the length of the Earth's day if all the cars in the UK were to drive on the right-hand side of the road instead of on the left"...


Being able to approximately estimate numbers is an excellent skill for software developers to have. Imagine you're running some operation that affects N things or has N results. After seeing the results, it's very helpful to ask yourself "Does N roughly match what my intuition/estimate told me it would be?" If your estimate is off by at least an order of magnitude, it's a hint there could be a bug in your solution and that it warrants a closer look.

I don't ask brainteaser type questions, but I can see their appeal given the value of estimates. I also don't see why they're so universally hated. Is doing back-of-the-napkin math something a lot of people struggle with?
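That kind of order-of-magnitude sanity check is simple enough to codify. A minimal sketch, where `roughly_matches` is a hypothetical helper name, not an API from any library:

```python
import math

def roughly_matches(actual: float, expected: float, max_magnitudes_off: int = 1) -> bool:
    """True if `actual` is within `max_magnitudes_off` orders of magnitude
    of the napkin estimate `expected` made before running the job."""
    if actual <= 0 or expected <= 0:
        return actual == expected
    return abs(math.log10(actual) - math.log10(expected)) <= max_magnitudes_off

print(roughly_matches(48_000, 100_000))      # True: within one order of magnitude
print(roughly_matches(48_000, 10_000_000))   # False: worth a closer look
```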

