How to talk about yourself in a developer interview (stackoverflow.blog)
441 points by NickLarsen on Apr 28, 2017 | 191 comments



> What is the hardest technical problem you have run into?

I never seem to find a quick good answer for this.

Maybe I just almost never work on REAL hard things.

So my question to you, HNers, is:

What is the hardest technical problem YOU have run into?

I am really interested to know what you would consider 'hardest'. It's probably not going to be something like 'I changed the CSS property value from "display: block" to "display: inline-block".'


I have no idea how to answer this question. There have been some problems where someone was stuck for weeks, and I came in and coded up a solution in a day. That seems, in some sense, good evidence of being hard, but those problems never seem hard to me. The reverse happens to me too, where someone else solves a problem that was hard for me, and it's easy to them. Is that a hard problem? It was hard to me, but the problem was easy if I saw it the right way, which I didn't.

Last year, I came up with a solution to a problem that we'd been solving sub-optimally for years. My solution is arguably optimal (given a certain set of assumptions) and requires multiple orders of magnitude less code than the previous solution. The solution is the core part of a paper that was recently accepted to a top conference in its field. That sounds like it might be good evidence the problem was a hard problem, but in fact the solution just involved writing down a formula that anyone who was exposed to probability in high school could have written down, if it had occurred to them that the problem could be phrased as a probability problem (that is, the solution involved multiplying a few probabilities and then putting that in a loop). When I described the idea to a co-worker (that you could calculate the exact probability of something, given some mildly unrealistic but not completely bogus assumptions), he immediately worked out the exact same solution. It wasn't an objectively hard problem because it's a problem many people could solve if you posed the problem to them. The hardest part was probably having the time to look at it instead of looking at some other part of the system, which isn't a hard "technical problem".

Another kind of problem I've solved is doing months of boring work in order to produce something that's (IMO) pretty useful. No individual problem is hard. It's arguably hard to do tedious work day in and day out for months at a time, but I don't think people would call that "technically hard".

I'm with you. Maybe I never work on "REAL hard things"? How would I tell?


It's all relative. Sometimes solutions require insights. Other times solutions involve a ton of grinding. Many people have a tendency to overemphasize the first and greatly underemphasize the second, even though the grinding may actually be harder than devising clever solutions.

We build tools that read and write Excel files (open source library: https://github.com/sheetjs/js-xlsx). There are plenty of very difficult problems involving ill-specified aspects of the various file formats and errors in the specifications, but it is largely a matter of grinding and finding files in the wild that capture the behavior you want to understand. Those are "difficult" in the sense that people still get these things wrong (related: recently a bug in Oracle SmartView corrupted US Census XLS exports, which boiled down to an issue in calculating string lengths with special characters), but they don't feel difficult since most of the work didn't involve any really clever insights.
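The SmartView details aren't given here, but the general class of bug -- counting UTF-16 code units when you meant characters, or vice versa -- is easy to demonstrate. A minimal sketch (the example string is mine, not from the actual bug):

```javascript
// In JavaScript, String.prototype.length counts UTF-16 code units.
// Characters outside the Basic Multilingual Plane (emoji, some math
// symbols) occupy two code units, so "length" silently diverges from
// the number of characters a user sees.
const label = "length \u{1F600}"; // "length " plus one emoji

const codeUnits = label.length;       // surrogate pair counted twice
const codePoints = [...label].length; // spread iterates by code point

console.log(codeUnits);  // 9
console.log(codePoints); // 8
```

Any file format that stores a length-prefixed string has to pick one of these conventions, and mixing them up is exactly the kind of error that only surfaces when a file in the wild contains the right characters.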

IMHO the hardest problem is now fairly straightforward: how do you enable people to test against confidential files? The solution involves running the entire process in the web browser using the FileReader API (https://developer.mozilla.org/en-US/docs/Web/API/FileReader). That is an obvious technical solution in 2017, but few thought it was even possible when we started.


That's actually my default response when people intimate the work I do must be complex / I must be clever - it's really just hard graft. Sometimes you just have to be willing / stubborn enough to chip away at a problem that initially seems insurmountable. Sure, the more knowledge you accumulate, the faster you can figure out where to look, but often enough you just need to roll up your sleeves and bisect your search space.

I can imagine you get some pretty warty excel files. I mostly get PDF for my sins. I'm sure, like me, you've spent hours taking bits out of files until they work as expected and then figuring out what the difference is :-)


I totally agree with these gems ...

> It's all relative. Sometimes solutions require insights. Other times solutions involve a ton of grinding.

> Many people have a tendency to overemphasize the first and greatly underemphasize the second, even though the grinding may actually be harder than devising clever solutions.

> No individual problem is hard. It's arguably hard to do tedious work day in and day out for months at a time, but I don't think people would call that "technically hard".


The incidents you describe are examples of what Alan Kay means when he says that 'outlook' is more important than 'IQ'.


You left us hanging without linking to your paper. I would be really interested in it.


You are overanalyzing it. The question isn't about defining what a hard problem is; it's about sharing how you overcame problems. The interviewer is looking for stories that demonstrate how you overcame them -- times when you lacked the necessary skill to achieve something.

Not being able to answer this question is just as telling as an answer itself.


> I never seem to find a quick good answer for this.

So do what politicians do -- answer the question you wish they had asked instead of what they actually asked.

In my case, at most places I've worked I end up being one of the go-to people for gnarly bugs that have stumped the regular crew. So part of my interview prep is to condense a war story into something short and coherent that illustrates why people should have faith in my intuition, a bit of a tough sell. Then during the interview I latch onto any semi-related question and tell my rehearsed story.


Also, politicians recycle answers from other people.

-- like the story of when you saved 187 million dollars by fixing a totally trivial bug https://thehftguy.com/2017/04/04/the-187-million-dollars-gma...


Bingo.

"Well I'm not sure I can pick just one as the "hardest", but one very interesting problem that ended with an elegant solution was ..."

And you fill in the ... with a tale of you slaying a dragon^W prod issue with just your wits and a default .vimrc.


I don't know why this has to be framed as a manipulative 'politicians' move, this is just being honest and helpful like you ought to be.


In >15 years of professional development, I've probably worked on only one project in which there were any significant technical challenges. I should probably consider myself fortunate that I've had even that one technically challenging project. It was a lot of fun, but it has been rather demoralizing looking for equally challenging work since then and largely failing to find it.

I think much of the reason for that is that most software projects that deliver business value involve plugging together a bunch of components to deliver functionality that is not particularly complex. It doesn't involve pushing the limits of your datasources or inventing new algorithms. If performance problems come up, it's almost always cheaper to throw money at AWS or more hardware than to spend a couple developer-months addressing the bottleneck in the application. In some ways, I guess that's efficient from the perspective of the market, but it's disappointing for engineers who like to build applications that require solving hard problems.


"Complexity is like a bug lamp for smart people. We're just attracted to it."


The way I think about that question -- and in fact the version I have gotten many times -- is "what's a technical problem that you solved and are the most proud of?". In this question, 100% of developers have an answer. Everyone has done something that needed a bit of thinking or planning, and was then proud of the execution. It can be a hello world in a new language or it could be building Facebook, or anything in between and beyond. If you think you don't have any of those, think again. The answer may change many times during your time as a developer, but there is always at least one, no matter your level, experience, etc.

Personally, I often answer with the time I decided to re-engineer and rewrite a "snapping engine", which snapped boxes together when they were close to each other in a 2D design application. It was unexpectedly difficult to write with some of the features we wanted, but after a couple of iterations I finished, and since then new features and plugins have worked perfectly together and been easy to add.


> In this question, 100% of developers have an answer.

Sorry, I don't.

The story I would tell if asked this was solved by the guy I was pairing with. He knew about URL encoding images, which immediately solved something we could have worked for weeks on. I was very surprised and impressed. Of course, now that is part of my toolbox, and I wouldn't think much of solving something else this way.
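"URL encoding images" here presumably means data: URIs -- embedding the image bytes directly in the URL instead of uploading and hosting a file, which is plausibly what saved those weeks. A minimal Node sketch (the four bytes are just the PNG magic number, standing in for a real image):

```javascript
// Build a data: URI from raw image bytes. A browser can use the result
// directly as an <img> src, with no separate upload or hosting step.
const pngBytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG magic only

const dataUri = "data:image/png;base64," + pngBytes.toString("base64");

console.log(dataUri); // data:image/png;base64,iVBORw==
```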

Sometimes I solve problems easily that others find very hard. I'm glad I could help, but I don't go around feeling proud of how awesome I did that day. I just happened to know something others didn't yet.

This might sound like humblebragging, and perhaps it is. Just trying to explain why I have a hard time with this question.


How long have you been working for?

It just sounds kinda sad to say that you've never been proud of work you've done. I think if I was never proud of the work I'd done I wouldn't still be doing this job: a feeling of reward is important!


Think of it this way: the question isn't for you, it's for the interviewer. They want to know if you're smart enough to work for them. So answering that your co-worker Paul had a problem he'd been battling for days before he finally came to you for help, and explaining how you helped him, is a valid answer. It gives the interviewer the insight they're looking for and makes you look good.


I actually don't like this question. It's hard to decide.

If it's something too simple, you're going to be looked down on. If it's a clever hack around someone's bug, it's hard to really be proud of something that shouldn't have had to exist in the first place. If I say something from a long time ago, I may not remember enough details to answer follow-up questions. If my job is boring (hence interviewing for a new one), I may not have had any good "shining moments" recently.

As time goes on, stuff that used to seem or look cool can become embarrassing. I've seriously considered deleting some of the early stuff I have on Github even though it has relatively-a-lot-of-stars for something small and stupid.

Asking to be regaled by stories of tech heroism is also prone to sabotage, because it's easy to rehearse an impressive story. It doesn't necessarily indicate their ability to do things that are useful for the job; it just means they rehearsed a good story and prepared for some follow-ups specifically related to that.

In an interview, you're a lot better off asking questions that require the respondent to formulate an answer right then and there vs. something they've rehearsed. You're also better off leaving the expectations of tech heroism behind.

"Rock star" job listings have more or less died out, but this is really just a lesser form of it. Typically you don't need or want a rock star. You want someone whose output is professional and consistent.


I think you may be overthinking how seriously your interlocutors take the behavioral questions.

The vast majority of CS interview questions really fall into one or both of two categories:

1. Say something entertaining or that makes me like you.

2. Say something that proves you're competent so if I like you it's not a hard sell to hire you.

When you read this hard into a question that can, in this framework, be reworded as "talk about stuff you programmed that you thought was mentally interesting when you made it", remember they are only thinking about your skills at the most basic surface level; they really just want to let you gush for a minute.

Even in the most embarrassing code you've written there are dumb bugs and little moments of triumph, and they're begging you to share some of the juicy details, of which I'm fairly sure every programmer has a few they can recall.

If you have no example of work you can gush over, then yeah, it's a problem. But to me the only truly wrong answers are NO answer, or faking a modicum of passion by gushing about something you don't actually care about and THEN sounding wooden while doing so -- because if you didn't come off as wooden, even that would be sufficient.


Right, I actually agree. I said in another comment that these questions are lazy interviewing, because it's just the interviewer saying "Well? Amuse me."

But it's still difficult to say I'm "proud" of something that I don't really think warrants pride. Just have to steel up and go in there ready to talk about something dumb I guess.

The things I'm actually proud of are things that don't look impressive to the outside. To me, the gold standard contribution is surgical, precise, and simple. It may only be 20 lines of code but it operates within the framework of the existing stuff, doesn't break the tests, etc. That sounds like "routine work" to me.

These people want to hear about atom bombs because they leave a cool looking mushroom cloud, but the professional shouldn't have to go nuclear -- and they shouldn't be proud of it when they do.

I guess the core issue is that if someone is asking this question, it signals that we're not really on the same wavelength. At least, it seems to signal that, because I assume they're saying "OK, please wow me now." Good work is quiet and consistent, usually not astonishing.

If the candidate is the author of some bona fide, actually-used open-source software (not GitHub vanity projects), that could qualify as something that looks impressive and is also probably objectively worth being proud of, but few people would meet this description.

Of course, in reality, the signal is really "I have no idea how to interview someone, please make this easy for me." If you interpret it that way and ignore the actual question posed, I guess it becomes easy; just say something that sounds like a vague answer, and then speak for 2 minutes+ about why you're probably the best choice.


This feels pretty pessimistic to me. A "professional" (and I only half-mean the scare quotes, I think I am one) is able to often do some pretty transformative stuff that's worth being proud of while not setting off The Bomb. I can truthfully and accurately say that I reduced one employer's deployment time of their services from six hours to six minutes with no loss of safety or increased risk--I hacked through an accretion of technical debt (after building out testing to ensure that I didn't change any functionality) that had just plain grown because nobody else had had time to pull it out and replace it with a more scalable, long-term solution! I'm pretty proud of that. And two orders of magnitude on a deploy will wow folks who've ever been personally faced with the bigger one.

(One of the other ones I'm more amused than proud of, though, is saving a client ten times the money they paid me because I happened to know about the existence of AWS D2 instances...)

Your concluding point is well-taken, though, because most people don't know how to interview and they're basically asking you to sell yourself for them. But I don't think the question is as problematic under the hood as you're framing it.


Yeah, those wins are great. It's really about the level of detail that you assume they want. "I improved the deployment time" is an effect of a technical change, not a technical change in itself. There are a lot of people who could improve the deployment time just by switching to a faster build backend or doing some other small change that has big dividends. Is that a "technical accomplishment"? Sure, and it has big wins, but if the answer is just "I installed Jenkins" then it kind of takes the oomph out.

And big wins like that are usually captured pretty early on. If you can get order-of-magnitude improvements left and right, it means that something about the company's management is off.

It's also not a good question for an interview because it's a hard basis for comparisons. Perhaps another candidate knew how to improve the build/deployment pipeline, but he was blocked by political interference. He wouldn't be able to say "I sped up the pipeline 6x.", but he could talk about his plans to do so with a question more oriented to the task, e.g., "How would you build your dream deployment pipeline?"

That's what I mean when I say they're looking for something spectacular. Some people can say they saved their company or made a change with massive ripple effects -- which is not necessarily aligned with the technical difficulty of that change, and may cause some candidates to elide mention of it entirely -- while others can't make such big assertions, not because they're not skilled enough, but because the opportunity and/or priority wasn't there.

If you want to know about business gains and side effects of technical work, ask "How did your work help your employer?" If you want to know about technical work itself, ask relevant lines of questioning.

Anyway, I think we're basically splitting hairs here. It's just about what level you're choosing to process the question on, and it seems everyone agrees that it's best to take a very superficial interpretation and allow them to inquire further as necessary. I just don't think it's a good interview question.


Some interviewers think like that, with this "wow me now" attitude. These are the bad interviewers, and honestly I don't much care about the answer given to them. The one losing is the company, by not hiring me because I didn't "wow them" by their standards -- and in the end, would I even like to work there?

But there are many interviewers that don't expect to hear the atom bombs, or the circus acts, or whatever else. They actually want to hear about something you did, finished, and then a feeling of accomplishment came over you. They are also developers. They want to know a bit more about you. And these are the people I would like to work with.

Bottom line, there are worse questions asked during interviews :P.


But that reveals why it's such a bad question: candidates just have to prepare in advance, think of a problem scenario that sounds interesting and flatters the interviewer, and recite it when the question comes up. Even a faker can do that, while a legit programmer who needs a few minutes to get thinking (and hasn't come up with a cookie-cutter answer in advance) will stumble.


But who cares that it's a bad question. The point is that it was asked and you should either answer it, evade it skilfully, or find a tactful way to decline to answer.


The poster asserted that "100% of developers have an answer" to this question. We're discussing why some developers may not have a good or immediate answer, and why the question is not as good as he asserts.


Well, I don't know about how good this question is, but it is one that gets asked a lot. And I do still think that 100% of developers have an answer, because I am sure every developer has felt proud about something he did, no matter the size, otherwise why would they still be a developer?


Sure, I wasn't speaking to that issue, only to the parent's attempt to justify it as a good question.


The localization project I use as an example here was definitely one of my top 3 hardest projects of all time (and that was years ago). It was not a particularly difficult technical challenge, it was difficult because it touched every single aspect of the codebase. The project took me and a coworker 3 months to build out the infrastructure for, then another 3 months of actually rewriting everything to use it, and explaining to every single other developer about why we made the decisions we made in order to teach them the different ways they were going to have to write code from now on. Social challenges of the workplace are hard; we're not always looking for technical difficulty.


Off topic but: this is some good nostalgia here. I was that other coworker. Some of the best pair programming I've ever done. It was fun working closely with Nick.


Was it hard or was it just matter of putting time and focus?


It's definitely hard. There was certainly no clear cut solution to any of the problems I included on the card in the picture. We evaluated 8 or 10 different solutions for out of the box stuff, found things we liked and didn't like about all of them and eventually decided it was best to build our own. At each step there was a lot of debate because we knew this would probably be used beyond just the Careers project and be used in the Q&A project as well (separate at the time), so it wasn't just worrying about my team's concerns, but everyone's. We won some debates, we lost some debates, it was very hard.


I've done localization conversion projects a few times, so I can relate. There's never one way to translate everything (e.g. page content, URLs, database content, images, forms), and there are usually several translation methods to evaluate. You have to trawl your whole codebase to tag text for translation. Translating routes/URLs tends to break all code that doesn't expect those names to change. New developers have to be taught to develop new content with translations in mind. You have to schedule time for content to be translated so everything gets done on time, and you need a new workflow for how translatable text is delivered, translated, reviewed and deployed.
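The "tag text for translation" step boils down to routing every user-visible string through a lookup. A toy sketch (the names t and catalogs are mine; real projects use gettext-, ICU-, or framework-specific i18n libraries):

```javascript
// Per-locale message catalogs; in practice these are extracted to
// files that translators work on in a separate review workflow.
const catalogs = {
  en: { greeting: "Hello", farewell: "Goodbye" },
  fr: { greeting: "Bonjour" }, // farewell not yet translated
};

// Every piece of user-visible text gets wrapped in a call like this --
// the tedious "trawl the whole codebase" part of the project.
function t(locale, key) {
  const catalog = catalogs[locale] || {};
  // Fall back to English so untranslated strings still render.
  return catalog[key] ?? catalogs.en[key] ?? key;
}

console.log(t("fr", "greeting")); // Bonjour
console.log(t("fr", "farewell")); // Goodbye (English fallback)
```

The fallback chain is the design choice that keeps a half-translated site shippable while the translation backlog is worked through.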


Hard technical problems are pretty much exclusive to academia, R&D departments, and open-source projects. The rest of us are the 21st century's plumbers and electricians -- if we run into a hard technical problem, it means something definitely went wrong in the planning process beforehand.


> > What is the hardest technical problem you have run into?

>I never seem to find a quick good answer for this.

Real easy: Overcoming technical debt/bad decisions of the previous group of programmers.

At my current company/position, our group basically replaced an outside company -- two programmers. You name something you shouldn't do, and they did it: logic in the code-behind, logic in triggers, plain-text passwords, direct database access -- Bobby Tables all the way down, etc.

When they were in charge, the company had ~4 customers; we are now rocking ~30 unique customers. Their fragmented codebase is unmanageable.

Keeping a train moving while replacing the engine and changing the wheels would be easier.

This doesn't include company culture, inter-company politics, other decisions, etc.


This is not impressive. This is normal. When they say "What's the hardest thing you've done?" I would hope that if you are going to run with this, you explain why conventional maintenance/upgrades were so extraordinarily difficult in this case.

Every developer dreams of going greenfield. Ultimately, that's because it's harder and much more tedious to read code than to write it. If you start from scratch, you understand the whole stack/platform, everything is customized to your liking, and so on. That's great for you, but the company is usually stuck spinning its wheels for months while you push this rewrite down their throats.

It's also very easy to underestimate the depth of domain knowledge and accounted-for corner cases encoded in an old codebase. It looks easy at first, but it usually ends up taking at least months to reach feature parity with the old software, which usually also means that people will use both systems simultaneously, requiring data synchronization, etc.

The whole thing becomes messy, and by the time you're done, the "new system" usually isn't really all that improved over the old system. Systems get convoluted in the process of development, business needs demand quick shoehorning of something instead of thorough refactoring, etc.

Once in a while, a full rewrite is indeed justified, but it's much rarer than most people think.

Going in saying "Yes, my company needed a full rewrite" is an instant orange flag in my book, and thorough questioning would be needed to determine if this is an ongoing attitude problem where there's a reluctance to read other people's code. That portends laziness, a disrespect for colleagues, and a disrespect for the business's needs, which are rarely aligned with tying its developer labor up in a greenfield reimplementation.


"This is not impressive." That's because I've given you a mile-high description.

We have an outside consultant who does one thing: Fix businesses.

When he says this is the worst situation he's ever seen? I take it with a little more weight than I'd take someone else saying it.

While I understand and don't disagree with what you say - a full rewrite is normally not the answer - you haven't seen this codebase. Or the company structure.

We aren't exactly doing a "full rewrite"... it would honestly be easier in many respects - we are keeping the company functional while replacing large chunks.

Aka Keeping the Train Moving while changing the Engine and the Wheels.

This isn't JUST a code base issue. Or JUST a culture. Or JUST management. It's a combination of all those - and many more that can't be covered in 3 paragraphs.

I could talk for 8 hours - and scratch the surface - of where we are and where we need to be.


>When he says this is the worst situation he's ever seen? I take it with a little more weight than I'd take someone else saying it.

I take it as a consultant emphasizing biases that favor his presence.

Anyway, the point of my comment was not to nitpick your specific situation, which I have no information about and obviously cannot speak about intelligently. Perhaps it is as extreme as you indicate. If so, my only suggestion would be to focus on the difficult problems rather than colorful characterizations of them. In an interview, the employer will know about its own problems, and may imagine your running the interview circuit and saying all the same kinds of things about them.

The point is that, as general advice for what to say when someone asks about technical accomplishments or pride, talking about the nightmarish situation you're coming from is, first, trite, and second, a signal that you may not possess the cooperative qualities or the perspective to properly evaluate situations as they arise.


I agree on all points - the consultant has his own objectives just like the owners and my manager.

Is it the worst situation EVER!...? Undoubtedly not... but it's definitely twisted like a pretzel, with problems layered on top of problems. But we also aren't "rewriting from scratch" -- that would be too difficult. We are replacing pieces one at a time and breaking/fixing as we go.

I'm compensated enough for the stress and like the people and environment enough to offset the "overall situation".

But yeah... my main point was that moving a company from "old broken" to "new shiny fixed" while keeping everything working, adding new features, etc. is, at heart, the largest technical challenge I've faced.

Devil is in the details - and "spinning" it correctly without bad mouthing the company (Which I do like, otherwise I'd not still be there) and while keeping to that main point (upgrading a company) is... interesting.

Finagling the finer words isn't my top skill :)


> Overcoming technical debt/bad decisions of the previous group of programmers.

This is often a gold mine, just make sure your interview doesn't become a discussion about how bad other programmers are.


Absolutely. I try not to bad mouth the individual - because the two guys seemed like good people.

Secondly, programming - to a degree - is "art". My version of a masterwork is different than yours.

But the framework decisions? Lack of documentation? Lack of source control? No dev environment? etc. Decisions and foundational information that is demonstrably wrong and needs fixing? And what we have done/are doing/will do? THOSE are the things to focus on.


how about overcoming technical debt you created yourself in the past? :)


I like to think about this question in three parts: scope, depth, and originality.

The scope of the project is its size and ambiguity level. Ideally, as you get more experience, your scope grows. When you're coming out of school, your answer to this question might be a tricky bug fix; after a few years it might be something like "we needed to build a system to flag and filter fraudulent users based on their site activity."

Depth is about how much detail you can talk about the project in. If you choose a project with a big scope, can you drill down and talk about the implementation details of each component? If you chose a bug fix, can you describe exactly what triggered the bug as opposed to just knowing what fixed it?

For originality: what about the problem made it non-trivial to solve with out-of-the-box tools? For the fraud case above, maybe the data was stored in a format that was hard to analyze. Or maybe, for people at bigger companies, there were scaling issues that required unique solutions. For bug fixing, maybe it was a bug that was really hard to reproduce and you had to do a lot of memory dumps and code analysis to pinpoint it.

When I finish something I like to think about it along those three axes for a little bit in case I need to recall details later.


>> What is the hardest technical problem you have run into?

The hardest technical problems I've run into, have been mostly human; i.e. other people.

But, in the purest sense, I have to say that I have observed, on reflection, that the reason I am a technologically competent, adept, person, making a living by way of dark and serious mystery, is that I long ago decided that nothing would be hard. Just .. un-learned.

You see, it is a key factor of success that you, literally and otherwise, embrace the idea that you can't know everything.

So, know what you need. The hard things become easier the moment you do it, even the first time.

I know this sounds like compound nonsense, but I honestly had to give pause on this question. I'm a systems engineer with decades of experience in a multi-variate set of industrial categories, and relatively successful in my lot. This question made me really think - I couldn't think of the hardest things.

The hardest things, I haven't done yet. {But, on another thread, I'm serious about people being the hardest things about technology..}


I lie.

When I was a young, wet-behind-the-ears Java developer, I answered by telling them about making a modification to a Linux kernel driver for hardware support. It was a telephone interview, but the silence was deafening. It is still the only interview I ever had where I wasn't offered the job.

Some things haven't changed in that it is when I step outside my comfort zone I find the technical problems harder. But now I'd just talk about a more comfortable problem that went through multiple rounds of better fit solutions on a system actually in Java so they can relate and see I can actually talk about the target language. Then I'd probably make the point that as a more senior developer it's usually the non-technical problems that require my most focus.

Still makes me cringe thinking about it.


> Still makes me cringe thinking about it.

Probably because you are in a much better place now.

I have found that the propensity to lie is directly proportional to one's [for lack of a better word] desperation. The less desperate I am, the more ideology I tend to exhibit.


I'm currently a wet-behind-the-ears (hopeful) Java developer -- what's cringey about a modification to a kernel driver being hard? That sounds pretty intimidating to me.


As someone on the hiring side of the table, I rarely care about how technically complicated it was and more about WHY it was hard, how you figured it out, and how you avoided it in the future. I've made some really dumb typos in my code that caused a debugging nightmare for me in finding them.

As a recent example, in my game engine I copy/pasted some code for framebuffer and texture creation and missed renaming one variable. A stupid mistake that took me 2 days to find. But to solve it, I needed to look at all of the various textures on-screen. Some of them are non-linear, some are single-component (just red), which doesn't display well, so I ended up writing a method that allowed me to render all of the various stages of my renderer out to the screen (color, shadow, light, depth, normals, etc.) as a debug method. Only then did I realize that the shadow buffer texture was sized to width * width instead of width * height. Again, a stupid mistake, but now we've got something to dig into and talk about, and it's much more about the solution than the problem.
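One cheap guard against this class of copy/paste bug is a central registry of render targets that a debug pass can audit. A minimal sketch in Python (the engine above is presumably C++; all names here are hypothetical):

```python
# Sketch: register every render target centrally so a debug pass can
# dump or sanity-check every buffer's dimensions. Names hypothetical.

class RenderTarget:
    def __init__(self, name, width, height):
        self.name = name
        self.width = width
        self.height = height

REGISTRY = []

def create_target(name, width, height):
    rt = RenderTarget(name, width, height)
    REGISTRY.append(rt)
    return rt

def audit_targets(expected_w, expected_h):
    """Return names of targets whose size differs from the expected
    screen size -- a width*width shadow buffer would show up here."""
    return [rt.name for rt in REGISTRY
            if (rt.width, rt.height) != (expected_w, expected_h)]

create_target("color", 1280, 720)
create_target("shadow", 1280, 1280)   # the copy/paste bug: width * width
print(audit_targets(1280, 720))       # -> ['shadow']
```

The point isn't the registry itself but that the audit runs over every buffer at once, which is exactly what the render-all-stages debug view achieved visually.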


This: https://kopy.io/eI8bT (that's 140 lines; the whole thing was over 1000).

We were going to buy the calculations in as an API because it was an opaque government standard. The API turned out to be incomplete after we bought it; we rang them up to ask why and got "oh, we are getting out of that side of the business".

I had two weeks (over Christmas) to build out an API implementing a government calculation that was specified in one 200-page PDF[1] and then modified by another two. The total calculation had 44 individual steps referring to several dozen data tables, some with hundreds of values.

I did it with a day to spare.

It was probably the single greatest piece of pure technical programming I've done in my career.

[1] https://www.bre.co.uk/filelibrary/SAP/2012/SAP-2012_9-92.pdf
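Not the actual SAP 2012 procedure, but a sketch of the table-driven shape such a spec implementation tends to take: each numbered step becomes a small function over shared state, and the spec's data tables become plain lookup dicts (all names and values below are invented for illustration):

```python
# Sketch: a pipeline of spec-defined calculation steps.
# Each function mirrors one numbered step in the document;
# the data table values here are invented.

TABLE_U = {"brick": 2.0, "timber": 1.5}   # hypothetical U-value table

def step_1(state):
    # Step 1: wall heat loss = U-value x wall area
    state["wall_loss"] = TABLE_U[state["wall_type"]] * state["wall_area"]

def step_2(state):
    # Step 2: total loss = wall loss + ventilation loss (default 0)
    state["total_loss"] = state["wall_loss"] + state.get("vent_loss", 0.0)

PIPELINE = [step_1, step_2]   # the real calculation had 44 of these

def run(inputs):
    state = dict(inputs)
    for step in PIPELINE:
        step(state)
    return state

result = run({"wall_type": "brick", "wall_area": 100.0})
print(result["total_loss"])   # -> 200.0
```

Keeping each step tiny and named after its spec number makes "step N was amended in a later addendum" changes local instead of invasive.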


Strictly technical? Determining the existence of metastable states for the T-cell receptor protein in solution (PhD Dissertation topic). Sort of difficult science, though it seems quite easy in retrospect now; the project wasn't so much hard, as it just cut across a lot of disciplines. Poor developer interview answer though, as it didn't involve a lot of software development (lots of TCL scripting for data extraction and ML with Python instead).

The answer I used to use was a problem I had working as an R&D intern: determine when the speed limits posted on a street have changed from measurements of driver behavior. Interesting and fairly tricky ML problem (weather is a big confounder). Ended up writing a lot of C to get high enough performance to make the solution reasonable which was educational (I didn't know a lot of C at the time), but almost certainly not the right approach to the performance problem. Still more science than development, so it depends on who's asking.

Probably the hardest business-type technical problem I've encountered is database restructuring. We moved (a subset of our data) from a NoSQL database to SQL as part of larger architectural changes, and mapping, migrating, and maintaining compatibility has been non-trivial.

The hardest problem I've encountered has been helping to rescue a project with a severely dysfunctional development history. Much more project management and people than technical (it was just a CRUD app) but I came into a project that had been in development for a year or so and stalled out. The development was outsourced and I fell into a position as a liaison between the internal folks at the university that wanted the product and the dev team that had been hired to build it. Sort of a classic issue where the dev team and the stakeholders would talk right past one another. It drove me crazy at the time, but an excellent experience in retrospect. And it has a happy ending; the project went on to be successful after that, at least when I last heard.


I spent a couple very stressed weeks (nights, weekends) debugging a crash that would only happen every 30 minutes or so. It looked like stack corruption, so I was trying all avenues to debug it. Nothing seemed to make sense. We finally figured out that it was a signal integrity problem on the DDR memory bus. Software was fine.

Did you see that post a few days ago about "Is ECC RAM worth it?"

The answer, after my hellish debugging is an unequivocal YES! My horrible problem would have either manifested itself as a correctable ECC error or I would have gotten an uncorrectable ECC exception. I would have been able to go straight to hardware engineering with that instead of spending many miserable nights debugging an RTOS and ISRs.


In no particular order:

* GPU drivers are a buffet of terrible things. My best moment was either hand-compiling shaders to GPU-specific assembly in order to implement video playback filters, or deducing how the GPU vendor's drivers managed to fake a particular GL extension and implementing that same fake trick in the MesaGL version of the driver.

* Self-applicable partial evaluators are cool. I've tried several times to build one, and each time I fall short.

* I've hand-written parsers for big languages. I've also written parser generators. I'm not sure which is harder.

* Fighting with motherfucking BitBake. You have no fucking idea.


Sounds like you do some embedded graphics work.

On multiple occasions, I've kicked off BitBake to run overnight. I come in to find it failing from running out of disk space. And I'm usually perplexed - does this really need over 200 GB of space!?!


I once slept in a lab in order to monitor a BitBake project for a few weeks. I would wake up, check BB's progress, tweak it and start it going again, walk across the street and get a snack, then come back to the lab and go back to sleep.


Honestly it's a vague question. I don't really know what I would consider "hardest"...but one comes to mind as being really difficult:

Debugging memory leaks in a Python 2.7 asynchronous (gevent) daemon.

Aside from memory leaks supposedly being improbable in Python's reference-counting managed runtime... the GC interface and stdlib tools for such debugging are anemic in Python 2 (improvements have been made in 3, although I can't comment on them since I haven't used them yet). Not to mention that C extensions (gevent is just one) add complexity to debugging.
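For what it's worth, the stdlib `gc` module does get you a fair way even on 2.7-era code: snapshot live-object counts by type, diff them across the suspect operation, then walk referrers on whatever is accumulating. A minimal sketch (the `Session` class is a stand-in for whatever is actually leaking):

```python
# Sketch: finding a leaking type with the stdlib gc module.
import gc
from collections import Counter

def type_census():
    """Count live gc-tracked objects by type name. Diffing two
    snapshots taken around a suspect operation shows what grew."""
    return Counter(type(o).__name__ for o in gc.get_objects())

class Session(object):
    pass

leak = []            # stand-in for an accidental lingering reference
before = type_census()
for _ in range(100):
    leak.append(Session())
after = type_census()

growth = after - before        # Counter subtraction keeps positives
print(growth["Session"])       # -> 100 (the leaked type stands out)

# Once a leaking type is identified, gc.get_referrers() on one
# instance shows what is keeping it alive.
holders = gc.get_referrers(leak[0])
```

This won't see memory held inside C extensions, which is exactly why the gevent case above was so painful.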


Weird that people consider this a hard question. I think it's objectively possible to say whether one task is harder than another:

1) One problem is harder than the other if it requires more knowledge. E.g. to code AI you need to have programming skills, AI related skills, statistics skills and graph theory skills, plus whatever your domain knowledge is (e.g. how to build the code in your company's environment).

2) One problem is harder than the other if it requires more skills.

3) [...] harder if it requires a higher composition level of skills. E.g. configuring a firewall via iptables is harder than configuring a firewall via your router's web gui, since the first requires bash, Linux, tcp/ip related skills as a foundation to even understand what iptables does. The gui may only require a limited set of networking skills and 2 pages of router handbook.

4) [...] harder if it is more complex. Coding your own kernel is harder than coding your own calculator.

5) [...] harder if it requires more departments. "Go to market" for your product is therefore a harder task than "proof of concept".

6) [...] harder if it relies on more legacy code. Legacy code always contains domain knowledge that is unknown to most people, even to the developers. Changing that code or its environment yields a lot of surprises.


My go-to answer is my time when I worked in AIX kernel development at IBM. We'd get bugs for kernel crashes that appeared related to memory corruption. They frequently ended up being caused by stale DMA addresses in device drivers for (mostly) Infiniband adapters writing into memory that now belonged to some userland process or kernel data structure.

How I'd debug these (it took me a while to be effective in this regard):

  - Main tool was the AIX kernel debugger (like cutting bone with a butter knife :)
  - Identify corrupted memory, look for clues like recognizable data structures or pointers in the raw dump that could be cross-checked against symbol maps, etc.
  - Confirm the alignment of the corrupted memory. Page alignment was a tell-tale sign of errant DMA writes in our system... cache alignment is more mysterious and can be related to CPU design bugs (IBM designs their own POWER processors, and we'd test on alpha hardware frequently).
  - Scour the voluminous kernel trace for the physical frame # of the corrupted memory. A typical offending sequence was: 
    1. Frame assigned to adapter for DMA
    2. Physical memory layout change (we supported live hot-swappable memory arbitrated by the POWER hypervisor)
    3. Frame allocated for use by page fault handler
    4. Crash happens
Sometimes the root cause was that the device drivers were not properly serialized with the dynamic memory resource subsystem (the hot-swappable memory) and the sequence above happened very quickly (<1 ms). Sometimes the bug took a while to manifest, and the nice story told above for our page was interspersed with thousands of unrelated activities in the same region of memory.

We had to be like a prosecutor and build a strong case to implicate a bug somewhere else. Until then, our team was always on the hook to figure these out.

This class of problem was hard because the tools we had at our disposal to collect evidence were quite inadequate, and the amount of data to sift through was enormous. Also, any tool we thought might help to sift through all this data needed to already be in the system and in the kernel debugger as a diagnostic command (a crashed system in the debugger cannot be modified in practice). There are hundreds of those debugger commands for all kinds of randomly recurring problems we had trouble figuring out. Over time, you'd build your own for your own set of problems in your kernel specialty :-)


This one took a while to figure out. Especially, since as a tech-support person I did not have actual access to customer system. http://blog.outerthoughts.com/2004/10/perfect-multicast-stor...

This one took many many tries of various incantations and variations to discover (documentation was "less than useful") http://blog.outerthoughts.com/2011/01/bulk-processing-lotus-...

This one makes for a nice story when I talk about computer-specific language issues: http://blog.outerthoughts.com/2010/08/arabic-numerals-non-wy...


There was this variable name I misspelled once ;)


Heh. While still a student, I was working in Fortran (all upper case). I was trying to type COS (the cosine function), and I overshot the Oh character, and typed a zero: C0S. Not very visually different! It took me two days to figure out why Fortran suddenly didn't know what cosine was...


rm -rf $BUILROOT/*


I ask: 'Tell me about a problem that was particularly challenging'

I'd love for someone to tell me a story about something they couldn't solve (or at least not the way they wanted to).

If they can't come up with something, which is rare, I ask them to tell me about something that was fun for them.


>> I'd love for someone to tell me a story about something they couldn't solve

I was in twelfth grade. I was given some EEPROMs which I had to write data to, except that I did not have the standard equipment to write to them. I used a printer port to drive an amplifier circuit I built, which in turn sent the voltages to the EEPROM. I sent waveforms exactly the way the datasheet suggested. Yet I wasn't able to read back what I was writing.

I had no oscilloscope or waveform analyzer to debug. All I could do was to re-read the data-sheet and then my program for correctness.

I could never figure out why it wasn't working.

Later, my Dad found someone who did have the company-supplied EEPROM-writing equipment and took the EEPROM to them. He learned that there was data on just the first few locations.

This is one of the very few projects where I have failed. Being in twelfth grade then, doing stuff that college grads would fail at, I have not taken offense at myself. :-)


Then the guy starts describing the problem he solved over his last 6 months.

And you realize you've done about the same, fully finished and shipped, in about 3 weeks.

The rest of the interview is wondering whether you should cry or he should.


One of the most complex on the front-end was a repaint/reflow issue in Safari that was complicated by the way we were using Angular.

The easiest solution was to use transforms to force rendering through the GPU render pipeline by adding a Z-depth to the elements.

Which caused rendering issues with font-weight in Firefox. We never resolved that issue, even after a root-cause analysis showed the bug was in WebKit and not Blink or Gecko.

On the backend, it was finding a way to store a persistent collaborative changelog with proper access control and hierarchy on top of an RDBMS. We ended up redesigning a distributed file system based on HFS+ and btrfs for its COW and COR obligations. This was the problem that demanded the most data-structure and infrastructural knowledge I've had to address.


Honestly, the hardest technical problem I ever ran into was teaching myself C pointers and keeping at it until I fully grokked them. Now, this was in the early 90s, and I was a loner with only a second edition copy of the K&R book. There was no Stack Overflow, and the only technical people I knew were on the other end of a BBS connected to FidoNet, which only batch-updated once per night. In hindsight -- and with today's resources and ever-shrinking distance between human beings in a community -- this problem is trivial. I've seen some pretty wild things in my decades as a programmer, but I have never since encountered a technical problem that completely fucked me up like learning C pointers did back then.


A good way to think about how to answer is to look at it from the perspective of the interviewer. What information do they want to get from you by asking this question? I'd say this question is aimed at finding out how you react when you're challenged. Can you describe what made the problem hard? Maybe it had conflicting constraints or goals. Maybe you were debugging some particularly tricky problem. What did you do when you ran into difficulty? Did you throw your hands up and give up? Did you talk with teammates about potential solutions? Did you have a systematic approach or were you just trying random ideas to see if one worked?


Like some of the other comments in reply to yours, I usually focus on a technical issue that stumped me for a long while - and then a change in perspective allowed me to solve it, or understand the solution offered by someone else. It's good to deconstruct what you got stuck on once you know the fix, because then you recognize what led you astray in the first place. (((imo)))

I think this question relates to personal growth and overcoming show-stopping obstacles with retrospective analysis? Something something smart-person-speak.


It always bothered me a lot too, but I was "lucky" to encounter a rather niche Chromium bug last year, so now I have it covered.

Generally, when you actively work on weird bugs and try to really understand what's going on, instead of doing quick hacky workaround, sooner or later you'll face some interesting bug. But it's sometimes exhausting to investigate stuff like that, plus most of the reasonable managers will try to prevent you from going down the rabbit hole if the bug takes too long to fix.


When i started out, a lot of things were hard.

Now, I can usually think of three decent ways to do anything. Nothing really feels "hard", it's just a different amount of work.

Another angle is that the way to solve "hard" problems is finding a way to think about it that makes it easy. Once I've done that, I no longer think of the problem as "hard".

I think the real issue here, that I don't fully understand, is what interviewers are really asking with that question? What do they want to hear?


I've had the same problem. I used to have what I thought were OK answers to those questions, but now it's hard to choose. It's especially hard if they scope it down, e.g., what's something you're proud of that you've done in the past 3 months. What would be worth being proud of after 3 months? It'd have to be an exceptional project to warrant that. Otherwise, little bugfixes are routine, and even if they're clever, they're hard to talk about both because the details get discarded and because it's hard to provide the necessary context.

Interviewers are being lazy with that question, essentially. They're saying "Wow me so that I can know you're the most impressive."

This is a problem if you don't think of interviews as a competition over who's the most sparkly (also, who's the best storyteller and/or who had the best script).

My experience is that people are shockingly bad at interviewing. They throw all the work onto the candidate and expect to get good hires that way, which is rarely successful.


Instead of taking the question too literally, just think about a recent project that had some technical challenges you would like to tell the interviewer about.


Okay so I built shit version of Google Maps single handedly, from raw map data, before they had an API in 2005. It "worked" in IE5.5. Does that count?


One way I play this is to be like, "well I've blocked out the hardest problems, probably due to trauma, but here's a problem that I worked on that might apply to what you guys are doing here."

I find this easier because usually hearing the interviewer talking about things will trigger my memory as to when I was working on similar problems. It's probably better for them to know a relevant example anyway.


Well, one hairy problem I had was migrating a legacy enterprise behemoth from a 4GL to Java (this was the early 2000s).

Now, 4GLs let you do anything easily, so nobody really put thought into anything. Result: everything was soft-coded and the database grew to around four thousand tables. The database itself wasn't even that big, running at around 10 GB.

The sheer number of tables made it impossible to use an ORM layer, because back in the day Hibernate and the others had no option but to map everything at startup time from XML files or annotations and keep all the metadata about tables and relationships loaded in memory. Just the metadata was using about 5 GB of memory.

However, as part of the migration we managed to build all the UI straight from the 4GL definitions, so we really, really needed a way to create queries out of the UI metadata using object introspection.

We ended up writing our own object query language and the translation layer to build SQL queries out of it. It sounds bad, but in the end it wasn't impossible even for a small three-man team: we didn't need to support the full spectrum of possible interactions, only what the UI needed to load the data (and yes, this was a thick client).
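A toy sketch of that idea (the real system was Java; this is Python for brevity, and the names and feature set are hypothetical): translate a filter spec derived from UI metadata into parameterized SQL, supporting only equality filters because that's all the UI needed:

```python
# Sketch: a tiny object-query-to-SQL translator. Only column
# selection and equality filters are supported -- mirroring the
# "only what the UI needed" scoping described above.

def build_query(table, columns, filters):
    """filters: dict of column -> value, derived from UI metadata.
    Returns (sql, params) with ?-style placeholders."""
    sql = "SELECT %s FROM %s" % (", ".join(columns), table)
    params = []
    if filters:
        clauses = []
        for col in sorted(filters):          # stable clause order
            clauses.append("%s = ?" % col)
            params.append(filters[col])
        sql += " WHERE " + " AND ".join(clauses)
    return sql, params

sql, params = build_query("customer", ["id", "name"], {"country": "IT"})
print(sql)     # -> SELECT id, name FROM customer WHERE country = ?
print(params)  # -> ['IT']
```

The translation layer stays small precisely because the query language is scoped to what the generated UI can actually express.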


I think how hard something is can't easily be judged without context by anyone who isn't very senior. I had a very good experience when interviewing at Facebook except for the part where they ask you this question. Either they were asked to respond rudely, or they really didn't think anything outside of adding stories to things was interesting/difficult.


Yes, it's kind of difficult to define the 'hardest' problem. Some issues I considered 'hard' were hard more because I was missing relevant knowledge in the field than because of the nature of the issue. And of course, I don't want to say "I spent a few hours learning something to get things done, which was just changing a property value".


I got the same vibe. Nothing I work on is really hard. It takes some time and focus, but most of the stuff in software development takes time and focus, unless it was already done (and if it was done, why redo it?).

Your example of changing block to inline-block can very well take time and effort depending on the issue at hand. So yeah, this is a very vague question in my opinion.


“If you continue this simple practice every day, you will obtain some wonderful power. Before you attain it, it is something wonderful, but after you attain it, it is nothing special.”

That's why we can't look back at something as "hard". Or maybe it's not. It's a good time to read that book again.


Another problem with the question: if the hardest thing was something you solved then you're probably not stretching yourself. If I answered this I'd really be answering "tell me about an embarrassing failure."


I felt the same way. But it can be anything. "I sped up the page load on the site", "I redesigned the front-end to work on mobile", etc.


If we are talking about front-end, gradually migrating the Backbonejs/Marionettejs codebase to React/Redux/Webpack.


I'm not a developer, so my answers are different, but I've got a handful:

- Consulting for a customer where they were deploying to new hardware with a new processor architecture, I received a report that an application was running slower on the new servers than it was on the old ones. I started out looking at things with strace and ltrace, had to move deeper and pull out perf and systemtap, but found that it looked like memory access was slower than on the old hardware. I did research on the processor, and found that it was due to the 'Intel Scalable Memory Buffers'. Since memory first had to be loaded into the buffer before the CPU could access it, things not in the buffer already had higher latency, but things already in the buffer were much more quickly accessed than they would have been previously. I worked with the developers to make up for this performance decrease in other ways. Their application was well suited for using hugepages, but they were not, and TLB pressure was causing performance bottlenecks in other areas. Switching to hugepages prevented TLB pressure, and the application ended up being even more performant on the new platform due to the increased amount of available memory allowing for a large amount of hugepage allocations.

- I was consulting for a customer that was running instances on a xen platform. They were having performance issues vs. their old bare metal deployment, and had already done some analysis. They gave me a perf report that was showing a massive amount of time being spent with a specific xen hypercall. I had to dig into the xen source code to figure out exactly what that hypercall was doing, as general public documentation about it was somewhat vague. I was able to determine that it bundled up a bunch of different operations, so it wasn't conclusive from that, but it did narrow down the possibilities. It was enough to point me in the right direction, however, and I was able to determine with a little bit of trial and error with some tweaking that it was ultimately related to decisions NUMA was making. It turned out that the customer had thought they were doing NUMA node pinning, and ultimately weren't. Interestingly enough, even with pinning, we still saw some of this, and completely disabling NUMA (all the way - not just balancing) actually ended up being needed to fully reclaim the lost performance. I also learned an important lesson in trusting customers - even the ones that know what they're doing aren't always right, and while I should trust them in general, verifying their answers is important. I discounted investigating NUMA as early on they told me they had their applications pinned to nodes, and I would have otherwise investigated that more quickly and probably solved the issue in less time.
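As an aside on the hugepage tuning in the first story: the allocation counters live in /proc/meminfo on Linux, and a few lines of parsing make them scriptable. A sketch (field names are the standard kernel ones; the sample text below is made up for illustration):

```python
# Sketch: parsing the hugepage counters that Linux exposes in
# /proc/meminfo. The SAMPLE text is invented; on a real system
# you would read the file directly.

def hugepage_stats(meminfo_text):
    stats = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        if key.startswith("HugePages") or key == "Hugepagesize":
            stats[key] = int(rest.split()[0])   # value in pages or kB
    return stats

SAMPLE = """\
MemTotal:       264065576 kB
HugePages_Total:    4096
HugePages_Free:     1024
HugePages_Rsvd:        0
Hugepagesize:       2048 kB
"""

print(hugepage_stats(SAMPLE)["HugePages_Free"])   # -> 1024

# On a live system:
#   hugepage_stats(open("/proc/meminfo").read())
```

Watching `HugePages_Free` while the application warms up is a quick way to confirm the allocations you configured are actually being used.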


Trying to reverse-engineer the bit-banging protocol for a network card using the specs and a Linux driver.

Eventually I just gave up.


Dealing with Usenet header data. The big alt.binaries.* groups can have upwards of 10 billion headers.


Talk about a problem of the same type as those likely being faced by the interviewing company.


> What is the hardest technical problem YOU have run into?

Layer 8, i.e. human beings. The software side of stuff I can eventually solve by hammering at the keyboard until it works. But the people using it, and the ever-changing requirements they have - especially since this influences my software design - are definitely the hardest part.


Debugging an undefined-behavior-related heisenbug. Why you askin' such easy questions?


Whenever I'm in an interview and this question comes up I have similar issues as you. Even though I can think of certain particular problems that were a pain in the ass for me, a lot of the times someone came and solved it much faster and in a more clever way than I did.

However it just occurred to me that maybe the hardest problem I've had was actually making up an architecture from scratch as the problem was unfolding itself, and then having to maintain it and even bring others aboard. Meaning I had to document as much as I could (even though I had very little time for this) and I also had to sometimes give more priority to a not-so-important bug (vs a very pressing issue for me), not because it was critical to any feature but because it was making it very painful and hard for a teammate to implement one which in turn would later delay some other feature.

And the major reason why there was no actual planning to avoid this as much as possible, was because features were being decided on the go by the top brass on a case by case basis, completely opposite of the original direction I was told we were going to go (which was the information I used to lay down the foundations of the project). I.e. I was told at first that this was going to be just a wrapper script and it ended up being a whole orchestrator including multi-node operations needing result consolidation, a state machine to track down the... uhmm...state of the system, and things like that.

So my point is that there are probably several axes of "hardness" in a problem that can be mixed together, and that makes it difficult to compare one problem to another (i.e., over which combination of axes are you comparing them?). I guess part of the response to such a question in an interview would be to explain the context, so that it can be more easily understood why that problem was perceived as hard and over which axes. Was it because the problem was an optimization one and the previous code was impossible to work with? Was it because the business constraints (as I believe was my case) were surreal? Was it because the teammates made it really hard to move forward (e.g. bureaucracy, defensive/aggressive coworkers, etc.)?

And I know we are talking about "technical problems", but I find it increasingly hard (as my career advances) to make a distinction between what is and what is not a technical problem. If business constraints dictate that a certain sub-optimal solution must be developed, and that in turn causes technical issues, was that a technical problem? If a teammate is disruptive and introduces sub-par code that later causes bugs that need to be addressed immediately, was that a technical problem?

In my mind they probably all are to some degree just by virtue of in the end influencing whatever technical decisions are being made. So maybe that could be part of the answer? asking about what specific sense are you referring to when you (the interviewer) ask me about the hardest technical problem.


It's all hard until you swim in those waters for a while.

I've got two answers that I would probably consider.

#1: debugging what ended up being a hardware problem. I was working on a device with a microcontroller, and it had a sleep mode where the micro would program an RTC, shut itself off, and the RTC would trigger the board's wakeup circuit when its alarm fired. I'd already told the board designer of two or three hardware bugs that somehow (surprise!) turned out to be my software bugs. So this time I was a little more cautious. There was a more senior software engineer working with me, and he told me to check the schematic. I looked at the processor manual and the board schematic, and followed the traces to make sure I was doing it right. And I just couldn't find out what was wrong. So the senior SW eng said, "well, ok, if you're sure, then just probe the RTC pin with a scope." Wow. An o-scope. WTF is this gloriousness? So I got to learn a bunch about how to go from the board schematic to the board layout, how to probe, what all the stuff on the scope was about. Sure enough, the RTC alarm went off on schedule, but the trace showed some funny stuff that indicated there was a design error in the board somewhere (I didn't understand the details, but IIRC a cut-and-jump on the prototype made the bug go away).

Motto: It's never a hardware design bug. Until it is.

#2: This bug I learned a good amount from. I would see frequent misbehavior in my code where it looked like multiple subsequent sessions were being corrupted somehow, perhaps from a previous session. I was certain that I was releasing resources from the previous session and destroying all of it. I watched my code hit my `boost::shared_ptr<foo_t>::reset()`, so clearly it was now gone. Right? Well, shared_ptr<> is not all it's cracked up to be. So I went back to read the conventional advice about shared_ptr<>, and people would frequently suggest boost::weak_ptr<> where appropriate. I mistakenly thought about these as a dichotomy for some reason. But that was no good, because I couldn't share my weak_ptr<>, so it's not really useful. Except -- wait -- the vast majority of the time I'm propagating my shared_ptr to places that don't need to share it beyond themselves. So my design would actually be better if I shared the shared_ptr as a weak_ptr anywhere other than Right Here. In doing this redesign, I realized that the weak_ptr promotes itself temporarily by effectively asking "hey, is this still allocated somewhere?" It turns out the other thread using this resource would occasionally take slightly longer and wouldn't decrement its shared_ptr until after the new session had started, which meant that the old resource was never destroyed. After the redesign, in the case where the background thread loses the race, it would just fail the weak_ptr<> promotion and harmlessly skip its activity.

Motto: shared_ptr<> and weak_ptr<> help preserve an ownership metaphor. Which code Owns this memory/resource and which code is just "borrowing" it?
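The same ownership metaphor exists in Python's stdlib, for what it's worth: `weakref.ref` plays the `weak_ptr` role, and a failed promotion (the ref calling back as None) is the harmless skip-the-work path described above. A sketch:

```python
# Sketch: the owner/borrower metaphor with weakref.
import weakref

class Session:
    def __init__(self, name):
        self.name = name

# One place owns the session (the shared_ptr); background code
# only borrows it through a weak reference (the weak_ptr).
owner = Session("current")
borrowed = weakref.ref(owner)

# Promotion succeeds while the owner keeps the session alive...
s = borrowed()
assert s is not None and s.name == "current"
del s                      # drop our temporary strong reference

# ...and fails harmlessly once ownership is released, instead of
# silently keeping the old session alive past its lifetime.
del owner                  # CPython refcounting collects immediately
assert borrowed() is None
```

The design lesson carries over directly: only the true owner holds a strong reference; everyone else promotes on demand and skips work when promotion fails.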


Dealing with legacy code.


>> What is the hardest technical problem YOU have run into?

I have solved about ten "hard" problems in my career, most of which has been in R&D. Each one of these had multiple prior failed attempts, and in some cases took me months of thinking before I could find a solution.

1. Qualcomm wanted me to devise a computer vision solution that was more than two orders of magnitude more power-efficient than what they had then. There was clear justification as to why such a drastic improvement was needed. Nobody had a solution, in spite of trying for a long time. Most laughed it off as impossible. I started by looking for a proof as to why it could not be done, if it indeed could not be done. After some three months of pulling my hair, I started getting glimpses of how to do it. Some three months later, I could convince myself and a few others that it was doable. Some three months later, the local team was fully convinced. Some three months later, the upper management was convinced. You can read the rest here: https://www.technologyreview.com/s/603964/qualcomm-wants-you...

2. I wanted to solve a specific machine learning and artificial intelligence challenge. I would code for a day or so, and then again run into days of thinking about how to proceed further. E.g., I coded a specific parser algorithm for context-free grammars, including conversion to Chomsky normal form, in 1.5 days including basic testing. However, what next? I woke up with new ideas for about ten days in a row. I conceived Neural Turing Machines back in 2013, about a year before Google came up with their paper on the subject. (Unsurprisingly, I did not have that name in mind for it back in 2013.) I also did not get an actual opportunity to work on it, as a result of which I am still not sure if I could have actually done it.

3. I needed to make a very sensitive capacitance measurement circuit, trying to get to attofarad-scale floating capacitance even with pF-scale parasitic capacitance to ground. The noise and power requirements were very challenging. After about three months of seeking input from the team lead without hearing a solution, I ended up coming up with one myself. I later discovered that the technique was already known in RF circles, though only a few were aware of it. Capacitance measurement circuits with such sensitivity did not show up in the market for several years. (My effort was targeted at use inside a bigger system.)

4. I was working on measuring bistable MEMS devices. The static response of these was well understood. However, so far the dynamic response had only been measured by the team; there was no theoretical explanation behind it. We invited several professors working in the field to give seminars to us, and asked questions about this, but never got a good answer. A physicist colleague found an IEEE paper giving the non-linear differential equations behind it, which worked, but provided no insight into the device behavior, and took time to solve numerically. I wanted a good enough analytical solution. I kept on trying whenever I had the time, while the physicist colleague kept telling me to give up. Six months later, I woke up with a solution in mind, and rushed to the office at 7 am to discuss it with whoever was at work at that time. The optics guy I found did not fully understand it, but did not find it crazy either. A few hours later, the physicist friend confirmed my insight by running some more numerical solutions. I could then soon find tight enough upper and lower bounds, and the whole thing fit the measurements so well that most people thought it was just a "curve fit". (It was pure theory vs. measurements plotted together.)

5. I proposed making pixel-level optical measurements on mirasol displays using a high-resolution camera to watch those pixels after subjecting them to complex drive waveforms. Two interns were separately given the task (surprisingly, without telling me), and both failed to develop algorithms for pixel-level measurements. Later a junior employee worked on it and was still unable to develop pixel-level measurements, though he was able to get it to work at lower resolutions. That system took about 40 minutes of offline processing in Matlab. Later, a high-profile problem came up where pixel-level measurements were a must, and I was directly responsible for solving it. Solved in one day. It processed images in real time, not in 40 minutes. The system stayed in deployment for years to come.

6. We had bistable MEMS devices, and there was a desire to make tri-stable MEMS devices. Several people at the company attempted it, including a respected Principal Engineer, but no one could figure out how to even start. I could not figure it out at the outset either, but started bottom-up from the physics, using Wolfram Mathematica to create visualizations around the thing. And bingo: in a few days, I had not only figured out how to make these tri-stable MEMS devices, but also multiple schemes for driving them. My VP's reaction was "Alok, you should patent that diagram itself", given the clarity it had brought to the table.

7. We were creating grayscale/color images using half-toning. A famous algorithm, Floyd–Steinberg, works very well for still images but has lots of artifacts for videos. A PhD student working in the field was brought in as an intern; nevertheless, the results were not great. The team also tried binary search algorithms to find the best outputs iteratively, but that was not implementable in real time as needed. I was interested in the problem, but was not getting the time to give it the fresh thought it needed, until one day I did. A few days later, the problem was solved. I had developed some insights into it and just had the solution coded, to the surprise of people who had spent months working on it.

I could go on writing about more cases.


"Hmm, so, great. That's interesting.

So, um, how would you say your skills deploying to NodeJS are? Would you rate them as strong? Tell you what, let's go ahead and break for lunch now. Sam is going to show you around the campus a bit, and then we'll continue with a follow-up and some coding challenges."


Writing a Java compiler (not the full language, but a large subset including inheritance and polymorphism), writing a C++ game engine.

So much work involved. Very complex problems, needs a lot of theory but also practical knowledge. Needs good debugging skills. And endless amounts of time.


If you don't agree with this slightly-contrived (I'm talking about the 'start with punchline', specifically) storytelling technique endorsed here, at the very least, please be aware of the STAR technique commonly used in behavioral interviews.

One benefit of using the STAR technique is that you are not going to ramble. It should not take you more than 1 minute to fully lay out the Situation, Task, Action, Result. After that "executive summary", if they want you to go more in depth, the interviewer(s) can ask you.

https://en.wikipedia.org/wiki/Situation,_Task,_Action,_Resul...


You could use the RATS technique. ;)

I feel like STAR is important in the same way and for the same reason as 'start with the punchline'. Both are good ideas, and both are aiming for 'keep it short and relevant.' Which, having interviewed and hired many people over the years, I'd have to say is reasonably good advice.

There are plenty of exceptions to both of these ideas though. I probably have more trouble getting engineers to elaborate on something than I have with them going on for too long. I quite enjoy a candidate who will help me carry a conversation, who will ask questions of me, who will offer and inject relevant or interesting side-details into their story. Going on a tangent isn't a bad thing unless it's negative or irrelevant.


Resume writer here to add that the STAR technique applies to resume writing as well. For any accomplishment on a resume, it's ideal to have these four items covered in at least some detail.


Yeah, trying to create a model of 'how you should talk' is probably often counterproductive, as everyone responds differently. Your suggestion of STAR is better, as it is far more generalized while still giving you a guideline to focus on so you don't get lost or off track.


Yup, I've always known STAR as the "gold standard" for these kinds of questions. Also the presentation technique of: "Tell the audience what you're going to say, say it; then tell them what you've said."


+1


This is one of the best blogs on the topic, and as someone who has easily cracked interviews at all the big tech companies, I can say this is a good piece of advice.

I will make the following broad points:

1. Never walk into that room without practicing. Practice before a mirror, practice before a friend, practice in the car. Have a written script and optimise it to remove redundancy, highlight achievements, etc.

It is not about repeating what you have practiced, but about having a free-flowing conversation where you don't have to struggle for words or sentences, all while maintaining a confident posture.

2. Converse not interview

A lot of people fail to keep the conversation going. It is not like an FBI investigation. It is more like friendly banter. Think of a scenario where you are talking to a potential roomie. It is okay to walk out of that interview without an offer, but then you should feel good about having conversed with another geek just like you.

---

Maintain the mindset outside of interview preparation. Most people fail at this.

Good interview preparation begins months ahead. You need to look at your co-workers' code, give them feedback, learn to make needed improvements in your existing code, solve algorithms, and discuss technical problems on Stack Overflow and elsewhere. Build a mindset where you are able to talk about technical work with other people. Speak more, listen more, and advise more, starting at least 3 months ahead.


>2. Converse not interview

This is good advice but it applies to both sides.

The best way to learn about a person's technical background is to start from a common base and go over their experience. I like to start talking backwards from their resume, and say "OK, Job A. What were you focused on there? Your description mentions technologies B,C,D. How did you apply them?"

You then just take it from there, pick up on the things they discuss to get into the technicalities. Ask them hypotheticals. Ask them how that technology could apply to a different problem set. Ask them about things that annoyed you specifically about those technologies in the past and how they addressed/resolved them. etc.

This is the best way to interview, in my experience. It keeps the pressure low, it doesn't waste time on rehearsed answers, it doesn't waste time on whiteboarding unless it comes up (a very basic take-home project (30 minutes) should be given pre-interview), and it lets the person discuss their experience and provide real feedback about the things they've learned. It gives them the opportunity to discuss their technical habits, values, and interests. It reveals the most about the candidate in the minimal amount of time.

So many of my colleagues would lock up when they'd go in to interview people and not know what to do. They'd sit there and just expect the candidate to know what they wanted to see and carry the whole thing. They'd print off a list of questions that they found from a site about how to interview people, or they'd give them a code trivia quiz that is a massive waste of time for everyone.

All of that is very silly and misses the point. Everyone just needs to relax and hold an unscripted technical discussion. You can go in with an outline to make sure you hit the topics intended in the course of the discussion, but shouldn't need more than that.


Yes. In fact, there are very few blogs and very little advice for interviewers.


If that's what it takes to get a job as an engineer at the big companies, it's not particularly surprising that the quality of their engineers has declined as they've grown.

The skills of a con artist are not related to the ability to build good systems.


>> learn how to have a conversation

> The skills of a con artist are not related to the ability to build good systems.

lol.

Communication is about moving information through someone's senses and into a model constructed in their head. If you can't effectively communicate about yourself, the interviewer is going to make more inferences about you and may focus on areas that aren't your strengths while not even knowing to ask about strengths you think are very relevant.

It would be great if they could just sense your innate value through your aura, but it's not going to work. Being able to talk about yourself may feel uncomfortable or like self-aggrandizement, but that's actually a great reason to practice it. The interviewer wants to learn about you but also has a bunch of other explicit and implicit goals (get through the interview questions, to not be incredibly bored, etc), so there's no reason not to do a good job at honestly telling them about yourself.


Drawing a conclusion from this, if a person has a disability that leads them to being unable to interact until they understand the nature (rules) of the person or group around them, then they're more-or-less screwed in an interview situation.

That does explain a few things. I am REALLY good at my job, and after working a job for a couple of years I become very good because of my disability, yet the same disability means I don't know how to interact with people so I interview poorly.

I don't know how to talk to people I haven't met and just doing it for practise isn't the way to learn. I really can't spare the 10 or 15 years it would take.

Ultimately, when someone asks something that's unexpected, or makes a claim that's false or incorrect, I just freeze up and can't actually respond. In daily conversation with people I know it's not a problem, but with strangers I have to stand back and wait until I have a grasp of their sense of humor, how much they think of themselves, and so forth, before I can speak.


Good communication skills are certainly essential to being a con artist, but they are also essential to working with other people who are almost certainly going to be very different from them. If someone is incapable of explaining something to another person or discussing and agreeing on a course of action, that's somebody who probably doesn't know a fraction of what they think they know.

For every gregarious person who uses their communication skills to fake competence, there's at least one person who is convinced that they are a misunderstood genius, whose lack of supposedly BS communication skills has kept them from ever being fully appreciated by the "normals" they think they are better than. You don't want to be either one of those people; they are equally useless when working on hard problems.


Communication skills have nothing to do with blowing your own horn.


Conversely, the skills to build good systems are not related to working effectively with others. There's probably a balance that needs to be struck, though they're also not mutually exclusive.


> the skills to build good systems are not related to working effectively with others

Yes they are if you're talking about the types of systems that large companies have. It is in fact so difficult that I believe it's a considerable advantage to have scopes small enough to be manageable by a single engineer, but inherent complexity is often well beyond that, especially for very profitable business engines.


There isn't a single work project that I've been part of in the last ten years where communication wasn't important. If you work with anybody else (and if you are writing software for other people to use the then you should be), then communication is extremely important to build good, effective and useful systems.


> The skills of a con artist are not related to the ability to build good systems.

I think it is very, very rude of you to call it the skill of a "con artist". I have seen teams of average individuals achieve a lot more than several very intelligent people, simply because together they worked a lot better. Any company that ignores the communication and personal skills of its engineers is bound to fail.


You're suggesting constantly practicing talking about yourself and highlighting your skills. Building confidence is exactly what a con(fidence) artist does.

I'd rather have engineers that can actually discuss technical problems rather than deliver smooth talk about how they are incredible.


> If you’ve been through interviews at some companies that are not as good at interviewing, then you probably had some questions on your list such as

> Where do you see yourself in 5 years?

Dead.

> Why do you want to work here?

You have money.

> How do you handle disagreements with coworkers?

Attempt constructive engagement, and if that doesn't work then shun them.


Sorry, but you are too honest to work in our industry.

Have you thought about going into politics or drugs instead?


Sorry, no culture fit.


> > Where do you see yourself in 5 years?

> Dead.

Did you say you have a three-year vesting schedule? I'll be at another company probably, since you'll have me doing two jobs for my original compensation and title after the second year.


What are your strengths and weaknesses?


Honesty and honesty.


You forgot "humor".


and "humor".


Brevity.


I see some criticism on this point, but for me this passage is a gem.

> In general, real stories are told chronologically backwards. This is why we start off with a punchline. In contrast, practiced stories are told chronologically forwards. It’s a solid indication as the interviewer that the person is reciting something they have committed to memory if they tell the story forwards, and in turn it’s significantly more likely that the story isn’t entirely true.

I have a friend who - bless his heart, I adore him, but he can't get a quick story out to save his life. Every point he makes, he reserves the punchline for last, and he starts by going off on a back-story tangent first, which usually forks into multiple back-stories. I've been trying to nudge him to turn it around and give away the punchline first, but he's deeply convinced that good stories are like movies and need to have a backstory followed by a narrative arc that doesn't make its final point until most of the way through act 3.


> In general, real stories are told chronologically backwards. This is why we start off with a punchline. In contrast, practiced stories are told chronologically forwards. It’s a solid indication as the interviewer that the person is reciting something they have committed to memory if they tell the story forwards, and in turn it’s significantly more likely that the story isn’t entirely true.

This is awful and just completely untrue. Many companies that take the time to do interviews properly will have something similar to STAR or SOARA implemented, and you'll be starting with the situation, moving on to the tasks/targets you wanted to complete or hit, then the actions you took to achieve them, and the results of what you did. This is chronologically forwards.

This comment is the kind of pseudoscientific crap that makes interviewing a crapshoot, and is a good indication of an unstructured interview.


Interesting opinion. I find the quote somewhat true, and the article's broader point mostly true and rather valuable.

The broader point of that quote is that a dynamic conversation usually does reveal more truth and paint a more accurate picture than a practiced story. I find that to be very true.

I feel like you might have misunderstood the article and decided it was wrong before taking the time to understand. That could be an indicator of poor writing in the article, or of excerpting and discussing a quote out of context, but is it helpful to respond with hyperbole?

STAR & SOARA do not dictate a chronology, so they are orthogonal to this point. But their goals align with the article & this quote almost entirely, if you think about it.


The quote is quite explicit - 'If you tell a story chronologically, you're more likely to be fabricating it'

I'd argue that my response isn't hyperbolic, and is justified considering how ludicrous that statement is.

How do STAR and SOARA not dictate chronology? You specifically discuss the initial situation first, and the results you achieved last (or the analysis of the results).


The quote didn't say "fabricating", it said "not entirely true." The way you're interpreting the quote, it would be ludicrous, I can agree. The way it was actually written, along with what I interpret to be the broader point, I think the article is somewhat true, and has a valuable message.

STAR and SOARA are a way for the interviewer to drive the requests for information, force a conversation, try to frame the question so that candidates can be more easily compared, and prevent the candidate from rambling and offering irrelevant information. The article's suggestion has the same goal, aside from the truth detection part, which I'm downplaying here.

Don't focus on a single quote and ignore the article's larger context. The author also said "If you get too far into a story without making sure they are still with you, it comes off to the interviewer that you cannot explain things well." and "If it’s not obvious yet, force the interview to be a conversation." All of the sections lead to "force conversation", if you can get past the part about speaking backwards being more truthful.

Conversations almost always run backwards, in portions. Anytime you answer a "why?" question for example, you're telling the first part last. I suspect that's what the author was trying to say, less that narratives should always be presented backwards, and more that conversations are desirable and conversations often run backward.


Oh man, that's me for sure. I always backstory my tales to death. It does successfully take a minor conversation and turn it into an elaborate and interesting discourse but yes, as you say, it's not focused on the original punch line.


I never realized it until now, but I think I might be doing that as well. I will try to be mindful of that. Thanks.


There's a time and place for a good long meandering and elaborate backstory! Interesting conversation and discourse is something you should cherish and continue to develop. So, don't get me wrong, punchline first is not a rule, not always the goal. Just be aware of it, and you can start to decide which one to use. For job interviews, punchline first is good advice. For talking to your friends, it depends. For giving a talk or telling a story, backstory might be critical. Start playing with it and see how people react, find out when & how starting with the punchline is better. I think the point of the article is that punchline first is a tactic to get the other person talking and asking questions, rather than you talking for 5 minutes.


I'm pretty sure I tell my stories chronologically forwards. What a strange accusation.

(This is one of those things that's going to bug me for a while every time I tell a story.)


The assertion in the OP is that forward stories are practiced beforehand and are thereby less likely to be true. I am introspective in nature, so I do have many (true) stories thought over again and again.

I would rather disagree with the idea of telling stories backwards. We aren't doing Memento things after all! :-) It's best to tell the (true) story the true way, the way it happened.


I'm not sure the "backwards" suggestion was meant to be taken quite so literally, my interpretation is that interrogative conversation, as opposed to narration, is the goal. Conversations, as opposed to narratives, do frequently go backwards without us even knowing about it or thinking about it, it probably happens more than you think.

"Hey I wrote this code"

"Why did you do that?"

"Because the frobnobbitz wasn't accounting for tribbles."

That's a conversation that runs backward. Anytime someone asked you 'why', and you answer, it's backwards. You don't have any control over what direction it is because the person asked you a question.

I'm really not sure, but I suspect my own stories are less likely to paint an accurate picture of something than an interrogative conversation is. All humans have cognitive biases, so I wouldn't rule that out just because I'm trying to tell the truth.

I've watched people practice true stories, many times, to a point that they become misleading. Not untrue, just misleading. (NB the author said "not entirely true".) Events are left out, motivations are made to look better intentioned than they were, etc. etc..

To be fair, I've seen plenty of lies made up on the spot too, so I'll refrain from defending the claim that one is more likely to be true than the other.

I still see value in placing conversation over narrative in a job interview.


Yeah, same here. Sometimes I might give away a bit of the punchline by asking if I've told them about X, before launching into the story. But generally, it's left for last, like all good stories. Otherwise, it's the definition of anti-climactic.

When giving information to someone, sure, lead with the important stuff. But when telling a tale? Pff, no.


I can understand putting the headline first (Let me tell you about the time I re-wrote the widget code for Acme.), but I don't think I ever tell the story backwards... maybe I am just strange in that I appreciate a good (concise) story?


Not strange at all; there's a reason movies are the way they are. It depends on context: there are times to put the punchline first, and there are times to tell the story backward.

For that matter, there are classic examples of movies that tell the story backward, or give away the ending first.

There's some difference between headline first and punchline first, but either way the real point being made was turn it into a conversation by giving the shortest possible answer first, and letting the other person request the backstory as needed. Make sure they're doing some of the talking and driving the direction of the story. Make sure they're interested and controlling the direction and amount of your narrative.

OTOH, if you're in a setting where it's expected that you'll take ten or forty five minutes to tell a good story, then a narrative arc that increases tension for a while is probably a really good idea.


That makes much more sense. I was thinking of 'telling the story backwards' being telling the same story, just giving the details in reverse of how they happened.


Seems like a questionable heuristic. It is not difficult to imagine that a person who is telling the story from scratch might start from the beginning and kinda meander around the details as they try to reach the end, precisely because they have NOT rehearsed the story beforehand.


> If you give them a resume, expect questions about stuff you worked on at your past jobs. If you gave them a link to your Github profile, expect questions about your projects. If you gave them a link to your Stack Overflow account, expect questions about some of your answers.

On my Linkedin (and also resume etc.) I give a link to my blog / github. Every time I've been asked about it in an interview setting, it was actually when I was the one conducting the interview, and the interviewee was trying to impress. Much as it pains me to say, I don't think side projects are a good way to bolster your CV, at least in my field.


I've only asked questions about Github projects when they have something on there that wasn't obviously from schooling or just garbage they threw on there to keep for later. I'm pretty certain we're in different industries, but I'm still surprised nobody has asked about your Github stuff, assuming you've put decent things on there.


Doesn't surprise me, but if you're handing it out, you should still be prepared to answer questions about the stuff that's on there.


What is your field?


Google has published some of their data on this and behavioral questions about teamwork (e.g. handling disagreements) are reasonable and can be valuable. Software development is usually a team activity. The rest of this post is a useful, if anecdotally sourced, guide to answering the more technical class of behavioral questions. It should include a block on follow up questions. Good behavioral interviewers, like you might find at Google or Amazon, will ask specific follow ups for each question. See: http://www.businessinsider.com/google-laszlo-bock-interview-...


Thanks for posting this. One thing I didn't write about is the objectiveness of interviews (felt like a different topic). Just about any question can be okay as long as it follows a few simple rules: 1) if you ask a question, you should have an expectation, in some quantifiable way, of what you're looking for as a result, and 2) you should know how that result affects your decision to hire or not. The worst interviewers usually don't have an answer for #1, and they probably decided whether to hire you very soon after you opened your mouth for the first time. This article fails to state why each of these questions is valuable and what you should be looking for as a result of asking any of them. With that addition, it would be a much better article.


Let's translate shall we?

> What is the hardest technical problem you have run into?

"Tell me an unverifiable story in which you're the hero."

I really hate this question:

> Where do you see yourself in 5 years?

Any post on HN about interviews draws a ton of comments, and they're usually the same comments as every other post on the subject.

Honestly at this point having gone through a reasonably large number of interviews I think it comes down to brushing up on basic CS knowledge and, more importantly, whether or not they like you. As much as we like to make interviews dispassionate assessments of proficiency it really does seem like basic chemistry is the key issue. And honestly that makes a certain amount of sense: most people don't want to work with someone they dislike.


Also, how badly they need to fill that position. You will get waaay less bullshit if they are in a hurry.


Good advice. My first (sometimes only) question when I interview people is: Tell me about your favorite project.


This is a good question if your goal is to hire people who can talk a good talk about an unverified favorite project. It also assumes that someone has a clear favorite project ready to discuss. People who do not are put at a disadvantage. (Though I do understand this question is well-intentioned.)

The article doesn't really justify the process people go through as a good one. People may think they have a good approach to interviewing, but their sample size is too small to back it up; worse, the approach presents an opportunity to people who are good at telling stories but may not have the skills to back the story up in the end. People who hire based on the storytelling experience will eventually get burned.

I think someone should read this and feel a little worried. This is storytelling. Some might argue that telling a story demonstrates the ability to communicate, but it's merely one of many forms of communication needed, depending on the work environment, and relying on it creates a blind spot in your team's hiring process.


Having done many interviews and mentored interns and full employees, I'd argue storytelling is a _key aspect_ of performing our job in a larger team. Perhaps I'm being too liberal in my definitions, but I see substantial overlap between "talk about your last project" (and then digging into pitfalls, hacks, workarounds, conflicts; and mind you, I don't mean just PASSION projects, but literally any prior work one can speak fluently about) and "tell me how you want to spec this architecture; why".

Much of how we discuss what we do can't be empirically precise, unambiguous, and scientific. Certainly the more social and abstract aspects of our jobs begin to sound, as you put it, like storytelling.

(To be clear, it's far from the only thing I look for, and can be taught/learned to a degree, but I consider it as a key skill in the broader bucket of "communication"; alongside problem solving and base competencies. And like programming itself, having some experience helps.)


> I see substantial overlap between "talk about your last project" [...] and "tell me how you want to spec this architecture; why".

A major difference is that some people have deficiencies in autobiographical memory. This can make telling the story of a past project vastly more difficult than talking about a subject of current focus. An extreme case was recently described in Wired [1], but the ability is generally more of a spectrum.

[1] https://www.wired.com/2016/04/susie-mckinnon-autobiographica...


As I mentioned, this is playing into the fallacy that telling a story during the interview is equivalent to the communication skills required for performing the job.

It confuses story-telling performance during an interview, under the stress of a success/fail situation, with the kind of communication required to perform real work.

You're mistaking "overlap" for equivalence. Sure, there's overlap. There's also overlap in being able to type up a coherent response to a post on Hacker News, but that doesn't make me more qualified for your position.

It fails because it assumes the candidate has a single, well-defined favorite project. If you try to make the question broader, it becomes so open-ended that it is hard to compare candidates on it fairly. It becomes random chance whether a candidate discusses a project in a way conducive to doing the job. A more precise or structured question might separate people who can talk passionately about a personal project from those who can constructively explain to a manager the challenges of a project.

It also stretches the point into saying that everything is story-telling, which makes the term meaningless. I agree it is to some degree, but it's important to stick with meaningful mental models that can make predictive assessments about candidates.

The huge problem is that there are people without the skills who can tell a good story. This process lets those people through, while ignoring good candidates who may not perform well on this interview question.


I'm sympathetic to most of what you're saying, but:

> It fails because it assumes that the candidate has a single defined favorite project.

I...kinda don't want to work with somebody who can't read between the lines and pick one of their favorite projects when a question like this comes up. Social signaling and parsing are important to a comfortable and pleasant work environment.


All I see here is excuses. A good software engineer has to be able to communicate freely and be confident and decisive in what they say. Asking a question like this tests all of these things, and being an experienced interviewer also allows you to notice when it's a little forced (when a person is very introverted), or when they are making stuff up (that's why you ask more specific follow-up questions), etc.

I see more people having the skills but not being able to tell the story or communicate in a meaningful way. That seems to be a huge problem with software engineers: the communication piece is just disregarded in most cases. That's when you end up with engineers who are locked away in their own rooms and never put in contact with any external stakeholders.


Being a good engineer requires a number of skills. You named one. Guessing what arbitrary conclusion a random interviewer wants to hear and will draw is not one of them.

You can be a great communicator AND frequently not answer this question in a way that an interviewer wants to hear. There are more excuses for bad interview practices than anything else here.

People vastly overestimate their ability to detect lying and shouldn't rely on that. How do you know when someone fooled you? You don't. Why risk that when there are alternative and better methods?

Many engineers don't realize they are self-rationalizing their own interview process without any rigorous evidence. This is the opposite of good engineering.


Being able to discuss design is an essential skill for programmers. This question gives you a chance to display that.


This question too easily lets people NOT display that they can discuss design. And if that is the goal, the question obscures it. Look at all the responses to me from people deriving vastly different (flawed) conclusions.

The same vastly different interpretation of what you want to hear is going to happen with candidates. You're exhibiting poor communication if that is what you want to discuss.


I ask a slight variation: tell me about a project that you worked on that you enjoyed or are proud of. If they can't answer this, it makes me wonder whether they don't enjoy anything or are not proud of anything.


"Makes me wonder" is a euphemism for "makes me doubt their qualifications and abilities".

Some people are humble about their work and abilities. They will never exhibit open pride. Or they don't want to bullshit people and blow smoke up someone's rear. Some may want to switch jobs because they are forced to do poor work and know they shouldn't feel proud about that.

This is probably a 'flaw' I have, but I'm aware enough of it to know what I'm expected to sound like when asked this question. Now I'm put in a position where I am being dishonest. You're now testing my ability to BS you to get what I want, and by nature I am already uncomfortable with BS'ing people.

Now compare this to the person who is too incompetent to realize they shouldn't feel pride in the shoddy work they do.

Now the enjoyment part. I can talk about this extensively, but enjoyment is subjective. I could enjoy working on something because of the challenge of the problem, or because I was part of a great team, but it still doesn't speak to my ability to perform the job you have.


I don't see this as being humble; I see it as a lack of confidence in their skills, abilities, and achievements. These same people, for the same reasons, tend to avoid making decisions and second-guess their work, which causes delays and communication problems.

Instead of trying to make this into an excuse of being humble, it should be acknowledged as a lack of a certain important trait.


> I see this as a lack of confidence in their skills and abilities and achievements

Given that imposter syndrome is well documented in our industry, it's quite possible that I've done cool or impressive things but not realized they are cool or impressive, because I am in awe of the awesome developers I work with.

Combine that with most of my work being something like "I added new features in our Ember app, and fixed bugs in the UI and backend", and it often is easy to feel like the day-to-day work I do isn't awesome, even if what I am building is (IMO) pretty cool.


But if you think the stuff you work on is cool, why can't you express that feeling by telling someone else about it? Because you lack confidence in your work, which is the exact problem.

The interviewer should not change here; you should change, so that you can convey your work and why you think it's cool. That's exactly what the interviewer is looking for.


I'm 30% through writing a 10-20K word blog series about something that I've been working on for the last 6-8 years. I've probably written ~5K words already on blogs unrelated to my series where I'm kicking around the ideas that went into my project. Finally, the project probably isn't that interesting unless you've encountered a very specific type of problem before. Which means I probably should put in another 5-10K words to market it.

Sometimes it's not about confidence. Sometimes it's about complex social dynamics and how people react to suddenly being thrown into the deep end of a domain that's completely new to them.

"Tell me about a technical challenge that won't make me feel inadequate or be difficult to follow. Don't make it too simple though because then I'll think you're an idiot."

Ah yes, just be confident. That's the ticket.


No, they're honest.

It's amazing to watch you bend over backwards to try and vilify a person who simply strives for accuracy.


I've never thought that was a fair question, because it's actually pretty rare to be able to work on something you enjoy and end up proud of. The way I see it, one of the reasons I'm interviewing with you is that I hope that the best of my career is ahead of me, and that the project I'm most proud of is the one I'll be working on next.


I ask a variant of this question when I interview, and if you said this? I mean, people vary, but I think this is a totally valid and good answer. Because I have a lot of shitty gigs in my past, too--the signal that "I'm moving on because I want to work on things I'm proud of" is a pretty powerful one. (It's gotten me gigs before, too.)


My anecdata with hundreds of interviews over the years is that getting people to talk about their projects is the single best/fastest way to verify their involvement and knowledge of the projects they list on their resume. When someone can't elaborate on what they did and why, or what problems motivated their work and what they learned, it's a strong indicator that they are puffing up the projects/keywords on their resume (which is super common) but didn't actually learn much.

Moreover, getting people talking is a great way, in my experience, of identifying strong thinkers, strong coders, and strong experience. It helps you see someone's personality, it helps you literally get to know them. I can't think of any reasons why I would worry about that before hiring someone. I would worry about not doing it.

I would like to know what you would offer as a better alternative approach? Do you prefer the idea of coding questions to stories?


What percentage of bad hires results from your approach vs. other approaches you've tried? Even if you did know that, how is this not randomness in a small sample?

This ignores the problem that being able to elaborate isn't necessarily an indicator; it can just mean that someone can BS well. Relaying too many specific details can actually be an indicator that someone is not telling the truth.

"Getting people talking" is really the only way you can identify these things during an interview. And this is not the same as telling a good story. And now you have to demonstrate that this results in job performance.

On here, tptacek has exhaustively described an alternative approach: https://news.ycombinator.com/item?id=9159557


I've never hired someone who's been fired, and I've had to fire other people's hires, so I think my process works okay on some empirical level. I do not defend my approach as scientific or perfect; it's not, and I never claimed it was. But I have verified my approach by asking follow up questions with people who seem to signal they're inflating the importance of items on their resume, and found that my detector was working.

> Relaying too many specific details can actually be an indicator that someone is not telling the truth.

This is actually not true, and the opposite has been scientifically demonstrated. I listened to a podcast about this: lies come out with a measurably, detectably reduced vocabulary compared to true stories. Let me see if I can find a link to it...

But I also don't care whether it's true because your base assumption is that you shouldn't trust anything anyone says. My experience is the polar opposite: most people interviewing for jobs aren't primarily bullshitting, they are by and large telling the truth, and all I need to do is determine which candidates are better than other candidates, not which ones are lying.

Most of the resume inflation I've seen isn't a case of intentional BSing, it's a case of inexperienced people not knowing how little they know, and assuming that a month of JS or SQL during a summer internship puts them in roughly the same camp as someone who's done it for 5-10 years.

Talking that out with people has and continues to give me a pretty good idea of what they know and don't know.

BTW, I don't see anywhere that @tptacek is countering what I've said. If you read what he said, he mentions using conversation to filter people multiple times. Here, for example: https://news.ycombinator.com/item?id=9159959

No interview process is scientific and perfect, and you shouldn't expect them to be, that is unrealistic. There's nothing wrong with realizing it's a social activity and not an algorithm, that it's an art and not a science. Learning how to be good in interviews by understanding what interviewers are looking for is part of your job, not a way to gamify the system and trick people into hiring you. Another part is being a good coder. Both are important.


I do that too, and look at whether the person has a project he/she is excited about and can talk about the decisions that were made and other details with some clarity.


My favorite question is to ask them about their failures, and what they would do differently now.


Never talk about yourself; steer the interview toward a problem they are solving at work, and run through how you would solve it as if it were a meeting and you were already working with them. Rarely fails.


I'd say it's not either/or. Do both. Give a short summary of who you are, with related projects you worked on. Briefly explain how these experiences made you an authority on their problem. Then explain how you would handle that problem of theirs.


>In all likelihood, the interviewer doesn’t know what they are looking for with these questions, and they are just being used to fill time.

Wait a sec -- just because the author of the article doesn't know how to get value from those questions doesn't mean that those questions hold no value. It is true that they won't give you information to help you in a tech screen, or to gauge the value of where to initially place a candidate on a team. But if you are trying to decide between a few candidates of equal skill, and trying to figure out which one will work better in a team environment, which will fit more smoothly into the personal dynamics between team members, who will grow better as the company grows, who might be a better leader or follower, and what their trajectory might be as the company and team evolves, these questions can lead you down those paths.

Dismissing those questions as useless makes me think the author doesn't care about the people as individuals, but just as machines to be plugged in to produce code. And that doesn't sound like someone I would want to work for.


It's ironic, actually. The author touts telling stories of prior work as key, and denigrates attempts at measuring candidate conscientiousness. Meanwhile, the strongest known predictors of job performance are an actual work sample (r=.26), rather than the ability to describe a prior project, and measures of conscientiousness (r=.15).

On conscientiousness:

Barrick, Murray R., and Michael K. Mount. "The big five personality dimensions and job performance: a meta-analysis." Personnel Psychology 44.1 (1991): 1-26.

On work samples:

Roth, Philip L., Philip Bobko, and Lynn A. McFarland. "A meta-analysis of work sample test validity: Updating and integrating some classic literature." Personnel Psychology 58.4 (2005): 1009-1037.


It's the rote, cliche questions that show an interviewer doesn't really know what he's talking about. Questions about personality and culture are very important, but if someone asked "Where do you see yourself in 5 years?", it would point toward their not being a skilled interviewer. It's better to phrase the same question in any other way that's not associated with corporate ineptitude, like "What are your long-term career goals?"


I was updating my CV recently. I don't have the longest employment history (switched careers, then stayed at a job for 5 years), so I padded it out with a few short paragraphs of projects I'd worked on.

It actually worked really well - it brought some projects I'd forgotten about back into my mind making it easier to talk about them, and gave the interviewers specifics to latch on to.


More importantly, it shows accomplishments, not job duties. And that's what everyone wants to see - how your wins there will translate into wins here.


I did this in my resume for my first job out of college, and I still do: "Open Source Software Project Experience". List of projects, with details on impact and relevance.


I find I'm more willing to talk about myself if engaged in meaningful dialogue.

For example, I get this a lot as an opening question, mainly from crooters (actual hiring managers almost never do this): "What are your skills?"

You mean like numchuck skills, bow-hunting skills? If you don't know what kind of skills I have that could possibly be germane to the positions I'm looking for, you obviously didn't even read my résumé, which means you don't have a clue, which means I am hanging up on you because obviously you can't help me.

If you say "Can you tell me about your role at company X, what sort of challenges you had, etc." I'm more willing to open up.


Being prepared to talk is important. Knowing when to shut up is equally important.

Recently interviewed a candidate who seemed promising, until they started to rant. I didn't want to interrupt them because I was hoping there was a point to be made at the end of the rant ... but in the end it was just 5+ minutes worth of "my current job isn't fair and everything sucks and everyone who is better than me really sucks too".

Didn't hire.


All that matters is having a good conversation and not appearing completely incompetent.

After an enjoyable conversation, the hiring person will rationalize wanting you all by themselves, even if they have to make up / project qualities you've never demonstrated.

95% of the time, there's nothing rational about hiring.


For younger candidates, how about questions like: "Do you have a passion for this industry? Why?" and "What have you done in the past that demonstrates your commitment to completing anything you set your mind to?" For older candidates, how about questions like: "What do you consider your driving factor?" (with possible answers like "good benefits" or "complex challenges" or "teamwork") and "What sort of challenges do you see in this field/industry and how would you go about solving them?" (with acceptable answers from technical development solutions to "let's outsource" or something creative - something that demonstrates their wisdom, even if the field isn't their expertise).


What is the hardest technical problem you have run into?

Almost always asked by companies that don't have problems to offer even remotely comparable to the "war stories" they're expecting you to rattle off -- at least not for the position you're applying for, anyway.


Also, as well as the words, I would recommend practising the key diagrams that represent the project(s), so you leave enough space for the important elements and don't fall off the edge of the page/board.


I think behavioral and "tell me about a time when" questions can easily be gamed, so I question their effectiveness. Sometimes I forget specific domain knowledge around a project I worked on 1+ companies ago. But I still have to list it on my resume and thus be able to speak to it... so I just approximate the details. I imagine you could just full-on lie as long as you can detail a technical problem and provide a solution to it.


> What is the hardest technical problem you have run into?

well technically, the hardest problems I have encountered are not technical


Is this actually useful to people? Please don't give me non-answers like "well it got upvoted.."


When I read articles about job interviews, I always wonder what is the correlation coefficient between "being good at interviews" and "being good at doing your actual job".
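For anyone curious what that would actually involve: if you could score the same set of hires on interview performance and later on-the-job performance, Pearson's r would quantify the relationship. A minimal sketch, with entirely invented scores (the two score lists and the 0-to-100 scale are hypothetical, just to show the calculation):

```python
def pearson_r(xs, ys):
    """Pearson correlation: covariance divided by the product of std deviations."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Invented data: interview scores vs. later job-performance ratings (0-100)
interview_scores = [62, 75, 80, 68, 90, 55, 72]
job_performance  = [70, 65, 85, 60, 88, 58, 75]

print(round(pearson_r(interview_scores, job_performance), 2))
```

In practice the hard part isn't the arithmetic, it's the data: you only ever observe job performance for the people you hired, so the sample is censored in exactly the way that makes the coefficient hard to estimate.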


Interesting. I naively assumed that the most important part of preparing for a developer job interview was to prepare for the coding exercises -- the stereotypical algorithm stuff.


This is incredible, thanks for sharing!



