
Asana Engineering Interview Guide - pspeter3
https://blog.asana.com/2016/03/asana-engineering-interview-guide/
======
lostcolony
Do these companies not recognize the moment of cognitive dissonance that
surely -must- come when, faced with "how do we determine if someone will be
productive coming here", they decide "I know; let's ask a bunch of questions
we don't expect them to know! And then, let's provide a study guide so that
they -can- cram before the test and come in and show they know it!"

I mean, it's bad enough to ask questions you know don't really relate to the
sorts of problems you have to solve, but to implicitly admit it, -and- to also
acknowledge that it isn't something you expect people to have off the top of
their head by providing a study guide? What are you trying to measure?

~~~
bostonpete
> let's provide a study guide so that they -can- cram before the test

Did you look at the guide? I don't see anything you would be motivated to cram
after reading through that other than to brush up on the "usual suspects" of
data structures, but you should be doing that before any technical interview
anyways IMO.

Seems like the guide is mostly intended to give candidates an idea of what to
expect during their interview.

~~~
lostcolony
Yes. Under "How To Prepare" -

Read through technical interview prep resources. HackerRank and Interviewing.io
are good places to start. We’re also fans of InterviewCake’s Coding Interview
Tips.

I.e., "here, go study and cram for your quiz". If you expect people to know
the data structures and algorithms off the top of their heads (because you
believe that to be useful), then why tell them to prep?

That's what boggles my mind. Technical interviews relying on data structure
and algorithmic questions started with a few companies (primarily Google)
thinking it was a good way to gauge how strong a candidate was. For their
purposes, it might have been. But as other companies started to cargo cult it
(despite their businesses requiring far less algorithmic work), a cottage
industry of interview prep sprung up (Cracking the Coding Interview, various
sites intended to help you practice, etc). And since that then raised the bar
(you were now competing against people who were -explicitly preparing- for
these kinds of questions, and frequently would have seen the exact question
they're being asked, or one akin to it), companies, such as Asana, decided
"Okay, to be fair to everyone, we need to warn them that we're asking these
kinds of questions and so they should prepare well in advance for them".

Or...you could take a good hard look to decide if that's the skillset that is
really important in your hires. If it is, you can probably give them
legitimate business questions (which to be fair, Asana might do), as it will
cover some of the same basics, but then giving the candidate access to Google,
and access to querying and digging and figuring out how best to solve it might
be a better gauge of their ability to solve those problems in the real world
(unless for some reason they're going to be developing without access to
technical resources and the internet). Or, you might find that your business
actually requires people able to solve different problems, where implementing
Fisher-Yates is not the key technical deliverable, and so you can instead ask
those kinds of questions, and see who can solve them based on what they know,
and what they think to ask you during the interview.
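
(For reference, the Fisher-Yates shuffle mentioned above is only a few lines;
here's a throwaway Python sketch of it, not anything taken from Asana's actual
questions:)

    import random

    def fisher_yates_shuffle(items):
        # In-place Fisher-Yates (Knuth) shuffle: every permutation is equally
        # likely, and it runs in O(n).
        for i in range(len(items) - 1, 0, -1):
            j = random.randint(0, i)  # pick an index from the not-yet-fixed prefix
            items[i], items[j] = items[j], items[i]
        return items

    print(fisher_yates_shuffle(list(range(10))))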

~~~
bostonpete
Wow, you really read way more into that than I would have. I barely even
looked at the first bullet in the "How to Prepare" section. I suspect it was
just there for completeness and I think it's basically a given that people
should brush up on technical interview-type questions before going into a
technical interview.

For me, the real meat of that section was in the 2nd and 3rd bullets!

The 2nd bullet is telling people to reflect on their experiences and come
prepared to discuss. This may be a no-brainer for some, but for those who
don't have a lot of interview experience, I suspect it's easy to get caught up
with the "what should I know" mentality and forget to spend a little time
reconstructing the interesting projects and situations that they've navigated
in their careers. For most of us, only the most recent stuff tends to be fresh
in our minds.

The 3rd bullet emphasizes that you should come prepared to evaluate _them_.
Again, perhaps a no-brainer for most experienced candidates, but it's nice to
see the interviewers acknowledging that this is a two-way evaluation and
encouraging candidates to be sure they're making the right choice.

While it's unusual for candidates to show up unprepared for technical
questions, I've seen _many_ candidates show up completely unprepared to
discuss their experiences or ask questions that will help them evaluate the
position/team/company/manager.

~~~
lostcolony
Yeah, I have much less of a problem with those, as they're not "cram for this
test" and instead "reflect upon what you already know/wish to know, so that
you're prepared to voice it". That's useful advice if you're not used to
interviewing, and worth pointing out.

------
dvt
I came looking for some new or interesting way of doing tech interviews, but
it's the same typical SV bullshit. Lol: "Hashes, sets, heaps, binary trees,
linked lists, all the usual suspects." Yeah, I spend most of my time fiddling
with binary trees and hashes at the office, said no one ever. I'm a strong
believer that there are two primary ways of testing someone's coding ability:

1) Open-source, or otherwise public, contributions.

2) Real-world take-home assignments.

I've opted out of interviewing people because it's such a miserable
experience. I guess I'm just not a sadist that enjoys watching some poor
22-year-old squirm in his chair because he can't remember what the upper bound
on the insert operation of a red-black tree is.

What a joke.

~~~
tptacek
I agree with this sentiment wholeheartedly but bounds on balanced binary trees
are worth knowing (like lookup, it's log n), since that fact is one of just
two reasons you'd ordinarily ever use a tree.

However, I will offer you some Interview Candidate Self Defense, which I'll
repeat from something I yelled on Twitter a few weeks ago:

If interviewers ask you about bounds for data structures, pick a fight with
them over hash table complexity. If they believe it's O(1), argue it's worst
case O(n). If they want you to say worst case O(n), argue that it's in the
real world always O(1). It doesn't matter which is right, because these kinds
of questions aren't about technology, they're about status signaling. Be the
alpha nerd!

As a bonus: interviewers are evaluating you for X-factors like "confidence".
You know who never got dinged in an interview for lacking "confidence"? The
person who yelled at the interviewer about hash table complexity.

~~~
benwaffle
>If they believe it's O(1), argue it's worst case O(n)

In the case that all your keys have the same hash, right?

~~~
Rapzid
It seems to me this would entirely depend on the implementation and how
collisions are handled.

~~~
sorokod
In Java 8, collisions in Maps are stored in a tree structure so worst case is
O(log n)

~~~
logicchains
Is it a balanced tree? If not, worst case could still be O(n).

~~~
Rapzid
Haha, that would be an odd set of data indeed. I have no idea how they
implement it but I hope I have made a point (I feel I have). I wouldn't be too
impressed with an interviewer or interviewee arguing one side or the other
about the complexity of "unspecified hashmap implementation".

Too clever by half.

------
swanson
> “We want every candidate who comes into an interview with us to feel
> prepared and confident.”

Was expecting to then see a list of specific resources that a candidate could
use to become prepared and confident. Instead, the guide provides a high-level
overview of "algorithms", "data structures", and "modeling" and refers me to
"HackerRank and Interviewing.io" for tips on how to prepare.

This seems like it could be improved. You value learning -- so give candidates
something to learn and see if they can do it.

> Our goal is not to simulate day-to-day software development — where we read
> docs and write lots of tests!

I feel very confident in doing day-to-day software development. I do not feel
as confident speaking about abstract hypotheticals in front of a whiteboard.

~~~
jacklionheart
Hey -- I work at Asana.

Great idea! Took a task to add more resources to learn from to the doc.

> I feel very confident in doing day-to-day software development. I do not
> feel as confident speaking about abstract hypotheticals in front of a
> whiteboard.

Hmm, the way we framed that is confusing. Thanks for the feedback! To be sure,
our goal is to accurately assess what your day-to-day contributions would be
like, and simulation is a really good way of eliminating accidental bias in
that process. To that end, we do have every engineer submit actual code during
the interview process, and we generally avoid whiteboard programming in favor
of higher-level discussions (that may also use the whiteboard). But perfect
simulation isn't possible (and would be time-consuming), so we've tried to
focus on the higher-order bits.

~~~
insulanian
What's your primary tech stack?

~~~
ktothemc
We're in the middle of rewriting our stack to look like this:

\- TypeScript web code

\- Using React

\- Connecting to a custom, to-be-opensourced reactive datastore written in Scala called LunaDb: [https://blog.asana.com/2015/05/the-evolution-of-asanas-luna-framework/](https://blog.asana.com/2015/05/the-evolution-of-asanas-luna-framework/)

\- Using Amazon RDS (hosted MySQL) and Redis as our primary backing stores

\- Running on top of Kubernetes, Docker, AWS

------
Jormundir
I recently interviewed with Asana, and unfortunately would review their
process as the opposite of what they're going for.

The first sentence is not far off from what their process is. My interviews
with them were composed of one helpful database problem, after which they
explained how they designed their production database, followed by a bunch of
frivolous algorithms problems.

I did not receive an offer from them, so take what I say with a grain of salt.
Asana seemed like a company that has some great engineers, and runs well in
general. I would, however, like to provide some constructive criticism for
their interview process.

1\. Their engineers could use some training on how to interview. Several of them
came into the room noticeably anxious. Hearing what the problems were was
easy, but communication was very difficult when it came to figuring out what
kind of answer they were actually looking for.

2\. The questions they use are very academic, which to me means they are
mental exercises lacking practical grounding. As an interviewee, it's very
uninspiring to have to solve academic problems without getting a peek into the
business' real problems.

3\. The whole process wasted way too much time. Obviously I wouldn't have
cared as much if they had given an offer, but I think it's important to be
conscious of this nonetheless. My suggestion would be to pick topics for
interviewers to cover, and cut out the repetition of the problems. Plan a
process that will lead to a decision with one shorter visit, and be respectful
of people's time.

I hope that helps. I received more offers I'm excited about, and I hope to see
the company succeed from the outside.

------
szx
The funny thing is that Asana is possibly the worst performing web app out of
all the ones I use on a regular basis. I have a strong suspicion there's at
least one or two O(n^2) algorithms hiding there that could use the same kind
of attention and reasoning abilities they expect from their candidates.

~~~
JonoBB
Yes! Honestly, they need to hire someone to fix the start up time of their
app, because it's the slowest by far of any webapp that I use. That should be
their first hiring priority.

~~~
pspeter3
It is one of our biggest hiring priorities and we are actively working on
this.

~~~
srcmap
Why not ask questions about that, or ask for a solution to it, in the hiring
and interview process? You might just find folks who love and know how to
solve that kind of problem to work for you.

BTW, that is a very trivial problem to solve for those who have done it.

------
trjordan
> We’ll ask you to solve some coding questions in a language and text editor
> of your choice. Feel free to bring your own laptop, or we’ll be happy to
> provide one. Notably, we ask candidates not to compile or run their code
> during this exercise, and not to refer to online resources.

Every time interviewing comes up, I have this funny feeling about letting
people look things up vs. whiteboarding them out. Reading this finally broke
something loose:

Coding in interviews should be as hard as the hardest coding you'll do on the
job. Heck, anything in the interview should be about how you do things when
the job is hard.

Now, this doesn't mean under stress. That may come naturally. But the
difference between good and great engineers is decided when the task is hard
and information is sparse. Asana is explicitly selecting for people who are
going to reason about things from first principles and try to get it right the
first time. I admire that they're owning up to that.

They'll pass over people who are exceptionally good at searching /
synthesizing information. They'll ding people who focus on tightening the
iteration time as a way of learning faster. But that's OK. They've probably
built a culture where that kind of person thrives, and they're trying to find
more engineers who will thrive at Asana.

~~~
vonmoltke
> That may come naturally. But the difference between good and great engineers
> is decided when the task is hard and information is sparse. Asana is
> explicitly selecting for people who are going to reason about things from
> first principles and try to get it right the first time. I admire that
> they're owning up to that.

> They'll pass over people who are exceptionally good at searching /
> synthesizing information.

I also admire that they are upfront about this, but the attitude in general
rubs my classical engineering self the wrong way.

"[R]eason[ing] about things from first principles" is not the same thing as
having those first principles memorized. That was one thing my undergrad
taught me repeatedly. Engineering isn't about memorizing formulas, it is about
learning what those formulas mean, how to use those formulas, and when to
apply them to a task at hand. Software engineering should be no different.
Expecting people to solve hard and novel problems with no resources whatsoever
is ludicrous.

~~~
thedufer
Much of the point of "first principles" is that there aren't very many of
them. If you have any trouble at all remembering them, you can be fairly
certain that you have not trimmed enough.

------
aiokos
Ah, got to love the subscription pop-up with a close button that doesn't work.
A timeless web design principle.

Aside from that, when will we get past the requirement for data structure
knowledge for positions that center around web design? Data structure
knowledge is available at my fingertips throughout the work day. Why must I
memorize implementation boilerplate for an interview?

------
sudo_bang_bang
I tried interviewing with Asana and they rejected me on the basis that I
didn't have a computer science degree. For the record, I've built apps that
have generated millions in revenue and I have many years of experience. If
they are trying to be meritocratic and community friendly with this, they
should change that aspect of their culture to reflect that.

~~~
meritt
Can you blame them? That CS degree provides essential knowledge when it comes
to solving some of the world's most complex problems: to-do lists.

------
usaphp
From the guide: "we ask candidates not to compile or run their code during
this exercise, _and not to refer to online resources_ "

Seriously? You don't allow referring to online resources? Do you also unplug
the internet connection for all of your existing programmers so they don't
refer to online resources?

The whole point of the interview is to see how you would perform at your job
in the first place; creating an artificial task for an interview where you
swap words in an array or solve some other stupid quiz, plus disconnecting you
from online resources, is just plain stupid in my opinion.

~~~
SkidanovAlex
The very next sentence there is "Our goal is not to simulate day-to-day
software development", so no, they probably do not unplug the internet
connection for the existing programmers.

~~~
usaphp
Why don't they ask them to play soccer in the interview, or see how fast a
person can run, if they don't care whether the candidate will perform well in
their day-to-day software development?

------
p4wnc6
I did an extensive set of interviews with Asana and did not enjoy the
experience. After an initial screen, I was required to complete a fairly
extensive take home data science problem that involved querying an AWS-hosted
database, fitting a predictive model on that test data, then writing a report
summarizing which factors in my model were most important and what else I
might do.

Considering that (a) some of the data in the database was erroneous, but when
I alerted them about it they took a long time to respond and I had to just
hack a workaround, and (b) it was a huge time sink in an otherwise busy week
for me, I was very proud of my solution.

It was a simple random forest model, with no exploration of optimizing the
parameters, and some simple entropy metrics for feature importance. I wrote
nicely encapsulated SQLAlchemy for the database side, then modular sklearn and
pandas code for the model fits and plots, including calibration curves in
addition to accuracy. I put it all together into an 8-page TeX write-up with
plots.
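
Not the actual submission, but a minimal sketch of that kind of workflow
(synthetic data standing in for the challenge database, which I obviously
can't reproduce here):

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.calibration import calibration_curve

    # Synthetic stand-in for the challenge data: a few usage features and a
    # heavily imbalanced boolean "engaged" label.
    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "sessions": rng.poisson(3, 2000),
        "tasks_created": rng.poisson(1, 2000),
        "teammates": rng.integers(0, 10, 2000),
    })
    df["engaged"] = (df["sessions"] + 2 * df["tasks_created"]
                     + rng.normal(0, 2, 2000)) > 8

    X = df.drop(columns=["engaged"])
    y = df["engaged"].astype(int)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, stratify=y, random_state=0)

    # Simple random forest with an entropy criterion, no parameter tuning.
    model = RandomForestClassifier(n_estimators=100, criterion="entropy",
                                   random_state=0)
    model.fit(X_train, y_train)

    # Impurity-based feature importances, sorted.
    importances = pd.Series(model.feature_importances_,
                            index=X.columns).sort_values(ascending=False)
    print(importances)

    # Calibration curve in addition to plain accuracy.
    prob_pos = model.predict_proba(X_test)[:, 1]
    frac_true, mean_pred = calibration_curve(y_test, prob_pos, n_bins=10)
    print("accuracy:", model.score(X_test, y_test))

SQLAlchemy handled the querying side; the point here is just the shape of the
model fitting and evaluation, not the numbers.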

I got virtually no feedback, then after a long time I was asked to have a
technical phone interview reviewing my submission that was with two data
scientists simultaneously.

They barely asked me anything about my code design. They asked a few questions
about my choice of accuracy measures and entropy scores, but it was extremely
hard to understand what they wanted to hear (why do companies still think that
a multi-party phone call is a good use of time??).

It was a major instance of "guess the teacher's password" which was a huge
turnoff for me.

Then they asked very vague, high-level questions about designing a news-feed-
like interface, and how you might determine which articles are newsworthy on
an individual basis. Again, super vague. I threw out all kinds of ideas about
seasonality, correlation to major events, properties of your network, ... But
nothing seemed interesting to them at all. They clearly wanted someone to
recite some well-known stuff Facebook already used, but none of that was part
of the job ad or even related at all. It was bizarre.

After that, another mysterious block of time went by with no feedback, then I
got rejected with no explanation.

Later I learned that Asana has pivoted to focus on providing Agile junk for
enterprise project management, so I was relieved to have dodged that bullet,
but it was still a vague and time-wasting interview process, with so much
magic "guess the teacher's password" nonsense.

~~~
jacklionheart
Hi there -- I'm really sorry that you had such a negative experience. It
sounds like we didn't honor how much time you'd put into the challenge, and
that sucks.

I'd be happy to share more feedback with you if you're still interested,
though it seems like you've moved on happily. Feel free to reach out to
jack@asana.com and we can talk more.

Regardless, best of luck where you are now.

------
avoutthere
> Our goal is not to simulate day-to-day software development...

I don't understand this. Why do they not want to see how well a candidate
performs the tasks that they will be expected to perform on the job? Does this
not result in them selecting for the wrong set of skills/knowledge?

~~~
gkop
Plus they're using this goal to justify not letting you run your code during
the interview, which prohibits a "try it and see" problem-solving approach.
Asana engineering team, what's wrong with "try it and see"?

~~~
rmah
You're kidding, right?

~~~
gkop
Not kidding! "Try it and see" is a legitimate problem-solving approach.

------
artimaeis
Much more interesting is the guide that this blog post mentions at
[https://asana.com/eng/interview-guide](https://asana.com/eng/interview-guide).

~~~
twphonebillsoon
I don't really see what's interesting about this either way. Lots of places you
interview at tell you what the interview is going to be like and what you are
expected to be familiar with. This is just 'write code and we'll be testing
you on data structures and algorithms!'

There's some flowery language there about having a genuine desire to want to
hire you unlike those 'other evil companies!'... sounds like the same shit
pretty much every other company does to me.

~~~
riyadparvez
I had the same impression. I think one of the best approaches is to let the
best engineers at the company go through the same interview process. If they
can't pass the company's hiring bar, then maybe that company should re-think
its hiring process.

~~~
th0raway
My current company has a very similar interview style, and I later learned
that I was pretty close to not being hired because I didn't do quite as well
as one interviewer wanted on a coding challenge: I ran out of time. But what I
also learned is that the interviewer had never solved the problem in the
language I used!

I had one of our most senior engineers, who has worked on this language's
compiler, try to solve the exercise. Instead of 40 minutes, it took him
three hours!

If a candidate is going to interview in a language, for the love of god, have
as a prereq that the interviewer is actually capable of doing the exercise in
that language.

------
orangetabbycat
I had an onsite interview for a new grad engineering position at Asana
recently. It's true that they test you on technical problems you don't know
the answer to; sometimes it seems like they tested me on technical problems
that had no answer. One of their interviewers, who was in charge of both
conducting a one-on-one with me and explaining the programming questions to
me, was very friendly but also had a hard time clarifying any of his questions
for me. I didn't really understand him and it was obvious from his facial
expressions that he believed that his explanations had sufficed from the very
beginning. The questions were either too vague and could be approached from
many different directions or were so specific that it was obvious the right
solution would have to be found in an epiphany-like "aha!" moment of
mathematical understanding. I was able to answer and program most of the
questions. I have to say that I was really happy to receive feedback following
my interview, which no other company did. My advice to Asana: Don't let any
one interviewer dominate the experience of the interviewee (especially if you
have suspicions that the interviewer may be overly didactic or has
difficulty explaining things). Of course, that's just my experience. I ended
up finding a good position somewhere else and think Asana's interviews were
some of the best ones I had the honor of failing.

~~~
riyadparvez
> Don't let any one interviewer dominate the experience of the interviewee
> (especially if you have suspicions that the interviewer may be overly
> didactic or has difficulty explaining things).

I think this is a common problem. Many companies let a single interviewer have
a huge influence on hiring decisions. What if the interviewer himself/herself
had some bias that s/he wasn't aware of?

------
Xyik
Sigh, more marketing. Sorry, but at the end of the day any company that does
technical interviews evaluates candidates the same way and asks the same types
of questions; having an interview guide that says things like "Instead, let’s
open a dialogue. Let’s solve problems together and see what we can learn in
the process. Let’s have fun!" is just trying to get more people to apply.

And I'm not slamming algorithms or technical interviews, but it's time
companies stopped trying to dress themselves up and just said it how it is.

~~~
jacklionheart
JFYI, that's not accurate. We've been sending this to candidates for months;
the genuine impetus was to try to put all candidates on a more level and
accurate playing field so that we can make better hires.

That's not to say it's not self-serving; of course we benefit from making
more/better hires. But it's not just marketing.

------
bpp
I had a really stellar interview experience at Asana (and, full disclosure, I
didn't get an offer).

What really stood out was the pages of notes that the recruiter sent me before
inviting me to apply again in a few months. I had just been through many on-
site interviews at many companies, and no one came close to providing the
level of feedback that Asana did. The document that I was sent can't have been
too far off from the unedited notes that the interviewers shared with each
other in their debrief. It really spoke to Asana's commitment to transparency
and helping develop people - even those they didn't want to hire. I'm happy
where I am now, but based on this experience, I would absolutely apply again
in the future.

------
tptacek
I appreciate the sentiment behind documents like this, but, Asana, you can do
better. When you do, you'll find it gets easier to hire people, and, perhaps
counterintuitively, the quality of your hires will go up.

Specifically:

It's clear from the tone that you're trying to mitigate the hostility and
unfriendliness of the conventional programmer interview. Great! But you need
to do more than superficially adjust your tone. What this document describes
is literally the circa-1999 software developer interview, in almost _every_
detail --- the fuzzily described coding question, the "algorithm and data
structures" interview, the design review.

Everyone has heard of interviews that pointlessly refuse candidates the
opportunity to consult Internet references, _like every working programmer
does every time they code anything ever_. But your process goes a step
further: you won't even allow candidates to _compile their code_. How can this
possibly be helpful to your process?

The most alienating and hostile aspect of the conventional job interview is
on-the-spot whiteboard programming. You shouldn't have programmers write code
in interviews at all, if you can avoid it; instead, take a week or two and
design programming questions that candidates can do at home, on their own
schedule, in their own surroundings.

But if you're going to make people code in an environment that is utterly
unlike the one professionals actually work in, the onus is on you to do
everything you possibly can to mitigate the performance penalty you're
imposing. Why not, instead of coming up with crazy rules like "no running your
own code allowed", point your creativity in the opposite direction and find
ways to get candidates to be more at ease with writing code with someone
looking over their shoulder?

Here are some things you can do _right now_ to make your interviewing guide
actually helpful to candidates:

* Provide sample questions. They do not need to be the ones you're asking in real interviews (although: if they are truly good questions, they could be!), but they should be close enough that a candidate would have no business being surprised by the real ones.

* Provide a detailed breakdown of your interviewing process. Do you phone screen? How many times? Who staffs the phone screens? What kinds of questions get asked on them? Who delivers the in-person interviews? How long do they last? These are some of the questions real candidates have about interview processes. I don't understand why more companies don't just answer them up front.

* Provide reference material for candidates. Don't point them to other people's interviewing guides! You know your jobs better than anyone else does... right? How about instead: what are the most popular books on the shelves at your team? (Bonus points: _buy those books for candidates_ ). What are some Github repositories you think represent really well-engineered software? What are some examples of really hard problems your team is still grappling with?

I hired for years and years with a battery of over-the-top technical questions
and a rolodex of personal contacts. Then my team and I had a crazy idea: we
opened our whole interview process up, standardized it, and took pains to make
it understandable to people with no experience in our field. What we learned
was that there is a huge amount of underutilized talent out there that can't
(because it's locked out due to jargon and poorly described requirements) or
won't (because it's denied social permission) apply for jobs unless you go out
of your way to bring them in. _Many of these people are better than you are_.
At least, they were for me. Stop bloodying your forehead against the wall
competing with Facebook and Google for people who --- if they can navigate the
process you describe --- can get an offer from those companies any time they
want.

~~~
markbnj
>> The most alienating and hostile aspect of the conventional job interview is
on-the-spot whiteboard programming. You shouldn't have programmers write code
in interviews at all, if you can avoid it; instead, take a week or two and
design programming questions that candidates can do at home, on their own
schedule, in their own surroundings.

Phenomenally good advice. I wonder, and I don't mean this to be a loaded
question because I really don't know the answer: do other practitioners in
other highly-skilled disciplines have to dance these sorts of dances to get
hired? Does a doctor have to diagnose on the spot? Does an architect have to
design a building on a whiteboard? Do they make a lawyer stand up in front of
a group of partners and argue some case off the top of his head? Perhaps they
do. I really don't know. But it seems to me that basically professional
experience counts for little other than to keep the resume out of the trash
folder.

~~~
viraptor
Don't know about the architect, or a lawyer, but doctors do get real-life
questions... although probably not the ones with years of experience and more
letters in the title than in the name. A bigger difference though is that
lawyers and doctors have to pass an exam to practice their profession, which
already excludes a lot of people. Anyone can be a programmer.

~~~
markbnj
>> Anyone can be a programmer.

Anyone can be a doctor or lawyer too. I know what you mean, I think, but I
don't know if I buy the distinction. There is a formal pass-the-bar testing
event for lawyers. For nurses and I believe doctors there are boards. Same
basic purpose AFAIK. They take these tests when they are just entering the
profession. How meaningful are they years later, compared with the experience
gained in the interim? My wife is a registered nurse who has been working in
cardiac critical care for seven years. If she interviews for a new job with that
employment record behind her they aren't going to ask her what an aorta is,
which is sort of the equivalent of what Asana is talking about doing here (and
what everyone does, to be fair).

So you seem to be saying that a professional nurse, as an example, can be
trusted to know the basics after seven years of work, because he or she was
required to take a hard test on graduation. But a programmer with seven years
of experience must prove that he or she knows what a binary tree does? It's
also worth pointing out that lawyers can get people jailed or cause them to
lose their property. Doctors and nurses can kill them. So can we,
occasionally, but it's rare.

The most effective interviewing technique, as far as I am concerned, is to send
the candidate out for lunch with a few of your engineers and tell them to find
out whether he knows what he's doing. You spend an hour talking software with
some people and it's awful hard to fake knowing your stuff.

~~~
plinkplonk
"But a programmer with seven years of experience must prove that he or she
knows what a binary tree does?"

you'd be surprised at how many programmers with seven years' experience have no
freakin' clue what a binary tree does!

There is a reason fizzbuzz is a phenomenal filter.
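
For anyone who hasn't run into it, fizzbuzz really is this small (a minimal
Python version):

    def fizzbuzz(n):
        # Print 1..n, replacing multiples of 3 with "Fizz", multiples of 5 with
        # "Buzz", and multiples of both with "FizzBuzz".
        for i in range(1, n + 1):
            if i % 15 == 0:
                print("FizzBuzz")
            elif i % 3 == 0:
                print("Fizz")
            elif i % 5 == 0:
                print("Buzz")
            else:
                print(i)

    fizzbuzz(15)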

~~~
kevinmchugh
fizzbuzz filters for candidates who a) can do a trivial programming task in an
unrealistic high-pressure, high-stakes environment, or b) signal membership in
your group via memorization of hazing rituals.

fizzbuzz does not filter for programmers capable of doing original, thoughtful
work in a realistic work environment.

------
golergka
Why does everyone love algorithms and data structures of all the CS topics so
much? I would like to be asked about covariance and contravariance, LALR
parsing or some basic statistics for once.

But more importantly, I would love to be asked about CS topics that arise
every day in development: code organization. How many times do you get a new
requirement that makes you throw all previous assumptions (that an
architecture was naively built upon) out of the window? When it turns out that
modules A and B, which were completely different and didn't even know about
each other, now need each other's data? Yes, this situation arises when the
original architecture wasn't thought through enough — and we've all been guilty
of this at one point or another.

Now, what a developer does in that situation is much more important than
whether he remembers the complexity of each kind of hash table (with all the
chaining and hashing variants). Does he just hack it so it works, creating a
lot of fun for the future maintainer? Does he decide to spend two days refactoring
the whole system into a new architecture? Or does he find a balance somewhere
in between?

------
the_cat_kittles
i dont understand why you would even listen to them, their website is such a
piece of shit- terrible performance, terrible ui. maybe their infrastructure
is good, but i can tell you they absolutely suck at frontend js.

------
pklausler
> We design our interview questions to see how engineers work through
> technical problems they don’t know the answer to yet.

That's the way to do it, I think. In my own technical interviews, I pose
(relatively) simple questions that require some problem analysis, a strategy
discussion, and then some coding and testing, with a goal of testing aptitude
for basic programming abilities like modularization, indirection, recursion,
defensive programming, unit testing, etc. I keep things interactive and
collaborative and see how much the candidate can do for themselves.

~~~
p4wnc6
The problem is that Asana seems to have preconceived ideas of what answers are
supposed to look like, but the questions are vague and open-ended enough that
even very good candidates can give good answers that simply diverge from the
manner of thinking that the Asana question designer had in mind.

If the question has a well-defined answer, then this is good. If you are
asking questions that the candidate "doesn't know the answer to" merely as a
parochial side effect of the question's vagueness and the vast space of
possible good approaches, then you're really only screening for people who
happen to think about things the same way you already do, and thus are good at
guessing what you want to hear.

I'd argue that this is very bad in terms of diversity of thought.

~~~
jacklionheart
Hi there! That sounds bad. Could you let me know what's giving you the
impression that that's Asana's goal?

We really try to explicitly avoid that kind of interview evaluation, and put
strong emphasis on Communication, User empathy, and Learning, per the guide.
But definitely "did they say the right answer?" is an easy trap to fall into
so it's entirely possible that some interviewers are doing it -- would love any
concrete feedback you could give to help us avoid it in the future.

~~~
p4wnc6
The more concrete feedback is in the previous post (you commented on that one
too). But I will elaborate here:

On 8/4/15, I received a message from an Asana representative via Stack
Overflow Careers. After a few back-and-forths discussing specifics about
whether I was a good fit, I then agreed to do the take-home test.

I received the take-home test on 8/7/15. I emailed back on 8/8/15 with a
verifiable error in a timestamp column of an AWS-hosted Postgres database that
was configured with some test data (data resembling some certain kinds of user
data for modeling what an "engaged" user is, and what factors predict when a
user will be "engaged"). I included code which could be run to demonstrate the
erroneous data.

On 8/9/15 I wrote in again stating that I had created a workaround for the
erroneous timestamp data, since the only way I could complete the exam was to
do it on those particular days (I couldn't wait around for the exam data to be
fixed).

I submitted my solution on 8/10/15. I got an email on 8/11/15 with a note
from someone in engineering apologizing for the timestamp data error, but also
saying it shouldn't affect my solution. The recruiter who sent that email also
said that my submission was currently being evaluated.

Then I waited a week before hearing anything. On 8/18/15 I got a response
apologizing for the delay and saying that I was invited for the next round
interview.

The email said I would be speaking with a person named Jack (maybe you?) a few
days later (8/21/15 at 4:00 pm EST). But that call actually included two data
scientists over one conference call line.

Both interviewers effectively made no comment on my software choices. In fact,
when I spoke about how I cared about software design and best practices, even
when working on a rapid prototype, it actually sounded as if the interviewers
thought this was a bad thing and there was some awkward silence. I brought up
examples of how I had worked in a very fast paced environment before (quant
finance) and my team had learned some hard lessons that the naive approach of
letting data scientists just write scripts for ad hoc modeling, assuming that
engineering will "productionize" it later, tends to lead to some bad failure
cases, and that it's far more efficient over time to simply require data
scientists to construct rapid prototypes, even from the first line of
exploratory, ad-hoc code, that adhere to many best practices and are only a
short distance from "production" even in their first version.

Then they asked me some vague and hard-to-decipher-over-a-conference-call
questions about how to measure the accuracy for the random forest model I had
fitted, and about my conclusions regarding which predictors were most
efficacious. Despite having used random forests for published research results
and even having taught part of a course on random forests when I was a grad
student, I could not make heads nor tails of what exact accuracy metric the
interviewer was fishing for. I am sure that if such a question was asked in-
person, with some pen and paper, or something where it's not just the vague
descriptions of some stranger I've never met before, then we could have sorted
it out. But the communication was just too poor over the conference call for
this kind of question.

Given that I had to solve the problem in my spare time while conducting all
the other stuff in my life, I felt it was quite reasonable for me to take a
very conservative approach: fit a very simple model, don't waste time chasing
down parameter optimizations (like depth of the trees or some huge set of
features). Just walk through some very straightforward analysis, use some
straightforward features, produce some common diagnostic curves (which showed
a huge bias in the data labels -- almost all example users were not "engaged"),
and draw only conservative conclusions (e.g. with such severe bias in the
toy data set, it's pretty hard to conclusively say anything, so don't pretend
you can say more than you really can, don't recommend strong conclusions, and
leave it at that). The point is to show basically a mock-up of the end-to-end
workflow, from querying the data, to constructing the features, to fitting the
model, to assessing the fit. The point is clearly not to do that in some kind
of insanely accurate way when you're talking about a busy adult doing it in
their spare time over a few days.

The two interviewers had some gripes with the specifics of my accuracy
measures (which seemed really misplaced, again for an analysis done in a busy
adult's spare time using erroneous toy data).

~~~
p4wnc6
After that, the interviewers asked the questions about how to develop a news
feed ranking algorithm at a super vague, high level. I spoke about how you
would need to account for seasonality or cyclic effects -- e.g. election news
would clearly be ranked higher in newsworthiness when approaching important
elections, or the same idea for sports-related articles, etc. I talked about
trying to determine which news articles are the most clicked-on by peers in
your network, possibly using some centrality measures to rate certain users
more as "trend setters" and use their news viewing habits as guides for what
to show to others.
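
Purely to illustrate the kind of idea I was describing (this is my own made-up
toy, not anything Asana asked for): combine a seasonality boost with a weight
for clicks from central, "trend-setting" users.

    import math
    import datetime

    def seasonality_boost(topic, today, election_day=datetime.date(2016, 11, 8)):
        # Toy example: election coverage gets boosted as election day approaches.
        if topic != "election":
            return 1.0
        days_out = max((election_day - today).days, 1)
        return 1.0 + 1.0 / math.sqrt(days_out)

    def network_signal(article_id, clicks, centrality):
        # Weight each click by how central ("trend-setting") the clicking user is.
        return sum(centrality.get(user, 0.0)
                   for user, clicked_id in clicks if clicked_id == article_id)

    def newsworthiness(article, today, clicks, centrality):
        return (seasonality_boost(article["topic"], today)
                * (1.0 + network_signal(article["id"], clicks, centrality)))

    # Made-up example data.
    clicks = [("alice", 1), ("bob", 1), ("carol", 2)]
    centrality = {"alice": 0.9, "bob": 0.2, "carol": 0.5}
    articles = [{"id": 1, "topic": "election"}, {"id": 2, "topic": "sports"}]
    today = datetime.date(2016, 10, 1)
    ranked = sorted(articles,
                    key=lambda a: newsworthiness(a, today, clicks, centrality),
                    reverse=True)
    print([a["id"] for a in ranked])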

I was actually fairly detailed in my description of some math ideas for these
things, but the whole conversation sounded like someone sucked the air out of
it. The interviewers kept trying to get me to talk more specifically about
aspects of the newsfeed that actually occur in Facebook's newsfeed. When I
told them I don't have a Facebook account and don't know what the modern
version of the newsfeed does (which is true, I don't), they sounded like they
just lost all interest in talking to me. My hunch was that they wanted me to
possibly talk about A/B testing, but this is kind of silly. They definitely
didn't ask me about A/B testing. It was like they wanted me to guess that
phrase. Plus, at least for frequentist A/B testing, there are good reasons to
believe a lot of it is junk science and a lot of applied mathematicians would
not consider this a reasonable idea at all. I guess I could have talked a
little about Bayesian A/B testing, but my mind associated their problem with
network modeling and seasonality effect modeling, not A/B testing. (I can't be
sure they were focused on A/B testing -- it's just what I independently
thought after the fact when struggling to understand the outcome since no one
gave any feedback).

Overall, I could tell it went badly, but I could not understand what the two
interviewers were trying to get me to say. As someone who has worked on a lot
of math modeling problems, from undergrad and on through grad school, I
thought my way of breaking down and describing ideas for the newsfeed example
was perfectly on-point and technically useful. To boot, the job I was
interviewing for did not (at least as it was explained to me) have anything to
do with ranking problems, A/B testing, or newsfeed-like work at all.

That was on 8/21/15. I then heard no response of any kind, until 8/28/15 when
I got a form letter rejection email that simply said at that point, Asana did
not want to continue the process, and gave no further details.

From start to finish it went from 8/4 to 8/28, so almost a month. I scrambled
to put in a lot of hours to solve the take-home test in my spare time over a
short window, and then went through two different full week-long periods of
silence (one while my code test was being reviewed, and one after the
vague/weird two-on-one phone interview), only to receive no feedback at all
on my software solution, limited and tangential feedback on my actual data
science approach (feedback which didn't seem at all appropriate for something
a busy adult would put together over a couple of days), and no feedback at all
about the decision to reject me based on the interviews.

The overall experience left me feeling like Asana asks very vague questions
and reserves the right to be dissatisfied if you don't identify the magic
answer to the vague question. It also made me feel that Asana does not act in
a timely manner to review code submissions or to provide interview feedback,
even though Asana did require me (despite being busy outside of interviews) to
complete a code test in a timely manner. It also sounded like Asana might have
some data science employees who act a bit like "gatekeepers" in the interview
process -- they felt a need to assert that they were better than the candidate,
even if it meant bringing up some discussion about specific accuracy criteria
or something when it may not really be very appropriate for a quick solution
to a poorly specified problem.

It's a bit like someone talking about Haskell programming for an interview,
successfully solving some Monad problems in a quick way given the time
constraints, and then getting grilled about unification algorithms in the
Hindley-Milner type system. It's just to let the existing employee who is
conducting the interview feel superior and act like it is some great filter on
the incoming talent pool, when really it's mostly unrelated minutiae.

As I mentioned in the other comment, I later learned the extent to which Asana
supports Agile-focused systems for enterprise project management, and this
actually saddened me a lot. I always liked Asana because it embodied a type of
minimalist efficiency that is fundamentally incompatible with the bureaucratic
wastefulness of Agile/Scrum/Kanban/"lean" buzzword systems. I had high hopes
that Asana would not be just another one of those cargo cult management junk
systems, and maybe it still will avoid that fate, but agreeing to support
Agile doesn't bode well.

So ultimately it was a good thing I was rejected, though I do not feel it was
a fair reflection of my engineering effort on the exam, and I certainly don't
feel that the evaluation was done in a way that showed respect for the amount
of time and effort that I took to complete the task.

~~~
jacklionheart
Yep, I'm that Jack. I remember you now. Thanks for the clear picture of what
it's like to be on the candidate side. Efficiency is clearly a goal, and we
clearly failed here given that it took us a week to get back to you after the
phone interview. I do think your interview was an outlier in that regard, but
that doesn't excuse it.

Re: the questions themselves, although I don't agree with your
characterization of what's going on or what we're looking for, I definitely
think there's a lot of room to improve on our data-science screening. I'll
send you an email with more re: what specifically came up in your interview
that made us reject.

Again, thanks for taking the time to share this really explicit feedback.

(For the record, I don't really know what you're referring to re: Asana and
Agile.)

~~~
p4wnc6
Hopefully this thread's not too stale yet to escape notice.

I just wanted to add that the user above, Jack, absolutely made good on what
he said.

He sent me some very well-written, extremely professional, and constructive
feedback about my interview process. I am able to see a lot of things I can
improve and it will help me for future interviews. I think it also highlights
some things about the process that Asana can improve, and by all evidence of
Jack's sincerity in the feedback, I believe Asana is very interested in ways
of improving.

Don't let anyone say that _all_ I do on Hacker News is grumble about open-plan
offices... in this case I am expressing my sincere gratitude and appreciation
for Jack's feedback.

If I had received this kind of response in a timely manner when Asana decided
not to continue the interview process with me, I would have walked away with a
significant positive feeling about interviewing with them.

------
sulam
I have interviewed (phone screened) at Asana. It was delightful. I really
liked the people I met there and enjoyed the problem I worked through over the
phone. It really did feel collaborative and like we were solving a problem
together just as software engineers do when they're not in this artificially
combative environment we often create when interviewing.

Regarding the problem, I didn't have the best approach out of the gate. In
fact, I had a vague memory of the right approach ("I seem to recall it uses
two data structures, and if I really stretch I think it was X and Y, but I
don't remember the details.") That made it harder to get to the final
solution. The interviewer was very good at talking things through with me
in a way that didn't seem like they were frustrated or otherwise grumpy that I
didn't know this particular problem cold. They apparently thought the results
were encouraging enough to move to an onsite, but I got an offer from my
current employer (where I am working with close personal friends) and had to
pass on the chance to find out what that would be like.

All in all I think Asana is probably a great place to work. I also like the
product, although it seems like you need a whole team to adopt it to make it
sticky.

------
marknutter
I find it funny that startups only implement these ridiculous hiring practices
_after_ they start to grow. Imagine if the founders had to quiz each other on
algorithms before they agreed to do a startup in the first place. The first
few hires at a startup are arguably the most important, yet they don't go
through the bullshit that later-stage startups put potential employees through.

~~~
jacklionheart
> I find it funny that startups only implement these ridiculous hiring
> practices after they start to grow

Hmm, I'm curious what you're assuming! At Asana, at least, our first hires
went through many of the same questions that candidates today go through,
but we were just a lot less systematic about our evaluation and less
scrupulous about accidentally testing for knowledge/context rather than
ability. The intention of this guide was to make the interview /less/
bullshit-y, by giving people enough grounding to not be thrown off by
unexpressed expectations.

------
vain
Their subscription pop-up at the bottom of the page with a close button that
does nothing only makes me sure I don't want to work for them.

------
krosaen
I like the idea behind this but the actual guide is pretty thin:

[https://asana.com/eng/interview-guide](https://asana.com/eng/interview-guide)

It would be great to see a more detailed guide covering the concepts they'd
like to see mastery of and the problem-solving skills they are looking for.

------
sajid
Disappointing. The links to Triplebyte and Stripe are far more informative.

~~~
jacklionheart
Thanks for the feedback! What did you learn from those that you didn't see in
the Asana one?

\- Jack from asana

------
whitenoice
How do these questions align with the work Asana is doing? Side note: the blog
site's performance degrades as you keep scrolling down.

------
MicroBerto
Maybe you could hire someone that will design you a mobile website?

I'm not installing your app on my phone...

------
loycombinate
There are many different types of engineers. This guide is for software
engineers.

------
hans
this place has so much cash, they may as well hire trial by fire, create an
onboarding experience for serious candidates, morph them to your liking or let
them go, stop guessing with goofball interview theater.

------
andrewstuart
\- smart

\- gets things done

\- able to actively engage in discussion about software development

\- works hard

\- wants to work here doing this job

\- loves to learn

\- not overly dogmatic

The rest, meh.

