
The time spent on practising white board test may not be worthy - nanxiao
http://nanxiao.me/en/the-time-spent-on-practising-white-board-test-may-not-be-worthy/
======
jcadam
Of course it isn't. An interview process that requires/rewards cramming for
months ahead of time is fundamentally broken.

I just gave up on getting hired at any tech company that uses whiteboard
interviewing. It seems like, just as in college where there's always that _one
kid_ who aces every test and wrecks the curve for everyone else, in any
candidate pool there is always one demigod of algorithms (and it isn't you,
you're merely great) who spends every waking hour of his/her life on
hackerrank and topcoder.

Whatever, I spend my time on side projects, learning new things, family, and
the occasional bit of fun and relaxation. And I still manage to stay employed
(so far).

~~~
protonimitate
I think this is the only way to fight the white-boarding standard interview.
Stop applying to these jobs.

I know the allure of a big paycheck is too much for most people, but why are
you ok with going through these ridiculous circus acts just for the
"privilege" of working for Big N?

~~~
oblio
> I know the allure of a big paycheck is too much for most people, but why are
> you ok with going through these ridiculous circus acts just for the
> "privilege" of working for Big N?

Cause having a lot of money changes your life and it's not worth giving that
up for a bit of pride.

Also working for "Big N" could give you access to very interesting projects,
either while you work there or after you leave, since brands do matter and
your CV will be more attractive to other companies.

~~~
protonimitate
That's fine, but you can't have it both ways, which is what a lot of the
complaints around whiteboarding amount to.

Either you accept that it's the industry standard and the gateway to a big
paycheck and a prestigious CV, or you don't participate.

~~~
oblio
What makes you think it's the same people accepting the status quo and
whining? :)

------
Mountain_Skies
Seems like the more employers complain about the lack of available labor, the
more they ratchet up the pointless interview hazing. It's hard to take their
claims of 'no one is available' seriously when they're so capricious in
tossing away potential employees.

~~~
cdoxsey
One thing I've noticed: it's often not the employer at fault but the
employees. Engineers reject candidates for extremely pedantic reasons (for
example, he used a dict instead of a defaultdict, which some will take as a
signal that the developer doesn't know Python). I suspect that if you were to
interview current employees, 50% of them would fail the interview.
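
For what it's worth, the distinction being nitpicked here is tiny; a minimal
Python sketch (illustration only) shows the two approaches produce the same
result:

```python
from collections import defaultdict

words = ["apple", "avocado", "banana", "cherry", "citrus"]

# Plain dict: group words by first letter using setdefault.
groups_dict = {}
for w in words:
    groups_dict.setdefault(w[0], []).append(w)

# defaultdict: same result, slightly less ceremony.
groups_default = defaultdict(list)
for w in words:
    groups_default[w[0]].append(w)

assert groups_dict == dict(groups_default)
```

Rejecting a candidate over which of these they reached for says nothing about
whether they can program.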

~~~
oceanghost
The worse the programmer, the more biting his questions. He'll google some
trivia before the meeting to look smart, then reject the candidate to feel
good about himself.

------
gameswithgo
>You just need to call std::sort, and don’t care it is “bubble sort” or “quick
sort” under the hood

This is often the case, but not always. When you know things about the input
data you may be able to get better performance than the generic sorts built
into a language. Or, when you know more about sorting algorithms you may be
better able to choose among the available sorts/sort options provided by a
language or library.
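
As a sketch of what "knowing things about the input data" can buy you
(Python here, purely illustrative; assume the input is known to consist of
small non-negative integers), a counting sort can beat a generic comparison
sort:

```python
def counting_sort(xs, max_value):
    """Sort non-negative integers <= max_value in O(n + max_value) time.

    A generic comparison sort (like sorted()) is O(n log n); knowing the
    input's range lets us avoid comparisons entirely.
    """
    counts = [0] * (max_value + 1)
    for x in xs:
        counts[x] += 1
    out = []
    for value, count in enumerate(counts):
        out.extend([value] * count)
    return out

data = [3, 1, 4, 1, 5, 9, 2, 6]
assert counting_sort(data, 9) == sorted(data)
```

You could never pick this trade-off without knowing what the generic sort
costs and what your data looks like.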

I don't understand the attitude of people willfully choosing not to understand
their craft better. Yes we can all be mediocre by just using libraries that
other people made, but

1\. Someone has to make the libraries.

2\. The average quality of software is not very good: it's very often slow
(which is astounding given how fast the hardware is) and buggy, often with
disastrous consequences, so we should all be striving to do better than
average.

~~~
xfer
In the case of sorting algorithms, it is important to know the trade-offs.
However, precisely knowing the details, e.g. the details of deleting a node
from a balanced tree, is not really useful, and that is exactly what is needed
to pass whiteboard tests.

In most cases (in my experience), I can always look up the details of some
algorithm when I need them. Admittedly, my job doesn't require me to design
algorithms, though it sometimes requires implementing some not so readily
available ones.

~~~
gameswithgo
For sure, I'm not arguing that no whiteboard interview questions are crazy; I
realize there are some.

But I'm also not sure how common it is to have interview questions such as
"delete a node from a balanced tree" where they actually expect you to get it
exactly right, versus just seeing whether you get the basic idea. I've never
run into it. Perhaps many people on Hacker News work in Silicon Valley and
it's more common there.

------
hmschreck
Is this really news to any of us? I figured most people here understood that
algorithm/whiteboard test prep was pretty much standard, but incredibly
pointless.

~~~
gameswithgo
It might be pointless if you already understand the concepts 100% and are just
brushing up so you can apply them quickly and more completely.

But a lot of people will actually be reminding themselves of important
fundamentals of computer science, which might be quite a good thing.

~~~
mixologic
A whiteboard test is probably the worst way to determine somebody's grasp of
computer science fundamentals, especially when what really matters is whether
they can _apply_ those fundamentals in the proper context.

------
maxxxxx
You practice whiteboard tests to get a job and not to be a better programmer.
If it gets you a better paid job it's a win but then you can stop.

~~~
yanslookup
I recently joined Blind and mostly follow the compensation discussions. 2
Takeaways:

1) Most of us are incredibly underpaid

2) A common question, when asking candidates whose preparation resulted in
offers, is "How many problems did you do on leetcode?" I'd never heard of
leetcode, but it seems that if you want an offer from FANG, Uber, Lyft, etc.,
then you put your time in practicing programming problems.

~~~
joehahart
What's Blind?

~~~
Etheryte
Looks like an anonymous work oriented chat app:
[https://www.teamblind.com](https://www.teamblind.com)

------
gwbas1c
I don't think you understand the purpose of a whiteboard test. When I
interview candidates, I'm looking for a lot of things that can't be practiced:

\- How well can you converse about a technical topic

\- How adaptable are you to design constraints

\- Do you understand the fundamentals of threading, databases, data
structures, etc.?

\- Can you "think in code?"

\- How well do you really understand the language that you spent XX years of
your career working in

The best way to improve your skills is by doing: specifically, choose a hobby
project that involves an area that you want to learn. Reading a book will only
get you about 10-15% of the way there. Books are useful to choose a technology
to learn, but not to learn the specific technology.

Getting back to whiteboarding: I've used it to:

\- Reject candidates who forget basic fundamentals of a language that they
claim XX years in. (Seriously, if you can't construct an "if" statement or use
a common collection class in a language you claim XX years in, then you don't
belong on my team.)

\- Reject candidates who can't learn an unfamiliar API. This is critical,
because we need to discuss new APIs in design discussions; and because "old &
working" code isn't always worth refactoring to use some shiny new API.

\- Reject candidates who don't know how to use a database. (Frameworks / ORMs
are not a replacement for knowing how a database works and how to program with
one.)

\- Reject candidates who don't know fundamentals of threading

~~~
maxxxxx
You seem to have a reasonable approach to interviews. However, as a contractor
I was in interviews where I had to code the optimal solution for a certain
problem on the whiteboard without syntax errors. That's just silly.

There was no discussion to show any thought process. It was about knowing the
expected solution to 100%.

~~~
gwbas1c
Remember, interviews are a 2-way street. If the company has unrealistic
expectations in the interview, then it should influence _your_ _decision_ to
continue pursuing the job.

~~~
maxxxxx
It does influence the decision. But there is also a good chance that the
engineer doing the interview doesn't really reflect the work environment.

------
xet7
I found this blog post after also writing today about Time Well Spent
[https://blog.wekan.team/2018/02/time-well-
spent/index.html](https://blog.wekan.team/2018/02/time-well-spent/index.html)
and production setup for Wekan
[https://wekan.github.io](https://wekan.github.io) at AWS for thousands of
users
[https://github.com/wekan/wekan/wiki/AWS](https://github.com/wekan/wekan/wiki/AWS)
.

About optimization there is DTrace talks at
[http://dtrace.org/blogs/bmc/2018/02/03/talks/](http://dtrace.org/blogs/bmc/2018/02/03/talks/)
and related HN discussion at
[https://news.ycombinator.com/item?id=16303595](https://news.ycombinator.com/item?id=16303595)

I do remember Randal Schwartz saying on FLOSS Weekly
[https://twit.tv/floss](https://twit.tv/floss) that in his work he tries to
listen to developers and ask the right questions to see where the bottlenecks
are: is there some expensive database query, does something need to be cached,
etc.

------
pan69
I think that the whole interviewing and whiteboard frustration is because not
all software development is equal. I think software development is a spectrum,
with (let's say) on the one hand engineering (the academic side) and the other
development (let's call it the "craft" side). Most software
positions/roles/jobs sit somewhere in the middle or lean to one side or the
other.

Some companies are clearly academic, think of the Googles of this world.
Other companies are more crafty/creative, think e.g. of typical web-dev shops.
Games development is an interesting one.

As software developers we like to stick things in clearly defined boxes and
treat software development as one big box. But I think it isn't.

I believe that the frustration most people have with interviewing is that the
wrong style of interviewing is applied for a certain role. E.g. a company that
is clearly on the crafts/creative side is hiring like they're Google (because
they read about how Google does it on the Internet).

My experience is that in most situations whiteboard interviews don't
contribute a single thing. A good conversation about making software usually
does a lot more. But I guess for inexperienced interviewers, a whiteboard is
an easy tool to hide behind.

------
llaolleh
I don't think it's a complete waste of time to do these practice interview
problems. At the end of the day, it reinforces your understanding of core
computer science topics and helps you think outside the box (barring stupid
problems where you need to know a specific formula). Then there are situations
where you need a deep understanding of these algorithms to choose between them
for specific situations, irrespective of implementation.

------
fahadkhan
I find this useful
[https://airtable.com/shr5TdnpVYVTpeRrN/tbluCbToxQ2knSLhh](https://airtable.com/shr5TdnpVYVTpeRrN/tbluCbToxQ2knSLhh)

Companies interested in getting placed on it should raise PRs on
[https://github.com/poteto/hiring-without-
whiteboards](https://github.com/poteto/hiring-without-whiteboards)

------
dahart
> After stopping these practice, I leverage the spare time for following
> tasks: read C++ classical books and learn more about standard libraries

I might suggest this is the best way to study for whiteboard tests in the
first place. Practicing for them might only help if you have other people pick
the questions and watch you.

The last interview I had, I completely bombed the whiteboard test, but not for
technical reasons. I probably wouldn't have been able to practice my way out
of it. Luckily, the company also had a live coding test which I aced, and they
hired me, so it didn't matter.

As a manager and hiring interviewer, I have to say that I find whiteboard
tests moderately useful, even though I agree with pretty much all the
criticisms here so far.

The point of it is to try to see how the candidate thinks on their feet, watch
them talk their way through a problem without StackOverflow at their
fingertips. It almost never matters if the whiteboard code has bugs; it's more
about the process than the result.

------
ryandrake
I'm definitely not trying to be snarky or mean here--just offering my take on
this after reading the article. English is obviously the author's second
language, and I would offer that practicing English and learning better
fluency/grammar/spelling would help an order of magnitude more than practicing
computer science quizzes and coding challenges. Many of the candidates and
developers I encounter may or may not be good software engineers--I don't know
because their language skills are lacking, and they are unable to convey it
through spoken or written word.

I'm learning a second language myself and I feel your pain. I would never want
to interview in a second language. However, in the US workforce, English
proficiency is critical and such soft skills can often mean promotion into
management, increased responsibility, and technical leadership positions.

------
cube00
I'd rather have someone who knows which library already has the required
function implemented than someone who spends days enacting "not invented here".

------
Daycrawler
The first part of the article, where the author explains how time-consuming it
is to study algorithms, could really apply to any field of study. The fact
that the author spends much time learning algorithms doesn't mean that he's
"dumb", as he humbly presents himself; it just means that he's just getting
started. It's like picking up the guitar and, when you realize that it really
is hard and takes time, deciding after one month that you're not a talented
guitar player.

Then there's the second part, where the author considers algorithmic knowledge
useless (or not very useful, at best) because the tools available to a
programmer already implement any relevant algorithm.

There's an ironic parallel between this way of thinking and algorithmic
proficiency, actually. Many algorithms work by leveraging data structures
whose internal workings you don't need to understand. For example, one way to
efficiently merge N sorted lists into one big sorted list is to use a priority
queue. Do you need to know how a priority queue is implemented to find this
solution? Not at all. You just need to know the interface it offers, and the
complexity of each method. Finding the k-th smallest element in a list can be
done using a heap. Do you need to know how a heap is implemented to find this
solution? Not at all. You just need to know what a heap does.
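
Both examples can be sketched in a few lines with Python's heapq (purely
illustrative), treating the heap strictly as an interface:

```python
import heapq

# Merging N sorted lists via a priority queue: only the interface matters,
# never the heap's internals.
lists = [[1, 4, 7], [2, 5, 8], [3, 6, 9]]
merged = list(heapq.merge(*lists))
assert merged == sorted(sum(lists, []))

# k-th smallest element of an unsorted list via a heap.
xs = [7, 2, 9, 4, 1, 8]
k = 3
kth_smallest = heapq.nsmallest(k, xs)[-1]
assert kth_smallest == sorted(xs)[k - 1]
```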

Really, studying algorithms is like studying the C++ standard library, except
that instead of knowing about classes and methods as your toolbox, you know
data structures (and common patterns) as your toolbox. Of course any curious
mind will then go deeper and actually read about how those things are
implemented, building an even better understanding of the foundations, and
solving even deeper problems with it.

While it's an interesting parallel, this doesn't really answer the author's
question: what's the point? Which brings us to what the author forgot to
address in his article: the whiteboard test. The title of the article sets the
scene with someone wanting to find a job at one of those big tech companies
hiring people who can pass whiteboard tests, but who then somehow... forgets
about it and decides to abandon this endeavor? Well, fair enough, but to be
perfectly honest, while Google & Co. engineers certainly aren't spinning up
algorithms on a daily basis like mad computer scientists, the engineering
level there is still quite high. So there's definitely some basis for wanting
to pass the whiteboard interview, mostly working with intelligent people.

------
barkingcat
Practicing white board tests is more about learning composure, maintaining
cool under pressure (and under observation), and finding a way to think and
reason about problems that "shows your work."

Many times people can solve problems, but can't verbalise them enough to make
a white board test be an accurate indication of how they think.

It's not about practicing to solve any specific problem - that kind of
"practice" is counterproductive as the poster realised.

------
jfasi
I regularly conduct interviews at one of the more infamous "whiteboard
interview" companies. At the end of my interviews I like to take five to ten
minutes to answer whatever questions the candidate might have about myself,
the company, the process, etc. One time, a candidate who absolutely bombed the
interview asked me a question that expressed a similar sentiment: why do you
guys place such a heavy emphasis on basic data structures and algorithms when
standard libraries and software packages offer easy-to-use, efficient
implementations?

To be honest, this is a fair question. For someone whose entire career has
been spent coding on a single machine, with data that fits in memory, reusing
pre-written O(NlogN) or faster algorithms, often with a single thread, I can
see why they would ask this question.

The answer is: for what we do, there is no off the shelf solution. Our
datasets almost never fit on a single machine, to the point where we make
jokes about five terabytes of data being so small we forgot how to count that
low. Our algorithms are almost all novel: one of my favorite interview
questions starts off as a trivial string manipulation problem but then
branches out into easy-to-express, easy-to-understand variations that actually
require some very sophisticated algorithms to solve. When it comes to pre-
built software, even our internal turnkey database solutions are so
sophisticated they require a solid understanding of distributed systems and
operating systems to avoid common pitfalls.

Honestly, not every person is up to the task of working in this environment.
Our workforce skews towards people with degrees in CS from high-ranking
universities not because we're snobby but because there are few places that
teach this particular combination of skillsets. You can probably go your
entire career without working for us or on the sorts of problems we try to
solve, and you can just as well prepare for interviews by focusing on
practical matters and not deeper algorithms and data structures knowledge. And
that's fine, I'm sure there are plenty of positions out there for you.

But if you do, don't come complaining to me that our interview process is too
hard or the prep process is too impractical or that we're being unfair because
none of the stuff we test for is practical in the real world. You can have an
easy time prepping for my interview or a job at my company, but not both.

EDIT: There is something to be said for companies that cargo-cult this
interview process. True, if someone can pass this interview process they're
probably pretty decent, but you have to be honest with yourself about what
sort of problems you'll be solving. OP, meanwhile, expressed a feeling of
blanket pointlessness without saying what kind of job he's looking for. I hope
I've made clear this is counterproductive for companies like Facebook, Google,
Microsoft, etc.

~~~
ryandrake
I got one of these interviewers (but not at one of the big companies that
actually do things at scale), and stood my ground.

How would you sort this list? "Use the standard library and move on with my
life."

But I want you to show me... "Use the standard library and move on with my
life."

But what data structure would you use? "This one, because A, B, and C. It is
already provided, debugged, and tested, by my standard library."

How is that data structure implemented? "Doesn't matter. It works and the time
I spend not worrying about it I can spend shipping software."

\--

It actually worked. I think they respected the pragmatism. I actually went on
to help migrate them away from a _HORRIBLY BUGGY_ home-grown container/string
library, over to, you guessed it--the library that comes with the language.

~~~
oblio
I'd argue that for almost any company except the big 10 or so, that's the
correct approach. Especially for enterprise software. And even in those big
10, for projects which are not directly tied to the things they do at scale (I
doubt the Hangouts Android client needs to reinvent the wheel...), it's still
the right approach.

I'm kind of sick and tired of companies writing their own frameworks and
languages when they can't really maintain them, long term. Google, Facebook,
Apple, Microsoft can, because they have a different culture and primarily
because they have a different focus as well as enough profits to support it.
Random Java big-shop can't. Their framework will be nice and shiny the first
year and 15 years later you'll be wondering why you're working with the
atrocious Struts-1 inspired undocumented internal framework.

~~~
jfasi
> I doubt the Hangouts Android client needs to reinvent the wheel...

This actually touches on an interesting point. My notorious interview company
is Google. I'm not familiar enough with the Hangouts Android client to tell
you anything about it, but for the sake of argument let's unrealistically
assume it's trivial. Imagine you have an organization where some engineers are
good enough to work on the Hangouts Android client but not good enough to work
on, say, the F1 distributed database.

Internal transfers become crapshoots. Is this person good enough to work on my
new, highly complex project? Should I reinterview them to make sure they can
hack it? Imagine what this does to the social and cultural stratification.
"Oh, those Hangouts guys are nice but they're not that impressive, it's not
like they're working on the machine learning or anything." It'd be a nightmare
both practically for management and socially for the engineers.

Our philosophy is: all engineers should have the chops to quickly be able to
ramp up on whatever project they like. This means rejecting a lot of people,
but it also means that two engineers can look at one another across the
hallway and immediately know they could swap teams and not miss more than a
couple weeks of productivity. When framed this way, I think the interview
process is a natural consequence.

Not sure what the culture is at other places, but I hope this is a useful data
point.

~~~
oblio
The only way this actually works is because Google is sitting on a pile of
cash, in my opinion.

But regarding your problem, you could solve that by having internal
interviews.

And I really doubt that for any complex product people switching will only
miss a few weeks of productivity.

------
Double_a_92
You Don't Say?

------
thesmallestcat
This is like saying running for exercise is a waste of time because you could
be driving instead. It's not about knowing a particular sort algorithm. It's
about the discipline of solving performance problems in the small. Knowing how
quicksort works isn't that helpful. Being accustomed to the thought processes
that led to the development of quicksort is important in any non-trivial
programming activity. I'm not writing Google-scale services, but I regularly
encounter algorithm design problems on the job, and they're never the exact
algorithm you studied for some white board exam. I think the author is
approaching algorithm study with the wrong attitude.

~~~
noxToken
I took it to mean that most developers in the current market do not need to
know implementation-perfect algorithms and data structures like RB trees,
depth-first search, A*, quicksort, etc. Rather, your time is better spent
learning the advantages and disadvantages of the structures and algorithms;
you get more benefit from understanding the idea than from memorizing the
implementation. For example, merge sort divides a list into sublists until
you hit single-element lists, then recombines them into a single sorted list.
That, in my opinion, is more important than a picture-perfect implementation
on a whiteboard.
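
That description maps almost line-for-line onto code; a Python sketch
(illustration only):

```python
def merge_sort(xs):
    # Divide until we hit single-element (or empty) lists...
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # ...then recombine the sorted sublists into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 3, 8, 1, 2]) == [1, 2, 3, 5, 8]
```

Knowing that the "divide, then merge" idea is the whole trick matters far more
than reproducing this under whiteboard pressure.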

