Of course it isn't. An interview process that requires/rewards cramming for months ahead of time is fundamentally broken.
I just gave up on getting hired at any tech company that uses whiteboard interviewing. It seems like, just as in college where there's always that one kid who aces every test and wrecks the curve for everyone else, in any candidate pool there is always one demigod of algorithms (and it isn't you, you're merely great) that spends every waking hour of his/her life on hackerrank and topcoder.
Whatever, I spend my time on side projects, learning new things, family, and the occasional bit of fun and relaxation. And I still manage to stay employed (so far).
Next time I find myself in a whiteboard interview, I'm thinking of tying a blindfold around my head, throwing out some profound-sounding Knuth quote, and then drawing some illegible scrawl all over the board. The challenge will be in keeping a straight face.
I think this is the only way to fight the white-boarding standard interview. Stop applying to these jobs.
I know the allure of a big paycheck is too much for most people, but why are you OK with going through these ridiculous circus acts just for the "privilege" of working for Big N?
> I know the allure of a big paycheck is too much for most people, but why are you OK with going through these ridiculous circus acts just for the "privilege" of working for Big N?
Cause having a lot of money changes your life and it's not worth giving that up for a bit of pride.
Also working for "Big N" could give you access to very interesting projects, either while you work there or after you leave, since brands do matter and your CV will be more attractive to other companies.
> Also working for "Big N" could give you access to very interesting projects
Maybe. The people I know who leave the big companies almost always do so because the work ends up becoming somewhat rote -- there is a lot of boring infrastructure to be maintained at big tech company du jour.
Seems like the more employers complain about the lack of available labor, the more they ratchet up the pointless interview hazing. It's hard to take their claims of 'no one is available' seriously when they're so capricious in tossing away potential employees.
One thing I've noticed: it's often not the employer at fault but the employees. Engineers reject candidates for extremely pedantic reasons (for example, he used a dict instead of a defaultdict, which some will take as a signal that the developer doesn't know Python). I suspect that if you were to interview current employees, 50% of them would fail the interview.
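To illustrate how thin that signal is, here's a quick sketch (the word-counting scenario is hypothetical, not from the comment) showing that a plain dict and a defaultdict express exactly the same logic in Python:

```python
from collections import defaultdict

words = ["spam", "eggs", "spam"]

# Plain dict: perfectly idiomatic with .get()
counts = {}
for w in words:
    counts[w] = counts.get(w, 0) + 1

# defaultdict: the same logic, marginally terser
counts2 = defaultdict(int)
for w in words:
    counts2[w] += 1

assert counts == dict(counts2)  # both produce {"spam": 2, "eggs": 1}
```

Rejecting a candidate over which of these two they reached for says nothing about whether they can program.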
The worse the programmer, the more biting their questions. They'll google some trivia before the meeting to look smart, then reject the candidate to feel good about themselves.
>You just need to call std::sort, and don’t care it is “bubble sort” or “quick sort” under the hood
This is often the case, but not always. When you know things about the input data you may be able to get better performance than the generic sorts built into a language. Or, when you know more about sorting algorithms you may be better able to choose among the available sorts/sort options provided by a language or library.
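As a sketch of that point (the counting-sort example is mine, not the commenter's): if you know every key is a small non-negative integer, a counting sort runs in O(n + k) time rather than the O(n log n) of a generic comparison sort:

```python
def counting_sort(values, max_value):
    """Sort non-negative integers <= max_value in O(n + max_value) time.

    This beats a generic comparison sort when max_value is small relative
    to len(values) -- exactly the kind of knowledge about the input data
    that a language's built-in sort cannot assume.
    """
    counts = [0] * (max_value + 1)
    for v in values:
        counts[v] += 1
    out = []
    for v, c in enumerate(counts):
        out.extend([v] * c)
    return out

assert counting_sort([3, 1, 2, 1, 0], 3) == [0, 1, 1, 2, 3]
```

For most code the built-in sort is still the right call; knowing when it isn't is the point.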
I don't understand the attitude of people willfully choosing not to understand their craft better. Yes we can all be mediocre by just using libraries that other people made, but
1. Someone has to make the libraries
2. The average quality of software is not very good; it's very often slow (which is astounding given how fast the hardware is) and buggy, often with disastrous consequences, so we should all be striving to do better than average.
In the case of sorting algorithms, it is important to know the trade-offs. However, precisely knowing the details, e.g. the details of deleting a node from a balanced tree, is not really useful, and that is exactly what is needed to pass whiteboard tests.
In most cases (in my experience), I can always look up the details of some algorithm when I need it. Admittedly, my job doesn't require me to design algorithms, though it sometimes requires implementing some that aren't readily available.
For sure, I'm not arguing that no whiteboard interview questions are crazy; I realize some are.
But also I'm not sure how common it is to have interview questions such as "delete a node from a balanced tree" where they actually expect you to get that right vs just see if you get the basic idea. I've never run into it. Perhaps many people on hackernews work in silicon valley and that is more common there.
Which is quite right. At my first job we didn't get our very expensive engineers (overhead rate was > 500%) to write our own Fourier analysis; we just bought one from NAG.
These are the same people who want "intuitive" understanding of fundamentally mathematical subjects. Anytime I see a title, "X without math," I just see "X without X."
Intuition isn't understanding and ignorance isn't expedience. I think your choice of the word craft really homes in on what we are supposed to be doing. Building things well with craft. There is a place for duct tape and wire, but the craft of software matters.
I understand both of your arguments. In the end I hate whiteboarding problems, because I don't really need to know all of the potential problems in detail, but a certain interest for why some algorithm is faster than another IS indeed helpful when programming fast applications.
I have done some optimizations in the past, and I think what the author is getting at, is that it's not too late to learn about the specific niche algorithm you're dealing with when you have it in front of you.
My favorite rule about optimization is one I learned from a former colleague. First you measure, then you find where you should look, then you optimize. Without measurements you're maybe doing more harm than good. And if you look up different solutions when you know what you're trying to optimize it's usually not that hard to find a better replacement. But you don't need to have the exact solution in your head for this.
What you do need is the rough cursory knowledge of algorithms that grants you the ability to see more quickly where you're losing precious cycles. E.g. knowing the tradeoffs between linked lists and vectors.
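A minimal Python sketch of that "measure first" discipline (the functions and sizes here are illustrative): time the operation you actually care about before concluding which structure is faster:

```python
import timeit
from collections import deque

def front_inserts_list(n):
    xs = []
    for i in range(n):
        xs.insert(0, i)   # O(n) shift per insert -> O(n^2) total
    return xs

def front_inserts_deque(n):
    xs = deque()
    for i in range(n):
        xs.appendleft(i)  # O(1) per insert
    return xs

# Measure before optimizing anything
t_list = timeit.timeit(lambda: front_inserts_list(10_000), number=3)
t_deque = timeit.timeit(lambda: front_inserts_deque(10_000), number=3)
print(f"list: {t_list:.3f}s  deque: {t_deque:.3f}s")
```

The rough knowledge of the tradeoff tells you where to look; the measurement tells you whether it actually matters in your code.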
I've seen way too many tools that could be sped up by a factor of 30-100 in an afternoon, so I really have to agree with you that a lot of people don't know their craft. I just don't think rote memorization of algorithm Q&A is solving any of this. Take-home exams IMHO are a much better measure. I can ask people why they chose a certain data structure, or why they didn't do it another way (if I have an idea of what could be better). That gives me much more insight into how people think, not how well they memorize things.
He doesn't seem to be saying that he chooses not to understand but rather he's trying to optimize his time to improve in the context of his current knowledge. This is understandable and it's important to know what's in the standard library in order to be productive. As he gains experience I suspect his curiosity will lead to deeper exploration.
I feel like #2 is caused more because people try to be #1. Instead of letting good programmers develop well-established libraries that everyone uses - you end up with NIHS and a bunch of mediocre developers creating mediocre libraries that only they use. I became a better Javascript developer when I began using well maintained libraries to solve problems instead of wasting time reinventing the wheel every time I had a problem that needed to be solved.
Time is hard. I don't like to deal with localizing time, formatting time, or converting time. Instead I use a library [0] to solve that problem for me. Because while I may need to localize time - localizing time is not the problem I am trying to solve. It's just a roadblock in the process of solving my actual problem.
I've been in the business for ~20 years and there are very few cases I can recall where it really mattered what the underlying sorting algorithm was, and no cases where the actual details of the algorithm's implementation mattered. Not saying these cases never come up--I am saying they are exceedingly rare [EDIT: and often company-specific], and making them general candidate interview questions might be a bit of a waste of time.
Present a candidate with a simple system with three levels of middleware API and have them plumb a newly exposed low-level value up through all three levels to the high-level API. This is unfortunately 75% of professional software development.
We've reached a point of complexity where few if any people hold the whole stack - from electrons wiggling along to making the button more enticing to click - in their head. You have to eventually decide where you'll settle in the layers of abstraction and let others handle their own.
There is a lot of value in knowing that some algorithms are better for certain problems than others. But knowing them to the level that you can whiteboard them quickly without mistakes in an interview is just an exercise in memorization.
Is this really news to any of us? I figured most people here understood that algorithm/whiteboard test prep was pretty much standard, but incredibly pointless.
A whiteboard test is probably the worst way to determine somebody's grasp of computer science fundamentals, especially when what really matters is whether they can apply those fundamentals in the proper context.
I recently joined Blind and mostly follow the compensation discussions. Two takeaways:
1) Most of us are incredibly underpaid
2) A common question, when people ask how candidates prepared for interviews that resulted in offers, is "How many problems did you do on LeetCode?" I'd never heard of LeetCode, but it seems if you want an offer from FANG, Uber, Lyft, etc., then you put your time in practicing programming problems.
Ultimately I don't think this is what OP is saying, just that whiteboarding leetcode/hackerrank algos isn't the best use of time. But I'll respond to the takeaways nonetheless, because I have spent a lot of time on them:
1) By what metric? Also, who's "we"? I am a full-stack engineer and feel properly compensated, if not overpaid, given my expectations.
2) Leetcode has been extremely helpful. Not just in interviewing, but also in understanding how to have more granular control of space/performance. However, it is not the whole picture and I would point people to the famous coding university github: https://github.com/jwasham/coding-interview-university
and further push people to follow research and coders they admire such as Peter Norvig: http://norvig.com/
There was a recent discussion here about developer compensation likely following a bi-modal distribution. I'm inclined to believe it, so IMO, 1) most of us are underpaid.
To go from 100k to 200k seems like it would take years unless you jump to one of the FANG companies
I don't think you understand the purpose of a whiteboard test. When I interview candidates, I'm looking for a lot of things that can't be practiced:
- How well can you converse about a technical topic
- How adaptable are you to design constraints
- Do you understand the fundamentals of threading, databases, data structures, etc.?
- Can you "think in code?"
- How well do you really understand the language that you spent XX years of your career working in
The best way to improve your skills is by doing: specifically, choose a hobby project that involves an area that you want to learn. Reading a book will only get you about 10-15% of the way there. Books are useful to choose a technology to learn, but not to learn the specific technology.
Getting back to whiteboarding: I've used it to:
- Reject candidates who forget basic fundamentals of a language that they claim XX years in. (Seriously, if you can't construct an "if" statement or use a common collection class in a language you claim XX years in, then you don't belong on my team.)
- Reject candidates who can't learn an unfamiliar API. This is critical, because we need to discuss new APIs in design discussions; and because "old & working" code isn't always worth refactoring to use some shiny new API.
- Reject candidates who don't know how to use a database. (Frameworks / ORMs are not a replacement for knowing how a database works and how to program with one.)
- Reject candidates who don't know fundamentals of threading
You seem to have a reasonable approach to interviews. However, as a contractor I was in interviews where I had to code the optimal solution for a certain problem on the whiteboard without syntax errors. That's just silly.
There was no discussion to show any thought process. It was about knowing the expected solution to 100%.
Remember, interviews are a 2-way street. If the company has unrealistic expectations in the interview, then it should influence your decision to continue pursuing the job.
I do remember Randal Schwartz mentioning on FLOSS Weekly (https://twit.tv/floss) that in his work he tries to listen to developers and ask the right questions to see where the bottlenecks are. Is there some expensive database query, does something need to be cached, etc.
I think that the whole interviewing and whiteboard frustration exists because not all software development is equal. Software development is a spectrum, with (let's say) engineering (the academic side) on one end and development (let's call it the "craft" side) on the other. Most software positions/roles/jobs sit somewhere in the middle or lean to one side or the other.
Some companies are clearly academic; think of the Googles of this world. Other companies are more crafty/creative; think e.g. of typical web-dev shops. Games development is an interesting one.
As software developers we like to stick things in clearly defined boxes and treat software development as one big box. But I think it isn't.
I believe that the frustration most people have with interviewing is that the wrong style of interviewing is applied for a certain role. E.g. a company that is clearly on the crafts/creative side is hiring like they're Google (because they read about how Google does it on the Internet).
My experience is that in most situations whiteboard interviews don't contribute a single thing. A good conversation about making software usually does a lot more. But I guess for inexperienced interviewers, a whiteboard is an easy tool to hide behind.
I don't think it's a complete waste of time to do these practice interview problems. At the end of the day it reinforces your understanding of core computer science topics and helps you think out of the box (barring stupid problems where you need to know a specific formula). Then there are situations where you need a deep understanding of these algorithms to choose one for a specific situation, regardless of implementation.
> After stopping these practice, I leverage the spare time for following tasks: read C++ classical books and learn more about standard libraries
I might suggest this is the best way to study for whiteboard tests in the first place. Practicing for them might only help if you have other people pick the questions and watch you.
The last interview I had, I completely bombed the whiteboard test, but not for technical reasons. I probably wouldn't have been able to practice my way out of it. Luckily, the company also had a live coding test which I aced, and they hired me, so it didn't matter.
As a manager and hiring interviewer, I have to say that I find whiteboard tests moderately useful, even though I agree with pretty much all the criticisms here so far.
The point of it is to try to see how the candidate thinks on their feet, watch them talk their way through a problem without StackOverflow at their fingertips. It almost never matters if the whiteboard code has bugs, the question is more about the process, not the result.
I'm definitely not trying to be snarky or mean here--just offering my take on this after reading the article. English is obviously the author's second language, and I would offer that practicing English and learning better fluency/grammar/spelling would help an order of magnitude more than practicing computer science quizzes and coding challenges. Many of the candidates and developers I encounter may or may not be good software engineers--I don't know because their language skills are lacking, and they are unable to convey it through spoken or written word.
I'm learning a second language myself and I feel your pain. I would never want to interview in a second language. However, in the US workforce, English proficiency is critical and such soft skills can often mean promotion into management, increased responsibility, and technical leadership positions.
The first part of the article, where the author explains how time-consuming it is to study algorithms, can really be applied to any field of study. The fact that the author spends much time learning algorithms doesn't mean that he's "dumb", as he humbly presents himself; it just means that he's getting started. It's like picking up guitar, realizing that it really is hard and takes time, and deciding after one month that you're not a talented guitar player.
Then there's the second part, where the author considers algorithmic knowledge useless (or not very useful, at best) because the tools available to a programmer already implement any relevant algorithm.
There's some ironic parallel between this way of thinking and algorithmic proficiency, actually. Many algorithms work by leveraging data structures whose internal workings you don't need to understand. For example, one way to efficiently merge N sorted lists into one big sorted list is to use a priority queue. Do you need to know how a priority queue is implemented to find this solution? Not at all. You just need to know the interface it offers, and the complexity of each method. Finding the k-th smallest element in a list can be done using a heap. Do you need to know how a heap is implemented to find this solution? Not at all. You just need to know what a heap does.
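Both examples can be written against Python's heapq module purely through its interface, without knowing how the heap is implemented:

```python
import heapq

# Merge N sorted lists using a priority queue -- heapq.merge does the
# k-way merge lazily, keeping only one head element per list in the heap.
lists = [[1, 4, 7], [2, 5, 8], [3, 6, 9]]
merged = list(heapq.merge(*lists))

# k-th smallest element of an unsorted list via a heap
values = [7, 2, 9, 4, 1, 8]
k = 3
kth_smallest = heapq.nsmallest(k, values)[-1]

print(merged)        # [1, 2, 3, 4, 5, 6, 7, 8, 9]
print(kth_smallest)  # 4
```

Knowing that a heap gives O(log n) push/pop is what lets you pick it here; the sift-up/sift-down mechanics can stay behind the interface.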
Really, studying algorithms is like studying the C++ standard library, except that instead of knowing about classes and methods as your toolbox, you know data structures (and common patterns) as your toolbox. Of course any curious mind will then go deeper and actually read about how those things are implemented, building an even better understanding of the foundations, and solving even deeper problems with it.
While it's an interesting parallel, this doesn't really answer the author's question: what's the point? Which brings us to what the author forgot to address in his article: the whiteboard test. The title of the article sets the scene with someone wanting to find a job at one of those big tech companies that hire people who can pass whiteboard tests, but then he somehow... forgets about it and decides to abandon the endeavor? Well, fair enough, but to be perfectly honest, while Google & Co. engineers certainly aren't spinning up algorithms on a daily basis like mad computer scientists, the engineering level there is still quite high. So there's definitely some basis for wanting to pass the whiteboard interview: mostly, working with intelligent people.
Practicing white board tests is more about learning composure, maintaining cool under pressure (and under observation), and finding a way to think and reason about problems that "shows your work."
Many times people can solve problems, but can't verbalise their thinking well enough for a white board test to be an accurate indication of how they think.
It's not about practicing to solve any specific problem - that kind of "practice" is counterproductive as the poster realised.
I regularly conduct interviews at one of the more infamous "whiteboard interview" companies. At the end of my interviews I like to take five to ten minutes to answer whatever questions the candidate might have about myself, the company, the process, etc. One time a candidate who absolutely bombed the interview asked me a question that expressed a similar sentiment: why do you guys place such a heavy emphasis on basic data structures and algorithms when standard libraries and software packages offer easy to use, efficient implementations?
To be honest, this is a fair question. For someone whose entire career has been spent coding on a single machine, with data that fits in memory, reusing pre-written O(N log N) or faster algorithms, often with a single thread, I can see why they would ask this question.
The answer is: for what we do, there is no off the shelf solution. Our datasets almost never fit on a single machine, to the point where we make jokes about five terabytes of data being so small we forgot how to count that low. Our algorithms are almost all novel: one of my favorite interview questions starts off as a trivial string manipulation problem but then branches out into easy-to-express, easy-to-understand variations that actually require some very sophisticated algorithms to solve. When it comes to pre-built software, even our internal turnkey database solutions are so sophisticated they require a solid understanding of distributed systems and operating systems to avoid common pitfalls.
Honestly, not every person is up to the task of working in this environment. Our workforce skews towards people with degrees in CS from high-ranking universities not because we're snobby but because there are few places that teach this particular combination of skillsets. You can probably go your entire career without working for us or on the sorts of problems we try to solve, and you can just as well prepare for interviews by focusing on practical matters and not deeper algorithms and data structures knowledge. And that's fine, I'm sure there are plenty of positions out there for you.
But if you do, don't come complaining to me that our interview process is too hard or the prep process is too impractical or that we're being unfair because none of the stuff we test for is practical in the real world. You can have an easy time prepping for my interview or a job at my company, but not both.
EDIT: There is something to be said for companies that cargo cult this interview process. True, if someone can pass this interview process they're probably pretty decent, but you have to be honest with yourself about what sort of problems you'll be solving. OP, meanwhile, expressed a feeling of blanket pointlessness without saying what kind of job he's looking for. I hope I've made clear this is counterproductive for companies like Facebook, Google, Microsoft, etc.
I got one of these interviewers (but not at one of the big companies that actually do things at scale), and stood my ground.
How would you sort this list? "Use the standard library and move on with my life."
But I want you to show me... "Use the standard library and move on with my life."
But what data structure would you use? "This one, because A, B, and C. It is already provided, debugged, and tested, by my standard library."
How is that data structure implemented? "Doesn't matter. It works and the time I spend not worrying about it I can spend shipping software."
--
It actually worked. I think they respected the pragmatism. I actually went on to help migrate them away from a HORRIBLY BUGGY home-grown container/string library, over to, you guessed it--the library that comes with the language.
I'd argue that for almost any company except the big 10 or so, that's the correct approach. Especially for enterprise software. And even in those big 10, for projects which are not directly tied to the things they do at scale (I doubt the Hangouts Android client needs to reinvent the wheel...), it's still the right approach.
I'm kind of sick and tired of companies writing their own frameworks and languages when they can't really maintain them, long term. Google, Facebook, Apple, Microsoft can, because they have a different culture and primarily because they have a different focus as well as enough profits to support it. Random Java big-shop can't. Their framework will be nice and shiny the first year and 15 years later you'll be wondering why you're working with the atrocious Struts-1 inspired undocumented internal framework.
> I doubt the Hangouts Android client needs to reinvent the wheel...
This actually touches on an interesting point. My notorious interview company is Google. I'm not familiar enough with the Hangouts Android client to tell you anything about it, but for the sake of argument let's unrealistically assume it's trivial. Imagine you have an organization where some engineers are good enough to work on the Hangouts Android client but not good enough to work on, say, the F1 distributed database.
Internal transfers become crapshoots. Is this person good enough to work on my new, highly complex project? Should I reinterview them to make sure they can hack it? Imagine what this does to the social and cultural stratification. "Oh, those Hangouts guys are nice but they're not that impressive, it's not like they're working on the machine learning or anything." It'd be a nightmare both practically for management and socially for the engineers.
Our philosophy is: all engineers should have the chops to quickly be able to ramp up on whatever project they like. This means rejecting a lot of people, but it also means that two engineers can look at one another across the hallway and immediately know they could swap teams and not miss more than a couple weeks of productivity. When framed this way, I think the interview process is a natural consequence.
Not sure what the culture is at other places, but I hope this is a useful data point.
So you admit your interview process is broken, and to this day you guys have not found a better way to interview given the problems your company has?
There's nothing broken about it. If anything, I'd say it's a great success: from the set of people I've seen, the people who don't get hired get rejected because they exhibit real deficiencies across multiple interviewers in areas that are relevant to our work. The people my team and others have hired are consistently capable of doing both the day-to-day work we need done and the once-in-a-career stretch challenges they encounter. They are also independent and don't require months of basic training before they start being useful.
If my comment expresses any frustration it's with people who fail to see the point in preparing for anything more difficult than "can you use the standard library and write a for loop" type interviews and then turn around and complain about not getting hired at selective organizations that do very hard things. And then they go and issue a blanket denunciation of a process that consistently meets organizations' expectations.
This is like saying running for exercise is a waste of time because you could be driving instead. It's not about knowing a particular sort algorithm. It's about the discipline of solving performance problems in the small. Knowing how quicksort works isn't that helpful. Being accustomed to the thought processes that led to the development of quicksort is important in any non-trivial programming activity. I'm not writing Google-scale services, but I regularly encounter algorithm design problems on the job, and they're never the exact algorithm you studied for some white board exam. I think the author is approaching algorithm study with the wrong attitude.
I took it to mean that most developers in the current market do not need to know implementation-perfect algorithms and data structures like RB trees, depth-first search, A*, quicksort, etc. Rather, your time is better spent learning the advantages and disadvantages of those structures and algorithms; you get more benefit from that understanding than from memorizing implementations. For example, merge sort divides a list into sublists until you hit single-element lists, then recombines them into a single sorted list. That, in my opinion, is more important than a picture-perfect implementation on a whiteboard.
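That description of merge sort can be sketched in a few lines of Python; the divide-and-recombine idea is the part worth internalizing, not the exact code:

```python
def merge_sort(xs):
    """Split the list into halves until single elements remain,
    then merge the sorted halves back together."""
    if len(xs) <= 1:
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])
    right = merge_sort(xs[mid:])
    # Recombine two sorted sublists into one sorted list
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

assert merge_sort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]
```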
I think it’s more akin to applying for a driving job but they test you on your running rather than your driving. Where you should be practicing your running anyway to maintain your ability to be an alert and healthy driver, but it’s not likely necessary for every driver to be in tip top physical shape to be a good driver.
Some of the interviews I've either participated in or turned down at lower-paid shops have had more stringent requirements, like weird whiteboard questions, despite the actual work being simpler (including shops where I found you won't even be issued your own machine; you pair exclusively and share), than at places where the work is significantly more difficult, the pay significantly better, the environment more adult, and the perks better.
I know FANG et al employ those tests for various reasons. I think a lot of other places just blindly emulate it.