Because learning a bunch of algorithms doesn’t translate well into day-to-day programming capability. This is exactly why those dumb whiteboard problems aren’t good for interviews; you’re not measuring what makes someone successful on a day-to-day basis.
> This is exactly why those dumb whiteboard problems aren’t good for interviews; you’re not measuring what makes someone successful on a day to day basis.
If that were the case, do you think all of the biggest tech companies, with their billions of dollars, would continue using it? Did you ever consider that you might be wrong?
I mean, for a while there Google thought that asking how many basketballs could fit into a school bus was a good way to find good coders. Just because Google does it doesn’t mean it’s smart.
And when is the last time you had to balance a binary tree or write Dijkstra’s algorithm from memory anyway?
Curiously, in my non-coding management role, I busted out Dijkstra’s algorithm last year to the amazement of most of the developers, who thought the problem in front of them was insoluble. It turned out that yes, it is in the general case, but it was totally possible to brute force the results required for the constrained task at hand on a laptop in Python and store the ~60k optimal solutions in a lightning-fast key/value store.
I didn’t reproduce it from memory though (but I did choose to port a Java version in preference to any of the existing Python versions I found, because its code was structured in a way that I grokked much more easily. Possibly due to my not totally expert idiomatic Python skills; I’m really a Perl coder deep down.)
(And I never had to do any whiteboard performative coding to land this gig either.)
Interesting anecdote notwithstanding, a whiteboard interview would not have been a good way to predict your ability to have done that. The limiting factor was not your ability to reproduce the algorithm from memory, which you didn’t do, but the ability to make complex decisions based on real-world constraints and data. In a whiteboard situation, chances are the “correct” answer would be “it’s not possible”, since without knowing the actual data set that’s true.
Yeah - totally agree, this was just intended as a serendipitous anecdote. In this case it was, though, a clearly constrained problem space. For only a few hundred rarely changing nodes, an O(n^4) approach (running an O(n^2) shortest-path search against each of the n^2 possible pairs of nodes) is perfectly amenable to brute force and caching the results.
Totally wouldn't work for Google Maps doing shortest path routing globally, but for an event with only a couple of hundred POIs and intersections, it's a cheat that seems almost magical to people who have some concept of the algorithmic problem, but haven't worked out the "trick".
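For anyone curious, the "trick" above can be sketched in a few lines of Python. This is a hypothetical reconstruction, not the commenter's actual code: run Dijkstra's algorithm from every node of a small toy graph (made-up POI names), then cache every (source, destination) shortest distance in a plain dict standing in for the key/value store.

```python
import heapq

def dijkstra(graph, src):
    """Single-source shortest distances.

    graph maps node -> {neighbour: edge_weight}.
    """
    dist = {src: 0}
    heap = [(0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, already found a shorter path
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def precompute_all_pairs(graph):
    """One Dijkstra run per node; fine for a few hundred stable nodes."""
    cache = {}
    for src in graph:
        for dst, d in dijkstra(graph, src).items():
            cache[(src, dst)] = d
    return cache

# Toy event map: a handful of POIs/intersections with walking costs.
graph = {
    "gate":  {"stage": 4, "food": 2},
    "food":  {"gate": 2, "stage": 1},
    "stage": {"gate": 4, "food": 1, "exit": 5},
    "exit":  {"stage": 5},
}
cache = precompute_all_pairs(graph)
print(cache[("gate", "exit")])  # gate -> food -> stage -> exit = 2 + 1 + 5 = 8
```

At a couple of hundred nodes the whole cache is tiny, so every lookup after the precompute is a constant-time dict hit - which is exactly why it feels magical compared with re-running the search per query.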
If I were asked to whiteboard this, I would 100% be asking about the constraints of the dataset up front. (I'm not too sure I'd have been smart enough to do that a couple of decades back, when I last had a "perform for us, code-monkey!" type of job interview...)
Cynical response: Because that’s what the people already at the big company had to put up with during hiring, and damned if they’re gonna let any prospective newbies off with an easier recruiting process, no matter how broken and pointless it is...
> at least not one that’s possible to administer cheaply.
Ah, yes. AFAIK, behavioural interviewing works, but it does require training and thoughtfulness. You have to know what you're looking for, and ask questions specifically targeting that. You definitely cannot just throw a random employee into an interview with a list of questions and expect it to work.