So, you're a good developer who is smart and passed the interviews with some studying. That's exactly how it's supposed to work :).
You're right that the problems that come up in interviews typically aren't real-life problems. But that's not necessarily an issue. Are the skills they test (developing good algorithms) important for real life?
An interviewer doesn't ask you to develop an algorithm to return a random node from a binary tree because they particularly care whether you know THAT algorithm. They're asking it as a way of evaluating your skill at developing solutions to problems you haven't seen before.
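(As an aside, for that particular question the standard trick is reservoir sampling. A rough sketch, assuming a bare-bones Node class of my own invention:)

```python
import random

class Node:
    # Hypothetical minimal tree node; any binary tree class would do.
    def __init__(self, value, left=None, right=None):
        self.value = value
        self.left = left
        self.right = right

def random_node(root):
    """Return a uniformly random node via reservoir sampling:
    the k-th node visited replaces the current pick with
    probability 1/k, so no up-front node count is needed."""
    pick, seen = None, 0
    stack = [root]
    while stack:
        node = stack.pop()
        if node is None:
            continue
        seen += 1
        if random.randrange(seen) == 0:  # true with probability 1/seen
            pick = node
        stack.append(node.left)
        stack.append(node.right)
    return pick
```

The reservoir trick avoids storing subtree sizes; an augmented tree (each node tracking its subtree count) would instead give O(h) lookups. Knowing which trade-off to reach for is arguably the real thing being probed.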
I believe that's the most common argument made in defense of these questions, but therein lies the fallacy.
I'm not developing anything new when I implement a quicksort; Tony Hoare certainly was back in 1960 when he created it. The same goes for hashing; Hans Peter Luhn at IBM made a big leap when he first used hashes back in the early 1950s.
Since then, numerous people have made improvements to these (and other algorithms, of course), but by and large companies are just testing whether the developer they're interviewing can code up existing algorithms and explain their Big-O complexity.
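To put the "nothing new" point in perspective, here is roughly what a candidate is being asked to reproduce (a minimal sketch for clarity, not Hoare's original in-place partitioning scheme):

```python
def quicksort(items):
    """Textbook quicksort: pick a pivot, partition, recurse.
    Average O(n log n); worst case O(n^2) on bad pivot choices."""
    if len(items) <= 1:
        return items
    pivot = items[len(items) // 2]
    return (quicksort([x for x in items if x < pivot])
            + [x for x in items if x == pivot]
            + quicksort([x for x in items if x > pivot]))
```

This builds new lists for readability; an interviewer would likely also expect the in-place variant and the complexity analysis. Either way, it's recall of a known solution, not discovery.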
I think interviewees end up falling into four subsets:
I = { interviewees }
A = { i in I : bad programmer, can't solve the algorithm questions }
B = { i in I : good programmer, knows the algorithms }
C = { i in I : bad programmer, but knows the algorithms }
D = { i in I : good programmer, but doesn't know the algorithms }
It is subsets C and D that concern me.
In subset C we have someone who studied the algorithms really hard but just can't write good code or solve real, original problems. One answer to this is the "hire fast/fire fast" mentality, but in some organizations "fire fast" is not an option, so you would rather not hire them in the first place.
In subset D we end up missing out on good developers. In a bad job market there isn't much need to worry about this; if we miss one good developer, we'll easily find another. Right now, though, developers are in such high demand that I see this loss as a bigger deal.
I also think this type of interviewing works better for the tech giants (MS, GOOG, FB, etc.) than it would for a lot of smaller organizations. One set of good programmers will self-select out of interviewing at these companies because they don't believe they are "good enough" for MS (or the like). Another set of good developers, who do believe they are of high enough caliber for these organizations, will research the interview process beforehand and end up studying the right material before going in.
With smaller companies, or businesses without a centralized hiring process for developers, candidates may simply not know what to prepare for. And while an algorithm may seem obvious in hindsight, there is a reason nobody was using it until it was "discovered"; I can't reasonably expect someone to devise that kind of answer on the spot when I interview them.
Honestly, I would love to hear some ideas on this, as it has been a pretty major concern of mine for the last couple of years.