OK, if what the smartest and best people are doing isn't the right solution for my problem, what is?
Come on; enough vague hand waving. How do you pick the right tech to use then?
Obviously only what "the smartest and best people are doing" to solve problems that are isomorphic to yours.
But first you also need to identify those people: who says the ones at the big companies (or smaller ones with better bloggers) are the "smartest and best people"? Just because they belong to a successful company? Or because they have degrees from some top-tier university?
The company could be successful despite its technology, on business value alone (which is usually how it goes), and the people with "good degrees" could just be architecture astronauts, or fresh amateurs who re-invent long-buried concepts because they don't know better.
There's a long distance between what the current industry champions as its "best minds" (some 20-year-old with 2 years of JS who created the framework du jour) and people like Knuth, Alan Kay, Kernighan, Ritchie, Bill Joy, and the like...
As it turns out, moving fast and breaking things isn't all that conducive to quality engineering. Facebook has a massive legacy code base maintained by an enormous team that is under pressure to iterate quickly while supporting what is arguably the world's largest user base. They are dealing with extremely specialized constraints that most of us will never encounter. Many of the trade-offs they consider acceptable aren't trade-offs that make sense elsewhere.
And that's exactly what the blog post is getting at. A lot of these emerging technologies are situationally useful in a particular set of scenarios and environments, but developers with cargo cult mentality convince themselves that these things will solve all of their problems if they use them everywhere. When the disappointment sets in and they figure out that isn't the case, yesterday's darling stack becomes the subject of today's "considered harmful" essays and everybody moves on to the next shiny thing.
The point here is that maybe you should make a sober assessment of new technologies and objectively evaluate whether they are actually practical for your usage scenario, rather than just blindly jumping on the bandwagon.
The examples from the blog post are totally on point. I previously worked at a company that built a NoSQL database and I have great affinity for the technology, but there's no question that too many people adopted NoSQL databases without really understanding the trade-offs. People want to be "web scale," so they go straight to databases designed for high-availability clustering even when they are building applications that would be perfectly fine running forever on a single Postgres instance.
In my experience the choice of the right tech is primarily a business decision rather than a technical one. By that I mean it should be based on a number of factors external to the technology. Once you have clearly identified the business problem you are trying to solve, you have to think of the solution in terms of risk. For example, what is the up-front development risk? What are the maintenance risks?
Every project exists within a certain environment with its own set of constraints. Picking the right solution means understanding these and doing so from the point of view of the business.
If you understand the risks and constraints, and you put the needs of your end-users, the systems administrators, the testers, and the business first you will rarely go off-track.
Of course, there's the opposite extreme too, where long-time team members and management retreat into an ivory tower, micromanage every tech choice, and then prescribe it throughout an organisation without much input.