> Jeff Bezos has all these incredibly intelligent, experienced domain experts surrounding him at huge meetings, and on a daily basis he thinks of shit that they never saw coming.
This isn't a story about Bezos' genius. It's a story about bad corporate culture.
What's really happening is that people are afraid to think in the presence of the boss -- not just afraid to express their thoughts, but afraid to have them. All the boss has to do in those meetings is state the obvious, and all the courtiers fall around in awe.
And indeed it's explicit in this story as well: note the "jury of VPs" who wait for a clue from the boss to know whether they can laugh, and the people who come back from a meeting with the boss "licking their wounds" and going back to "a cave".
Obviously Bezos is very smart; it's possible he's "the smartest guy in the room" most times; and it's likely he doesn't have a lot of patience for fools. But what this shows is people are terrified of him, and he likes it, or at least he doesn't do much to correct it.
> What's really happening is that people are afraid to think in the presence of the boss
Is it too much to ask they do the thinking before meeting him?
It seems clear to me that the process that Bezos has institutionalized for meetings at Amazon aims to achieve exactly that.
A person who is senior enough to report or present to Bezos is probably involved in making huge, high-pressure decisions. If one can handle those, one can handle a meeting with Bezos, and vice versa.
I think the point is they may have had the thought already, but are afraid to express it, for fear of getting part of it wrong and suffering a reappraisal.
Yes I read this at the time and felt exactly as you do. There's no evidence of special intelligence in the story he tells. It's a story only someone who has been ever-so-slightly brainwashed could tell.
There is of course evidence of Bezos' intelligence in the fact of Amazon.
To what extent is the success of Amazon (I suppose that success is what you mean by 'the fact of Amazon') attributable to intelligence? Might there be some survivorship bias or halo effect at work? Nassim Taleb writes that if you place a sufficiently large number of monkeys in front of typewriters, one will end up writing the Iliad.
And yet they continue to go from strength to strength. I think you're right, and that the logical outcome should be poor company performance, but clearly that's not the case.
> the logical outcome should be poor company performance, but clearly that's not the case.
But how many years have passed for us to reach this conclusion? The same was said about Richard Fuld and Lehman Brothers under his leadership, but it didn't end well.
To clarify, I am not predicting a poor performance by Amazon here, but to think that they are infallible just because they are doing great now is incorrect. When and if they falter, these stories will provide ample ammunition for the people who "didn't see it coming".
A healthy work environment is not economically optimal. Not even close. It is optimal to take as much from your employees as possible and to give them as little as possible in return; the opposite of a healthy environment and healthy workforce. Amazon practices this efficiency behind the developer console as much as they do in their warehouses. They just can't treat their developers quite as badly as the warehouse workers have it yet, because of a temporary market inefficiency -- a dearth of skilled developers.
The workplace is an area where, if we let the market do its work, it causes social and moral catastrophe and forces all but the tiny few at the top to live in complete misery. That is why we fought and died over hundreds of years against optimal economic efficiency, for worker protections, unions, weekends, worker's compensation, overtime pay, etc.
It's hard to say actually - sorta like "google is super successful and gives out outlandish perks/benefits to its employees ergo it's good to provide such perks/benefits".
Where is the direction of causality here, and is there even such? Maybe the perks in the above example actually harm google but their basic business is so strong as to cover for it (just an example, that's not what I'm actually arguing)?
What's probably crucial to understanding this favourable appraisal of Bezos is the "reply-all" gaffe that he mentions before the war story.
After speaking candidly about Bezos as a control freak and a tyrant in the original post (1), it's no surprise he followed up with a gushing and face-saving post about his genius and prescience.
I don’t know. I have experienced Amazon and think it is a plague of a company. If it is not the worst company culture in the world, I shudder to think what could possibly supersede it. With that said, I still have a pretty high opinion of Bezos. I just think he made a handful of mistakes that may have accelerated his success at the expense of some long term stability (yes I think Amazon will eventually see a fiery implosion). Intelligent people still make mistakes.
I think you're misinterpreting the context here. This is a followup to a post where he did say some unflattering things about Amazon. It was intended as an internal communication, but he accidentally made it public. So this post is damage control. "Sorry guys, I don't do that sort of thing on purpose, here's a flattering story to even things out."
That rant was hilarious. Nearly everything he claims Google does right and Amazon does wrong, ends up screwing Google, and makes Amazon more profitable.
Expecting a generalist software engineer to know data mining and machine learning (well) is about equivalent to expecting your primary doctor to be able to do cardiology and heart surgery.
I expect a generalist doctor to have a good understanding of those things, though she might not be practiced in specific surgical techniques. A primary care physician who doesn't know those things is a specialist in primary care who will have difficulty helping patients make decisions about more specialized care.
Similarly a generalist software engineer should know what things are realistic and possible with data mining and machine learning. Just like they should know what is possible and realistic in cpu design or network protocols. They might not know all the latest tricks but they will be able to design and advise on systems which depend on these capabilities.
> Similarly a generalist software engineer should know what things are realistic and possible with data mining and machine learning. Just like they should know what is possible and realistic in cpu design or network protocols. They might not know all the latest tricks but they will be able to design and advise on systems which depend on these capabilities.
I'm sorry to say this, but your expectations are insane.
Or maybe I'm misinterpreting you. I know that a CPU has pipelines, multiple cores, an ALU, a MMU, a FPU, several levels of caches, etc., but I have no idea what's "possible or realistic" in CPU design. At least not in any way that I'd be able to argue toe to toe with an actual hardware engineer working on CPUs.
I also know about network protocols, L1/L2/L3/L4(7), IP, TCP vs UDP, etc., but same thing, a real network engineer would wipe the floor with me regarding "possible and realistic" network protocols.
Same for data mining or machine learning. Sure, if you held a gun to my head I could probably design something, but I definitely wouldn't feel confident going to production with it in any serious capacity unless I consulted some people who actually know the field.
This field is way too broad for 1 person to cover everything at a decent level of competence. I think that people who think otherwise are deluding themselves.
Or you're thinking about a generalist providing a shallow level of advice. Maybe that could apply, but I don't know who that would help...
> Similarly a generalist software engineer should know what things are realistic and possible with data mining and machine learning.
This is impractical. A generalist software engineer should know that machine learning is possible, and have some idea of whom they would ask for help if it became relevant. They should be able to design systems which depend on these capabilities in collaboration with a domain specialist, and based on that specialist's advice.
In this context, ML is like FPGA design, cryptography, network security, or embedded real time systems. A generalist should know they exist and that they contain hard problems.
Likewise, a GP does not in general have a good understanding of cardiology. Rather, they have a good understanding of indications from basic tests that mean they should send their patient to a specialist.
I suspect you've never been referred to a cardiologist by a General Practitioner. Even if she did know the field, her discussions with you would be very limited. Her job here is to know who the good cardiologists are, and to set up the referral.
I absolutely hate this story every time I see it. It's so filled with hyperbole and general fawning that it contains nothing useful. And I am sure Bezos is indeed intelligent, but this story's climax is him pointing out something relatively mundane in what I am sure was a mundane presentation. And I still have the same question I did when I first saw this on Hacker News: who is Steve Yegge, and why should I care?