Trying to restart an innovation engine in a large old company like this seems to be nearly impossible.
I’ve been a part of a number of startups that were gobbled up by big companies trying to innovate their business, and it always seems to fail.
The problem I continually see is that in big companies anyone who sticks around and does “something” gets promoted. Over time, as the good people leave, you end up with a layer of really bad managers and directors who effectively block innovation. They’ve spent their careers being shills and are threatened by anyone innovating.
> Trying to restart an innovation engine in a large old company like this seems to be nearly impossible.
It's super hard and I've thought about it for years (even though I can't imagine working for a big company -- even with my own companies I leave when the headcount gets to about 200). But I've wondered if there's a kind of "Hayflick limit" for corporations too.
Consider Intel. One way to get the old spark back would be to simply start a new company. That takes a lot of capital and involves a lot of risk. The only tech company I know that does this well is Cisco; the pattern has also emerged, less intensively, in pharma. Other than that, no.
Second: Aggressively prune the product line, milk what cash cows you have, and use the savings to invest heavily (including increasing compensation for those who remain, so they feel it's worth staying to be part of something growing). The advantage of this is that you sidestep the deadweight problem you describe: it's too hard to root it out when you're in a crisis, so just do it at a macro level. I've never seen this really pulled off; the closest is late-90s Apple, when everybody knew they were in an existential crisis and there was enough emotional commitment to keep enough people on board (since they couldn't afford the "raise compensation" path right away).
Actually, a not-bad example is a company called Intel, when Grove cut away the cash cow, memory, and replaced it with the unproven CPU business because he could see memory rapidly becoming a commodity boat anchor. But that Intel was a much smaller company, with the founders still around, in an immature market. And it was a bet-the-company manoeuvre.
The prune-and-milk approach (mixed metaphor, I know) requires a compelling vision that includes an extreme sense of urgency. Elop's "burning platform" memo at Nokia tried to give the sense of urgency but had no compelling vision beyond "the company needs to make money in order to survive". Gelsinger did a good job of setting out a coherent vision, but is it compelling enough to get good people to join & stay? I doubt it.
(The reason for the "milk" part of the approach is that while you're in the shit, even if you're digging your way out, most of the public articles will read like the submission we're discussing, which makes execution harder.)
But in general, apart from a big pool of money and a legacy customer base, why would you try this with a big, cumbersome company when you could just do a startup? That's the model Cisco honed in the 90s and still uses: I think the company needs to invest in new tech X using approach Y, but that would threaten the revenues of some existing product line. You think tech X is great but approach Z is better. We both quit and start companies (sometimes Cisco even invests in later rounds). Eventually it turns out that one of us was right. That company gets bought (big bonus for the founders, who rejoin Cisco), and the founders of the one that didn't win can also get their jobs back (maybe through a small acquisition, so they get a small bonus for their risk-taking).
This lets them compensate innovators outside the normal process (which doesn't piss off the MBA VPs), allows some competing technological development, and keeps them big. I don't know if that process is as big and institutionalized as it was in the Chambers/Volpe days, but it was super smart.
In a small organization, people usually succeed by being motivated by the dream of success. In large organizations, people seem to succeed by fearing and avoiding failure.
I'd say for a big org the priority isn't "not fucking up", it's not objecting.
In a small growth org, despite severe rank differences, you're likely to have some impact on direction if you disagree. In a big org, there is no impact. You're either the wavemaker or one of those gleefully riding the waves. It doesn't matter how many "we listen to everybody's opinion" gestures there are in these big orgs; wavemakers simply keep a lot of key information to themselves, which puts everyone else at a huge disadvantage. This also causes innovative thinkers to propose ideas less, because they got burned that one time a wavemaker shut them down with previously hidden information.
> I'd say for a big org the priority isn't "not fucking up", it's not objecting.
At the organization's level it is "don't fuck up", since you're already making a lot of money and a lot of people's livelihoods depend on that. The trick, if you're managing in a big org, is to keep that from turning into a "don't disagree" culture before "not changing" goes from being a good thing to a bad thing.
I haven't figured out how to do that - or how to work under people who haven't either - so I prefer small orgs.
I wonder if the "don't fuck up" principle at an organizational level really applies past a certain point. I think that's where a lot of the internal dilemmas stem from. Take Google as an example: they can "fuck up" tremendously at an organizational level, with numerous failed products and services costing billions, yet they still roll on. And even interns are very well aware not to do something catastrophically detrimental, such as turning Google Search into a TikTok clone. There's just far too much momentum for fucking up to ever be a real concern once you're at Google's or Intel's level.
So perhaps the fear of fucking up plays an integral role in creating this atmosphere of stasis. When you're fearful, you're less likely to take risks in general, even if that fear is subconscious rather than active.
I was part of a startup that was gobbled up by Intel trying to innovate their business. Their culture was slow-moving and process-heavy, which makes sense for precise, high-volume manufacturing, but for software development it felt like wading through glue. I had a good impression of management in the group we got acquired into (itself a previous acquisition), but some political conflict with a different group led to our project getting shelved anyway. I quit last year as part of a general exodus.
I keep hearing Intel management saying the right things about what they're trying to do, but I think it will be a long slow process if it can work at all.
Or the acquisition was meant to take over or replace an existing product, those working on the existing product had no idea, and what follows is continued misalignment, pushback, and conflict, and years go by with nothing getting done.
Many people complain about good individual producers being promoted into roles where they're less effective, but few talk about effective front-line managers being moved to where they are less effective. In organizations it's often assumed that the best managers need to be near the top of the org chart to spread their effectiveness as far as possible, but I contend that these great managers might be better off near producers in critical areas.
Very competent people are pretty rare in large organizations, once you factor in their environment.
Do you put your one best-performing manager at the top or on your most impactful project? (I'm not sure about the answer, but I'm not sure the choice really exists either.)
True, though that's because Apple basically bought NeXT (its IP, OS, and top executives), and Jobs had learned a lot from the mistakes he'd made. Also, he was singularly driven and charismatic (like him or not).
Basically it's as if Intel bought AMD and switched, but AMD had some visionary CEO that could lead Intel to victory.
The days of Moore are over. Still, I used to hate Intel, but now I root for them. I guess I like underdogs.
I don't know how you can root for Intel. Until AMD made its comeback, Intel charged premium prices for very small performance improvements, generation after generation.
Intel is still the primary CPU in most laptops produced today, and only in the last 6-12 months have I started seeing some options to buy AMD-powered laptops.
And that's ignoring that Intel's vPro/ME implementation is still scarier (potentially more invasive) than the somewhat equivalent AMD "feature".
Yes, I am old enough to remember the years 1994/1995, when Intel introduced the split between the "Pentium Pro" CPUs, intended for "professionals" who were expected to pay dearly for the right to have computers that function without errors, and the "Pentium" CPUs (second-generation Pentium @ 90/100 MHz), intended for naive laymen whose computers are not important, so it does not matter if they make mistakes from time to time.
This market segmentation introduced by Intel broke the tradition created by IBM, which had taken care from the beginning to provide the IBM PC with error-detecting memory.
You are mostly right, but AMD laptops have been available for years. One of my few Windows systems is an HP Envy x360 laptop with an AMD 4500U, from mid-2020.
Any one discovery isn't going to suffice. Building a bunch of technologies to make that discovery usable and profitable is hard and long work.
We've known for years that carbon nanotubes can make great, very compact, very fast transistors. There's nothing close to a practical application of this discovery, because we barely know how to make nanotubes even in tiny amounts, in a handful of labs, at great expense.
Have you? There were plenty of people predicting Moore’s law would end in x years (e.g. https://www.technologyreview.com/2000/05/01/236362/the-end-o..., from 2000), and some people (including Moore himself) argued that progression had slowed a bit, but I don’t remember anybody saying the law already had ended at any time.
The graph of CPU density, clock speed, etc. has a kink in it from a decade or two ago. So, yeah, at least what we thought of as "Moore's Law" is dead, and has been for a while.
Dennard scaling broke down around 2005 or 2006.
Are we one discovery away? No, probably not. Leakage current isn't a "one trick" problem, especially as the features continue to get smaller. It's a fundamentally hard problem, and as you go to smaller feature sizes, it keeps getting harder. As we go to 3D approaches, thermal issues probably aren't "one trick" away either.
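For context, and purely as my own gloss rather than anything the comments above spell out: classical Dennard scaling says dynamic power per transistor goes roughly as

    P \approx C V^2 f

and under ideal constant-field scaling by a factor k (dimensions and voltage shrink by 1/k, frequency rises by k),

    C \to C/k, \quad V \to V/k, \quad f \to k f \quad \Longrightarrow \quad P \to P/k^2

which matches the 1/k^2 area shrink and keeps power density constant. Leakage current does not follow that scaling, which is why the relation stopped holding in the mid-2000s and why the thermal problem keeps getting harder as features shrink.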
If you go read what his article actually said, it's very clear that it has incontestably ended. Still, that doesn't stop a bunch of people who have no idea what Moore's Law was about from claiming whatever they want.
Anyway, where did you hear Moore's Law ended 30 years ago? That one doesn't make sense.
The most superficial and literal reading of Moore's Law is "the number of transistors on the most economical package doubles every X months" where X changed a bit over the decades.
Not only has the number of transistors on the largest package not followed an exponential for a decade (a good hint is that people denying this restate the law with a different X on every single iteration), but as fabs adopt more and more complex processes, the number of transistors on the most economical package has been growing very slowly, with no prospect of a doubling any time soon.
On a deeper reading, Moore's paper is all about the economics of semiconductor fabrication. And not only are companies that aren't on the most advanced process no longer failing the way they used to, but nobody uses the law to size investments anymore.
There is just no way to read Moore's Law and interpret it as something that still holds.
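To make the "doubles every X months" reading concrete, here's a minimal sketch (the function name and the baseline figure are mine, purely illustrative): it projects a hypothetical transistor count forward under a few assumed doubling periods, i.e. the exponential that actual counts stopped tracking about a decade ago.

    # Illustrative only: project a hypothetical transistor count forward
    # under the "doubles every X months" reading of Moore's Law.
    def projected_transistors(baseline, months_elapsed, doubling_months):
        """Count after months_elapsed, doubling every doubling_months."""
        return baseline * 2 ** (months_elapsed / doubling_months)

    baseline = 1e9  # hypothetical 1-billion-transistor chip as a starting point
    for x in (18, 24, 36):  # assumed values of X, in months
        growth = projected_transistors(baseline, 120, x) / baseline
        print(f"X = {x} months -> {growth:.0f}x over a decade")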
> People were saying it was going to end, not that it already ended.
My read is that people who like to "buy American" are often concerned with where the product is manufactured, as opposed to where the company is headquartered or where the product was designed. On that score, AMD is quite a bit less American-made than Intel, simply due to being fabricated in Taiwan and assembled in Malaysia (as far as I can find).
1. It talks about a 45 percent decline in stock value, when AMD/Nvidia declined by almost 60 percent in the same period.
2. It talks about Nvidia revenue growing by 50 percent in Q3, when in reality it fell 17 percent versus the prior year. AMD's revenue fell too. Perhaps the author is confused because Nvidia's earnings calendar runs a year ahead, and they recently reported Q3 2023 results, not 2022.
If you actually look into the details, Nvidia's revenue growth was lower due to their gaming/mining GPU sales seeing a major drop from 2021. This was obviously a temporary blip after the unusual pandemic-era spike. Meanwhile, Nvidia's datacenter revenue is still seeing a large amount of growth. Intel's woes are not based on temporary blips, and they are certainly not still seeing rapidly accelerating growth like Nvidia.
I quit Intel after working there for a decade. The open source work they did (still do in some cases) was great and I loved it.
I couldn’t stand supporting products that would come out “whenever” and would be only marginally improved, or not at all. 14nm++++++ was a joke. Our customers hated us so much; it was obvious on calls.
I got the feeling that Pat G cares, but I think fixing Intel is like changing the orbit of a planet.
That said, all my managers were awesome and the people were great. Sadly, having so much experience at Intel is almost a red flag on my resume.
Wow, this sounds like jingoistic, revisionist history. I don't claim to know the details but I certainly don't trust this article to be objective.
One alternate theory I've heard is that Intel bet big on either keeping the feature size where it was (7nm?) or on a process that was less speculative and risky than the one that ultimately made sub-7nm chips a reality. In other words, they failed to innovate.
The demand for chips has only increased, even if the article is suggesting that "PC sales are down". Moore's law has kept pace even if CPUs have been stagnant (hint: s/C/G/).
Intel's problems have been much greater than "failing to innovate".
For many years, it looks like Intel's CPU design teams were designing for imaginary manufacturing processes that were never implemented.
The Cannon Lake and Ice Lake CPU generations must have been designed for such imaginary processes, because if the parameters of the processes actually used for their fabrication had been known in advance, those projects would have been canceled years before completion, saving a lot of money (the first two versions of the Intel 10 nm process were much worse than the old 14 nm process, so they should never have been used in commercial products, and no manufacturing plants should have been converted to them).
Only with the manufacturing process used for Tiger Lake, starting in Q4 2020, did Intel restore the appearance of being able to predict how their designs would work.
During the years 2015 to 2019, when Cannon Lake and Ice Lake were being designed, someone among those in charge of developing Intel's manufacturing processes must have continuously lied to the other divisions about what parameters would hold for their future manufacturing process.
Some of those lies were also told in public, because Intel published a few presentations about their 10 nm process, which supposedly worked perfectly.
There is a very large difference between simply being unable to find a way to improve a manufacturing process and claiming that next year's process will have a certain set of improved parameters while knowing that you do not have the foggiest idea how such parameters might be achieved, thus making the company spend many billions developing products based on the fake set of parameters and converting manufacturing plants to implement the fake processes.
I wonder if it will ever be known who was guilty of Intel's problems, because they caused losses of many billions, while elsewhere someone who causes losses of a few thousand might be fired immediately.
There’s another side to the Intel 10nm process debacle. Intel designers and architects have relied on a process and manufacturing advantage for years to overwhelm any problems they had. One of the consequences is the death of any kind of post-mortem accountability: if your product is guaranteed to make billions of dollars no matter what, why point any fingers? Management basked in the money stream. Engineers who toed the line were promoted up. Dissenters were beaten down.
This toxic environment might still be going on now if Intel had maintained their process advantage… the 10nm debacle exposed the design/arch teams: their people-manager emperors had no clothes.
I still can't believe we don't have an explanation of why they can't match Samsung and TSMC in node size despite all 3 using ASML machines to do the lithography.
Outside of the lithography machines, which are all the same, what is Intel doing wrong in terms of industrial process? Is it the pellicle, which I know does vary by manufacturer? Why haven't they adapted by now?
What explanation? Lithography is only a small part of the whole foundry process. It's not as though Samsung and TSMC are the same, either.
Outside of TSMC, Samsung still hasn't perfected their node on EUV. And Intel is still on the waiting list, skipping right ahead to High-NA EUV, with their N4 and N5 on a very short timeline.
People often think ASML is like a convenience store where you could go and buy one right now. In fact, even if you have the money approved today (which in itself takes a long time in a large company), you will still be on the waiting list for two years or longer. You will then need another year for them to ship enough machines for you to ship your product.
The equipment is a small part of the fab process; the materials and process steps are critical and every fab is a little different. For example, Intel has been using cobalt wires while others use copper; this may or may not have contributed to Intel's problems.
I do understand that, but I still want to know some specific examples.
I'm expecting stories to come out at some point about the dysfunction that's preventing Intel from simply implementing processes similar to their competitors'. And yes, I understand the competitors probably have trade secrets, but Intel has been at least a step behind in process for so long now: 14nm when others had 7nm, and just starting 7nm when others have 5nm. Hell, there's already some 3nm TSMC stuff coming. The whole ++++++++ debacle was ridiculous.
At this point there has to be some major dysfunction. Are there senior people there with pet projects and adamant viewpoints that can't change? I'm expecting something along the lines of an exec insisting that cobalt conductors and liquid pellicles are the only way, despite competitors kicking their ass by doing it differently. And if there are such stories to be told, I'm rather shocked that no one at Intel is addressing the issues.
There's probably a book to be written here, similar to the stories told about Atari and Apple dysfunctions back in the day.
They spent years being 1-2 process nodes ahead of everyone else with excellent execution. Anyone within the organization trying to sound alarm bells would have been silenced or ignored... after all, they'd pulled the rabbit out of the hat so many times before; why would this time be different?
As for specifics I doubt we'll get many. Chip manufacturing is famously super-secretive and has a culture of not discussing the details - even for old process nodes.
Side tangent: what about that revenue graph where Singapore is second? I can see China being number one just through size, but is every Singaporean buying ten PCs a year? Or is there some huge OEM based in Singapore that's just evaded my mind?