Before anyone says that Moore's law is dead, remember that transistors are still getting smaller and denser (though also more expensive now, which breaks the other part of his prediction: that cost per transistor would keep falling).
In every thread people confuse Moore's law with Dennard scaling, which says roughly that as transistors get smaller their power density stays constant, so total power use stays in proportion to area; both voltage and current scale down with length.
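For the curious, the ideal Dennard scaling rules are easy to sketch as a toy calculation (the scale factor k = 1.4 per generation is just an illustrative number, not tied to any real node):

```python
# Toy sketch of ideal Dennard scaling with linear scale factor k.
k = 1.4  # each generation shrinks linear dimensions by 1/k (illustrative)

voltage = 1 / k          # supply voltage scales down with length
current = 1 / k          # drive current scales down with length
area    = (1 / k) ** 2   # per-transistor area shrinks as length squared

power         = voltage * current  # per-transistor power ~ 1/k^2
power_density = power / area       # ~ 1, i.e. constant

print(f"power per transistor: {power:.2f}x, area: {area:.2f}x, "
      f"power density: {power_density:.2f}x")
# power per transistor: 0.51x, area: 0.51x, power density: 1.00x
```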
Actually Moore's law is dead. His prediction was not that transistors would get smaller. His initial prediction was that, for 10 years, density would double every year. Later he revised it to doubling every 18 months.
And this held longer than even he thought. But it has ended. While things still get denser, it's no longer on that same trend.
The prediction wasn't that density would double. It was that "number of transistors on a chip" would double every year, and it was basically just marketing. That actually held up somewhat longer than the density doubling did, since chip area grew to fill the gap.
Smaller transistors are one way to get more transistors per chip. Chips could also get bigger, or the transistors could be packed in more densely (my VLSI is rusty, but not all transistors are minimally sized; a wider transistor can be used to drive more current, for example).
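Back-of-the-envelope with made-up numbers (a hypothetical 100 million transistors/mm² on a 100 mm² die), the two levers give the same result:

```python
# Transistors per chip = density * area; either lever doubles the count.
# The numbers here are hypothetical, purely for illustration.
density = 100e6  # transistors per mm^2
area    = 100    # mm^2

base       = density * area        # baseline chip
bigger_die = density * (area * 2)  # lever 1: double the die area
# Lever 2: shrink linear dimensions by 1/sqrt(2) (~0.71x); area per
# transistor scales with length squared, so density doubles.
denser     = (density * 2) * area

print(f"base: {base:.0e}, bigger die: {bigger_die:.0e}, denser: {denser:.0e}")
# base: 1e+10, bigger die: 2e+10, denser: 2e+10
```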
Chips are not really getting bigger. It's just that very large 'chips' are multiple chiplets packaged into one bigger part. And that's not what Moore was talking about.
I’m just saying it is possible to follow Moore’s law. Physics didn’t limit us, engineering and economics did. Multi-chip packaging is a fantastic response to the engineering/economic problem of low yields.
And how sustainable is it to get chips with more transistors by making chips bigger or packing transistors more densely, without making them smaller?
This is more of an engineering question than a physics one I think.
GPU dies are quite large compared to CPU, ballpark 2X to 3X, so I guess we could get a generation to generation and a half of transistor doubling out of increasing the size of the chips at least.
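Back-of-the-envelope, using those ballpark ratios rather than real die sizes, the equivalent number of transistor doublings is just log2 of the area ratio:

```python
import math

# "Doublings" gained by growing die area at constant density.
# The 2x and 3x ratios are the ballpark guesses above, not measurements.
for ratio in (2, 3):
    print(f"{ratio}x die area ~ {math.log2(ratio):.2f} doublings")
# 2x die area ~ 1.00 doublings
# 3x die area ~ 1.58 doublings
```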
But the question is, do workloads benefit from all these transistors? Server workloads, I guess so, you can get some pretty big Xeons, or lots of transistors per package with AMD and multi-chip packaging. Consumer workloads, I think the issue is more that consumers don’t have these massively parallel workloads that servers and workstations have. Consumers have stated their preference by not paying for more than a handful of cores per chip.
The obvious way to get more transistors per core is fewer, bigger cores per chip. But changing the cores is a big engineering project. And yield issues are already a big problem, which would only become worse with more complex circuits. But until we end up with one core per chip, I think we have hit an engineering/economics problem rather than a physics one.
And the engineering/economics problem is a really big one. I mean, AMD has had great success by going in the opposite direction: breaking things up further with multi-chip packages.
> But the question is, do workloads benefit from all these transistors?
That's not the question at all; this was about transistors getting smaller. Now you're way out in left field talking about how GPU chips being bigger than CPUs somehow means that transistors don't need to get smaller, plus benefits, workloads, cores per chip, and a lot of other irrelevant goalpost shifting.
All I said was that transistors are still shrinking and density is increasing, I don't know where all this other stuff is coming from.
Moore himself said (mentioned in the link) that the “law” was just supposed to convey that the way to cheaper electronics at the time was to focus on transistors; I don’t think he ever meant it to be a law of physics that stays true 50 or 100 years later. In fact he said it would be true for 10 years.
This is interesting, I never knew this. Is there somewhere I can read more on his thoughts? I was under the impression (naively so) that this was some invariant that had been somehow proven to an extent.
I searched on "Moore's law is not a law" and found [1], "Moore’s Law Is Not Really A Law But It Is Still Obeyed", from someone who was a semiconductor engineer at Intel 40 years ago. As he says, Moore's law was more a roadmap than a law. Companies strove to keep it going for fear their competitors would succeed while they failed; sort of a self-fulfilling prophecy.
Even if Moore’s law is dead, it lived way longer than any law claiming a doubling every two years has a right to. Compound annual growth of 41% per year for 40+ years is astonishing.
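(That 41% is just the annualized rate implied by a two-year doubling:)

```python
# Annual growth rate implied by "doubling every two years", plus the
# cumulative factor over 40 years.
years_per_doubling = 2
cagr  = 2 ** (1 / years_per_doubling) - 1   # ~0.414
total = 2 ** (40 / years_per_doubling)      # 2^20

print(f"CAGR: {cagr:.1%}, growth over 40 years: {total:.0e}x")
# CAGR: 41.4%, growth over 40 years: 1e+06x
```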
I think the spirit of Moore's Law (the context in which it is most often used) is better referred to as the Law of Accelerating Returns [1] [2] (e.g. by Ray Kurzweil). Moore's Law is a sub-section (in time and technology) of the Law of Accelerating Returns.
Naw, this is from a quarter century after Moore’s predictions, and Moore’s law had already started slowing down by then. Kurzweil was also selling snake oil (immortality) here, and the proof is in the final figure in the article, it’s 100% misleading and has left me with mistrust for all the rest of what Kurzweil said. Why did he leave out all the population data between 1920 and 2000, data which we have available at our fingertips? Because then he couldn’t have drawn a completely fabricated plot showing continuous lifespan acceleration, acceleration that doesn’t exist. Why does he conflate life expectancy with longevity to imply to the reader that genetics are improving as opposed to, say, the truth about poverty, wars, sanitation, and medicine? Because then it wouldn’t tell an amazing sounding (but completely untrue) story about humans being able to live longer and see the “singularity”. This article proved to me that Kurzweil is a charlatan, which is disappointing because I loved his keynote at Siggraph right around this time where he was pitching these ideas.
Really? I thought the exponential growth of calculations per second per dollar was rather well-established at least from 1900 through 2010 [1], a period of time that is much broader than Moore's Law?
Yes really, there is no law of accelerating returns as far as I’m concerned. Kurzweil clearly tried to opportunistically generalize Moore’s law and take credit for what he thought was a bigger idea, but sadly doesn’t produce a compelling argument, given that 1) Moore’s law died a decade ago, it no longer holds, and cost of calculations per second is no longer decreasing exponentially, and 2) Kurzweil claimed that things like US GDP and human life expectancy are evidence of his so-called law, and these claims are clearly bullshit. And not just wrong but intentionally misleading, that’s the part that bothers me. Did you look at the final figure? It’s not possible to make that diagram or write the accompanying text without knowing you’re lying. How about the first or second ones, wtf even are they? The majority of plots in Kurzweil’s piece, some 20-odd figures, are direct byproducts of Moore’s law from after Moore observed/predicted it, and nothing more.
In case you missed it, my point is not that flops didn’t increase exponentially for most of the last century, it’s that Kurzweil wasn’t the first to observe it, and didn’t add anything rigorous to what Moore & others had already done. Kurzweil only made a mockery of it by fabricating some of the data and using the larger argument to claim we’re headed toward immortality, joining an illustrious list of people who get attention by hawking the fountain of youth.
BTW feel free to re-read Kurzweil’s predictions about computation growth in the article and ask yourself whether they’re true. One of his scheduled predictions is up this year. He’s off by a few orders of magnitude according to his own definitions, and furthermore his definitions are off by a few more orders of magnitude according to other researchers, and on top of that his claim that enough flops equal human brain capability are, to date, false. These things put the remainder of his predictions really far behind schedule, and possibly slipping at an exponential rate, which kinda undermines the whole point, right?
Moore's Law has been dead for a while. The most common confusion I see is that people don't understand he was referring to economical scaling. These days we're getting more density and bigger chips but prices are going up a lot too. More for more instead of more for less.
Dennard scaling died two decades ago. Moore's law has been dead for less than a decade. It took 4 years for density to double (from N7 to N3), and the cost doubled along with the density. It will only get worse from here.
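For comparison, here's the annual density growth implied by different doubling periods (a yearly cadence, the classic two-year cadence, and the four-year N7-to-N3 pace above):

```python
# Annual growth rate implied by a given doubling period, in years.
for years in (1, 2, 4):
    rate = 2 ** (1 / years) - 1
    print(f"doubling every {years}y ~ {rate:.0%}/year")
# doubling every 1y ~ 100%/year
# doubling every 2y ~ 41%/year
# doubling every 4y ~ 19%/year
```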