Gordon Moore has died (moore.org)
2594 points by mepian on March 25, 2023 | 234 comments



I have never had any brush with the giant, but I have my own little story to share. Way back in 1998, I was a newly minted engineer in India and was visiting Bangalore. While roaming around on MG Road, I picked up a book named “Inside Intel”. It is through that book that I learned about Gordon Moore. On the 48-hour train journey back to Delhi, I devoured the book. It left an impression and I was in awe. It also ignited something in me. I wanted to become a better engineer, so I bought my first computer soon after. I wanted to be in the “Silicon Valley”. As fate would have it, 3 years later, I landed there. Little did I know that this serendipity and random inspiration would weave my life story. I remain awestruck by the person and the impact that he left. RIP


There are small moments in life that can change us forever!


We are results of many accidents.


My dad told me it was just one.


Mine did the same before my mom whacked him and corrected that I was "a happy surprise" while glaring daggers at him.


Blossom book house <3


In 1999 on MG Road, it was more likely a roadside vendor with pirated books, Higginbotham, or, less likely, Gangarams. Blossom’s (Church St) wasn’t yet open IIRC.

In the late 90s and early 2000s, I recall that Inside Intel, Lee Iacocca's book, Who Moved My Cheese, The 7 Habits of Highly Effective People, Jack Welch's book, etc. were commonly sold, in addition to the latest novels. I even remember buying Sylvia Nasar's A Beautiful Mind (popular because of the movie then) from a roadside vendor.


wow, I'm just too young to have experienced all this :(


Happy memories of the book shops on MG Road. Last time I went, most of the Bangalore book shops were a shadow of their former selves.


On the contrary, the bookshop situation in Bengaluru is exceptional!

The old book shops have been disrupted. Sapna is one that has adapted well, with a publishing house, decent web-presence, and has grown too.

For the best technical books, you now go to Tata Book House (there may be others I'm unaware of). For novels and such, Blossom's on Church Street is one of the best. Then there are cafe-bookshop types like Crossword, etc.

Kannada and other language book stores are also thriving.


Ok thanks, I’ll try those you have suggested :)


How incredible is it, that I'm writing software on a laptop, connected to the internet over wifi, all while being 38,000 feet in the air. And all this enabled by billions of tiny transistors made from sand. And to make this all the more surreal, I'm sitting next to someone who knew him.

Thank you Gordon Moore. RIP.


> all while being 38,000 feet in the air

I was pushing code while on a transatlantic flight the other day and I got a little overwhelmed with just how cool it was. There I was in a metal tube staying aloft with 2 jet engines, and had wifi at 30k feet while traveling at ~600 mph.


And all of this... all of us... basically coalesced from star dust that will one day again be star dust :)


Maybe. Might just end up being a dense ball of nothing.


To be fair, "star dust" is not rigorously defined, and the cosmological definition of "dust" includes some pretty big chunks. So after the sun swallows the earth, there's a very good chance we'll qualify. :)


Also, we are more like patterns in the dust, given that most of the atoms in our bodies swap for new ones over a couple of years. The patterns may be able to continue separately from the dust.


At the cosmological scale, stars are dust.


I thought that at any scale, that they were atoms. Or quarks. Or quantum fields. Or whatever.


A dense ball of _everything_.


That's what we are, entropy generating star dust.


> I'm sitting next to someone who knew him

Is this person someone who is (or was) traveling with you, and therefore you knew that he/she knew Gordon Moore, or did you just get to know this person by random chance?

Given that he/she knew him, can you ask them to share some memories, and post them here?

RIP.


Random chance


Holy smokes!


Hacker Newsing above the clouds? What a world we live in…


Cloud computing!


I once got the opportunity to ask Gordon Moore a question, “what is the equivalent of Moore’s Law for software?” His response, “number of bugs doubles every year”. Great man.


This made me immediately think of "The Bug Count Also Rises" http://www.snider.com/jeff/hemingway.html and of course Bryan Cantrill's reading of it: https://youtu.be/4PaWFYm0kEw?t=1243


Since IIRC bug count is roughly proportional to source code line count, this implies the aggregate complexity of software doubles too. That might sound foreign, but if you think of duplicated effort and accidental complexity rather than problem-domain complexity, it wouldn't surprise me if it were close to true.

Points to the value of sticking close to the problem domain.


I heard error count is proportional to LoC + the number of developers involved in writing it. So, as well as the accidental complexity from LoC, the more people touch a code base, the greater the chance of broken communication and misunderstandings, i.e., bugs.
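A toy sketch of that rule of thumb in Python (the function name and coefficients are made up purely for illustration, not measured from any real project):

    # Hypothetical linear model: bugs ~ a * LoC + b * (extra developers).
    # Coefficients are invented for illustration only.
    def estimated_bugs(loc, developers, bugs_per_kloc=15, bugs_per_dev=3):
        """Defect rate per 1000 lines plus a communication-overhead term."""
        return bugs_per_kloc * (loc / 1000) + bugs_per_dev * max(developers - 1, 0)

    print(estimated_bugs(50_000, 1))   # 750.0 -- solo project
    print(estimated_bugs(50_000, 10))  # 777.0 -- same code, bigger team

Nothing rigorous; it just shows that in such a model the LoC term dominates unless the team is very large.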


I suspect lines of code and number of developers are strongly enough correlated that it's hard to get any additional predictive power from the latter!


I think that's true when the source is first written. But the # of devs only ever goes up during the maintenance lifecycle, while LoC may not change much.


Maybe. In my experience it's common for developers who were not part of the original effort to maintain by addition, because they are worried changing the existing code might break stuff. So with increased maintainer count the code size grows too.


I suspect the same holds true for hardware bugs in silicon, given the increasing complexity of CPUs. ;)


Maybe I'm naive, but I think hardware cannot afford such conditions. Even the name is a hint: software was termed so because it was easier to modify, allowing more errors to get in because you can fix them later at a lower cost.


Reading errata for Intel/AMD processors would be an eye opening experience then.

There are so many bugs in features you never heard of, just like software. It is inevitable given the complexity.


I'm sure there are tons of bugs; I was told a few stories by people in the aerospace industry. But I think they cannot survive if they don't keep the number ultra low, because they can't fix it (well, except at the microcode level).


They can't fix it, but compiler teams can.

It's a testament to those unsung heroes that most high-level language users can remain blissfully unaware of how much is automatically fixed by compilers in their emitted instructions.


Reminds me of the classic aerospace joke.

Q: In the age of Pentium, how do you pronounce IEEE-754? -- A: AAAAAAIIIIIIEEEEEEEEEE!


Reading errata for just the various STM32 microcontrollers was enough to disabuse me of the notion that hardware is infallible!


My personal favourite is from STM32F427:

"When PA12 is used as GPIO or alternate function in input or output mode, the data read from flash memory (bank2, upper 1MB) might be corrupted."

I can't even imagine the head-scratching one might go through if hit by this, particularly late in the development process (when the firmware becomes larger than 1MB).


Would fabrication failure be considered a bug? Isn't the rate kinda up there for chips that don't make it out of the factory? Not to mention on arrival there's a failure rate.

But also, I don't know too much about this, but when Linus Tech Tips did a tour of a chip fabrication plant, I remember learning that apparently the high-end chips are chips where all the little bits inside work, and if they detect busted wee transistors or whatever, they can just use that chip as a lower-tier processor. If that's the case, that's like... kinda a bug, but they just "code around it," right?


I'd say defect, not bug.


Yes, because those HW bugs are, at the origin, SW bugs.

I heard that the last processor where they still did the VLSI layout by hand was DEC's Alpha. Now everything comes from HDL synthesis, so bugs are common.


Establishing an equivalent law for firmware would be interesting too.


Wirth's law :)


I love that!

Thanks for sharing it.


I would like to propose "Grove's Axiom" which basically states:

If you're failing, pivot to what your replacement will do before you fail.

"In the midst of the doldrums of 1985, Grove posed a hypothetical question to his colleague Gordon Moore: “If we got kicked out and the board brought in a new CEO, what do you think he would do?” Moore answered without hesitation, “He would get us out of memories.” To which Grove responded, “Why shouldn’t you and I walk out the door, come back and do it ourselves?” [1]

[1] https://hbr.org/1996/11/inside-intel


This reminds me of the advice:

When deciding what to do, imagine that you are not talking to yourself but rather what you would tell someone else to do in your situation.

This simple mental shift apparently removes a lot of cognitive biases and emotional baggage that might be affecting your decision making process.


People ask me for advice and I immediately realize: why am I not following it myself? I have had multiple people come back years later and tell me that they would never have done X or achieved Y if it weren't for a discussion we had years earlier.

The analytical knowing mind has to drag all this other stuff around!


(Prompt engineering is a powerful technique with humans as well as LLMs!)


That's a pretty powerful axiom. I wonder how applicable it is to life in general.

Relationship is failing (and you want to save it): what will their next partner do differently? Do it now!

Worried you might be kicked off a sports team: what will the new player do better than you? Practice that!

Imposter syndrome has you worried your boss will fire you: what will the replacement, the "real" person do that you aren't? Start doing that today.


I would be careful applying this one to imposter syndrome. I think of imposter syndrome as a failure to calibrate your expectations of yourself with the reality of what your work requires. Asking yourself what the "real" person would do that you aren't doing seems to me like it would play into that miscalibration. You might be better off asking _someone else_ what your replacement would do.


I was a kid reading 'Popular Electronics' in the mid 70's in a rural Florida high school. I did an English class presentation on transistors, mentioning Moore's Law. My electronics never got past the Radio Shack 150-in-1 Electronics kit and I went into software. I feel like I owe my career there to the work of the geniuses like Moore who not only made something rare and special, they made it common, nearly universal.


Dad was an electrical automotive mechanic so I didn't have a choice there.

Did EE/CS and have all the standard electronic lab bits from a trinocular microscope w/ polarized ring light to the JBC knock-off soldering station. I keep meaning to buy an HP 54600B to get the Tetris easter egg.

Software and systems are where the money's at now (until the tech singularity ruins it), hardware is just for fun.


These kits were awesome, BTW. I had one too. I know you can find them still on Amazon, but I never got my kid, who's otherwise into computers, interested in them.


Wow, we have lost a true legend. As I get older, I marvel a bit at the giants I once shared time on the earth with. Sometimes it feels like we don't have giants in the 21st century in the same way.


I think it is hard to see giants when they are just innovators.

To everyone around them and especially themselves, they were just pushing the ball forward day-by-day; they weren't heroes or insanely prescient, they were just doing the next logical thing to do. It's only after they've climbed the mountain that you can appreciate what they did.

That's probably the hardest thing about innovation. To everyone who wants to innovate, they only see the mountain. They have the grand idea of where they will end up, when the reality of innovation is just pushing the ball forward one step at a time. It's not heroism or being a genius necessarily, it's persistence and being able to think on the fly.

I think "genius" knows where the general destination is but I feel like most aren't doing it for the destination, they are doing it because they feel it is a calling and what they must do.


I think this is tautological - this is how it must always be, and I would guess has been the common view at any given point of time in history.

You can only properly assess and appreciate the impact sufficient to assign 'legend' status to an individual or institution retroactively, decades after the actions that elevated them there.


Reminds me of this silly quote from The Office (US)

> I wish there was a way to know you were in the good old days before you actually left them.
> - Andy Bernard


Fortunately we have that opportunity today


do you mind elaborating?


it's always the good ole days for someone.

it's always 5pm somewhere for someone.


"It's always the good ole days for someone" is a little phrase I'm going to try to remember and think of from time to time. Thank you for this :-)


Not at all. Moore himself has been recognized and celebrated for decades.

But many other such figures achieved 'legendary' status while they were very much alive and in their prime: Edison, Ford, Bell, Wright Bros, Disney, Carnegie, Einstein, Oppenheimer, etc.

Moore himself was a member of an elite crew and one of Shockley's (the co-inventor of the transistor) "Traitorous Eight", who spun out from their former mentor to found Fairchild Semiconductor before all of them spun out from that enterprise to found their own companies and, in the process, create Silicon Valley proper.


I sometimes make myself a mental list of famous people who are expected to die soon. The following people will all die in the next few years: Jimmy Carter (98), Henry Kissinger (99), Dick Cheney (82), Pope Francis (86), Paul McCartney (80), John Williams (91), William Shatner (92), and Ali Khamenei (83).


Attenborough. That will be the end of our planet's conscience, symbolic of the era where we discovered how truly incredible a place we live in, and what we are doing to it.


Buzz Aldrin (93), although he might outlive all of us


There is a website for such predictions: https://deathlist.net/


Well, that's delightfully morbid.


Even in death Ringo will be in Paul McCartney's shadow haha.

A huge one for me is Mel Brooks, it is pretty rare for a celebrity death to have an impact on me but that one will.


Sadly Mick Jagger is on my list, his death would mark the end of an era.


Related:

"I think youngsters need to start thinking about the kind of world they are going to leave for me and Keith Richards (79)." - Willie Nelson (89)



Warren Buffett (92)


And Charlie Munger (99) is still somehow doing press junkets.


Gene Hackman (93)


Alan Alda.


And Mike Farrell.


Not to diminish your list, but notably they're all non-scientific; at a push, Kissinger. (One could see economics as a science.)

Point being: they only serve the generation that survives them. Moore will outlive us all, in fact and in legacy.

Edit: What I said isn't true; music, movies and the arts will outlive us. We have different lists.


Their impacts go beyond their generation by having shifted the world in some way.


> music, movies and arts will outlive us.

Will they? Who watches old movies?


I just watched Some Like It Hot (1959) a few weeks ago. Great comedy, looks phenomenal in 4K


I watch old ones, too, but always by myself. Nobody else wants to. It's always "eww, black and white!"

Often they're right. They'd be much more watchable with an AI that could sharpen the picture and colorize it.


Watched it with the family! Was a hit

Hard disagree on the colorization bit. I don't see how color would improve something like Psycho or Citizen Kane.


Nice!


I think it'd take more than picture quality to make classic films accessible. Acting styles and pacing especially have changed a ton since the b&w era. It's not to say they're not good, but they're a bit of an acquired taste.


A kid in this house has started an appreciation for the original Twilight Zone, and enjoyed Casablanca before that. Gotta start early I guess. TV is scarce in this house, which promotes reading and feeling lucky to see a great movie, no matter the era.


I don’t know. Sometimes poor contrast black and white film does really well for the atmosphere/vibe/whatever the hell you want to call it.


"Sometimes" is the right word. Usually, it's just inferior. B+W was used for artistic purposes only rarely, the rest of the time it was because it was cheap and easy.


Peter Parker, for one:

> Hey guys, you ever see that really old movie, The Empire Strikes Back?

As a kid I watched many films from my parents' generation. Mary Poppins (1964) or The Great Escape (1963) come to mind as being staples every Christmas -- The Great Escape on Boxing Day especially.

A generation before that people watched "It's a Wonderful Life" (1946) for much the same reason.

I've subjected my kids to a few old films from well before they were born, but they were films from my youth -- Bill and Ted, Back to the Future, Men in Black, Muppet Christmas Carol etc. And yes they've seen Star Wars, which was from before I was born (just), but that's an ongoing franchise.

I don't see them watching films from the 1960s like I did though, and I doubt they'll make their kids sit through films from the 90s.


> A generation before that people watched "It's a Wonderful Life" (1946) for much the same reason.

Some of us still watch it!

Of course the vast majority of old movies are long forgotten (and no great loss), just as the vast majority of what we're producing today will be. But there are a minority of classics that endure. Just as there's still a market for Shakespeare, Dickens, and Mozart, despite all the plays, books and music that have been written since.


Moore lived long enough to see Intel financialized, and its engineering crown taken by its rivals.

Only the paranoid survive, one of his colleagues once said, and I guess Intel stopped being sufficiently paranoid.


That “colleague” was Andy Grove, also a rather impressive person.

https://www.goodreads.com/work/quotes/664484-only-the-parano...

He got Parkinson’s and died 7 years ago. https://news.ycombinator.com/item?id=11333402


Intel is not dead, wounded perhaps. I would still be very surprised to see them disappear.


They'll survive due to geopolitics if nothing else; the US government has no choice but to make sure they stay viable, given the potential issues in Taiwan.


This. Anything happening to Taiwan would shake the IT and electronics industry severely. We are headed for a "chip COVID" event, in my opinion, as IT wasn't shaken enough by SARS-CoV-2 or Ukraine; I hope I am wrong. Intel's stock price would skyrocket then.


It takes time to understand who the true giants are. Robert Metcalfe just received the ACM Turing Award last week, and his contributions were from the '70s. It's the same with Nobel laureates; it takes years for your work to be recognized.


Not really, no. He may have just recently received an award (he's received many awards) but Metcalfe has been recognized for his achievements for decades. He got the ACM Grace Hopper award in 1980: https://en.m.wikipedia.org/wiki/Grace_Murray_Hopper_Award


In Antarctic exploration, there was a time called Heroic Age of Antarctic Exploration: https://en.wikipedia.org/wiki/Heroic_Age_of_Antarctic_Explor...

People attempted to use dog sleds to traverse up the Antarctic plateau, which is nearly the same altitude as Tibet, and reach the pole.

Many died in the process; their names are remembered (though they have slid out of public discourse, as it was over a century ago).

In our industry, fortunately, people don't die, but the trailblazers had to do a similar thing - they used the proverbial dog sleds to build a camp in the middle of the Antarctic plateau.


I mean, not yet. Right now in tech, we are subject to 38-year-olds careening through life, dying in their mid-50s.

Whatever legacy and mature rebranding they attempt won't be apparent until 2070.


Try to become a giant yourself - why not?


I've thought about that a lot. Right now, I think it's just not what drives me. I'm motivated by novelty and certain types of experiences. Not really by years-long projects trying to break new ground.

But my comment doesn't come out of a sense of desire to emulate. Just out of a sense that especially in the early-to-mid 20th century, we had these titanic characters: Einstein, Tesla, Von Neumann, The Beatles, and so forth.

In a lot of ways, I think it comes down to these folks existing in the beginnings of the information age, and how many advances were unlocked by the increase in communication speed.

That was probably a lot more groundbreaking than the digital revolution. For all the progress we've seen since I was born in the 80s, there's not a lot of things that have fundamentally changed in terms of human capability. Although, we may see in the next decade or two the same sort of paradigm shift.


A giant typically means famous. There are many giants whose names we will never know. For me, I'd rather die famous only to my friends and family and have changed the world in my silent way. I've been fortunate enough to do it a few times already, and I hope I'll get another chance before I die, as most people never get the first chance. That's, I think, a better goal - when a chance comes to change everything, jump into it head first. But don't bother trying to be famous. It's meaningless.

Regardless, I bid Dr Moore a fond farewell. He lived to see the horizon of his law.


> There are many giants that we will never know the name of.

And sadly, there are millions living out their lives in poorer parts of the world who might have become giants, if only they'd had access to nutrition, good parenting, education, connections, etc.


His law was probably self-fulfilling. In some ways it was more like an actual law than the laws of nature.


> Sometimes it feels like we don't have giants in the 21st century in the same way.

Somehow it feels like we'll never have legends in an age when GPT-4/5/... are accessible to everyone.


When you zoom out on the timeline of humanity, this man, among others, will have left a huge spike in our technological progress. When I look back at the last 50 years and compare it to all known previous human history, I'm quite humbled to be alive during this timeline with them, despite some things I also greatly abhor.


Wow just yesterday a few days ago I watched his Oral History:

https://www.youtube.com/watch?v=gtcLzokagAw

What I learned is that I had no idea how semiconductors are actually made and how they made them early on.

It's a really interesting interview!


>yesterday a few days ago

i like this. keeps people guessing as to why it was phrased as such. makes it more mysterious


Dyslexia is my superpower


I parse it as:

Yesterday, it was a few days ago.

Today, it was many days ago :)


this was my interp as well


Typing in small textboxes always makes me feel stupid.


For reference, to anyone who doesn't know: he is the Moore of Moore's law:

https://en.wikipedia.org/wiki/Moore%27s_law

We lost a visionary and a pioneer. I like to think we have people who not only think like he did, but who can get the message across.


Moore's law lives on in Kurzweil's law (in spirit).


No it doesn’t - take that singularity bullshit elsewhere. Gordon Moore never aligned himself with that pseudoscientific nonsense https://cs.stanford.edu/people/eroberts/cs181/projects/2010-...


Can you please make your substantive points without name-calling? It's not hard, and we're trying for something a bit different from internet default here.

https://news.ycombinator.com/newsguidelines.html


Was there an edit? I don't see any name-calling in GP.


"That singularity bullshit" and "that pseudoscientific nonsense" count as name-calling in the sense that https://news.ycombinator.com/newsguidelines.html uses the term:

"When disagreeing, please reply to the argument instead of calling names. 'That is idiotic; 1 + 1 is 2, not 3' can be shortened to '1 + 1 is 2, not 3."

(a bit more explanation if anyone cares: aggressive pejoratives tend to poison curious conversation and are easy to cut, so this isn't a hard one. I think mostly people write like that online because it's how many of us would talk in person with people we're friendly with—and there it's totally normal, because there's already a 'contract' that takes care of understanding each other's good intention. That's absent online, so many commenters are on a hair trigger with this kind of thing and will react in ways that take the conversation to less interesting, more conflictual places.)


Though the basic point that computers keep increasing in processing power still seems to hold, even if it's not just by having more transistors per area.


I don't understand what you linked. Don't they believe in the technological singularity?

I would think an intelligence capable of self-modification could use the scientific method to increase its performance (even if it means scaling in size like a Matrioshka brain).


> Don't they believe in technological singularity?

> In fact, in a 2005 interview, Moore personally stated that his prediction of exponential growth would not hold indefinitely, simply due to "the nature of exponentials."

No.


What a respect-worthy man, and a lucky man to have been born at the beginning of an age of exponential innovation. Also, from all that I've read about him, he gave back to the world in equal measure and was the best of a past generation of Silicon Valley entrepreneurs.

For anyone who has the interest/time and lives in the Bay Area, watch this PBS program for the history of the "Fairchildren" and the roots of Silicon Valley (Gordon Moore was one of the "traitorous eight" who contributed to making SV what it is today). If you live, drive, or work around the area, there are so many landmarks and legacies of this phase of history that you can still see today (but only if you keep an eye out and take a little effort to learn the past).

https://www.pbs.org/wgbh/americanexperience/films/silicon/#f...

(links to Amazon Prime, Apple TV there, and you can also find this on your favorite BitTorrent source...)


Weird how this works sometimes… I was just at the Monterey Bay Aquarium for the first time today, saw the name on the auditorium and thought “that Gordon (and Betty) Moore?”

Truly an incredible place, and thanks in very large part to the Moores. RIP Gordon.


The Monterey Bay Aquarium was founded by two daughters of David Packard of Hewlett-Packard fame (along with Stanford professor Chuck Baxter who died last year.) Deep tech roots there.


Wow this has made me realise that the maths library at Cambridge was also sponsored by them. Never really gave a thought to it until now, just assumed they were some random rich alumni!



RIP. As an ex-Intel employee, this one stings. He was a godly figure internally, and even during Intel's deteriorating years, engineers and technologists looked up to him as a constant north star of inspiration and engineering culture. Hope his demise gives a long pause to Intel's leadership and brings back some of the good parts of the engineering culture from the 80's and 90's.


Before anyone says that Moore's law is dead, remember that transistors are still getting smaller and more dense (not to mention more expensive, which was another part of his prediction).

In every thread people confuse Moore's law with Dennard scaling (which states roughly that, as transistors get smaller, their power density stays constant, so that the power use stays in proportion with area; both voltage and current scale (downward) with length.)
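A minimal back-of-the-envelope sketch of the distinction, assuming the classic idealized Dennard rules (voltage and current both scale with the linear dimension; the function and numbers below are illustrative only):

    # Moore's law concerns transistor count/density; Dennard scaling concerns
    # what happens to power as dimensions shrink (under idealized assumptions).
    def scale_node(density, voltage, current, k):
        """Shrink linear dimensions by a factor k (k > 1)."""
        new_density = density * k ** 2                         # more transistors per area
        power_per_transistor = (voltage / k) * (current / k)   # P = V * I, ~1/k^2
        power_density = new_density * power_per_transistor     # stays ~constant
        return new_density, power_density

    d, p = scale_node(density=1.0, voltage=1.0, current=1.0, k=1.4)
    print(d, p)  # ~1.96 (density nearly doubles), ~1.0 (power density unchanged)

Moore's law is only about the first number; Dennard scaling is the claim that the second one stays flat, and that is the part that broke down first.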


Actually, Moore's law is dead. His prediction was not that transistors would get smaller. His initial prediction was that, for 10 years, density would double every year; he later revised it to doubling every 18 months.

And this held longer than even he thought. But it has ended. While things still get denser, it's no longer on that same trend.

This interview with Moore explains it:

https://www.youtube.com/watch?v=gtcLzokagAw

History by Cantrill:

https://www.youtube.com/watch?v=TM9h89Vo_Qo

David Patterson perspective:

https://www.youtube.com/watch?v=kFT54hO1X8M

Opposing point of view Jim Keller:

https://www.youtube.com/watch?v=oIG9ztQw2Gc

Patterson response to Keller:

https://www.youtube.com/watch?v=Qz0eJA1TP3Y

There are talks by David Patterson, Jim Keller and


The prediction wasn't that density would double. It was that "number of transistors on a chip" would double every year, and it was basically just marketing. That actually held up somewhat longer than the density doubling did, since chip area grew to fill the gap.


> density would double every year

I'm not a physicist, but I think they need to get smaller to do that.


Smaller transistors is one way to get more transistors per chip. Chips could also get bigger, or the transistors could be packed in more densely (my VLSI is rusty, but not all transistors are minimally sized; a wider transistor can be used to push more power for example).


Chips are not really getting bigger. It's just that very large 'chips' are multiple chiplets packaged into one bigger part. And that's not what Moore was talking about.


GPU dies are pretty big, right?

I’m just saying it is possible to follow Moore’s law. Physics didn’t limit us, engineering and economics did. Multi-chip packaging is a fantastic response to the engineering/economic problem of low yields.


And how sustainable is making chips bigger or packing transistors more densely without making them smaller as a solution to getting chips with more transistors?


This is more of an engineering question than a physics one I think.

GPU dies are quite large compared to CPU, ballpark 2X to 3X, so I guess we could get a generation to generation and a half of transistor doubling out of increasing the size of the chips at least.

But the question is, do workloads benefit from all these transistors? Server workloads, I guess so, you can get some pretty big Xeons, or lots of transistors per package with AMD and multi-chip packaging. Consumer workloads, I think the issue is more that consumers don’t have these massively parallel workloads that servers and workstations have. Consumers have stated their preference by not paying for more than a handful of cores per chip.

The obvious way to get more transistors per core is fewer, bigger cores per chip. But changing the cores is a big engineering project. And yield issues are already a big problem, which would only become worse with more complex circuits. But until we end up with one core per chip, I think we have hit an engineering/economics problem rather than a physics one.

And the engineering/economics problem is a really big one, I mean AMD has had great success by going in the opposite direction; breaking things up further with multi-chip packages.


> But the question is, do workloads benefit from all these transistors?

That's not the question at all; this was about transistors getting smaller. Now you're way out in left field, talking about how GPU chips being bigger than CPUs somehow means that transistors don't need to get smaller, plus benefits, workloads, cores per chip, and a lot of other irrelevant goal-post shifting.

All I said was that transistors are still shrinking and density is increasing, I don't know where all this other stuff is coming from.


They haven't been doing so at that rate for a very long time.


Moore himself said (mentioned in the link) that the "law" was just supposed to convey that the way to cheaper electronics at the time was to focus on transistors. I don't think he ever meant it was supposed to be a law of physics that stays true 50 or 100 years later. In fact, he said it would be true for 10 years.


This is interesting, I never knew this. Is there somewhere I can read more on his thoughts? I was under the impression (naively so) that this was some invariant that had been somehow proven to an extent.


I searched on "Moore's law is not a law" and found [1], "Moore’s Law Is Not Really A Law But It Is Still Obeyed", from someone who was a semiconductor engineer at Intel 40 years ago. As he says, Moore's law was more a roadmap than a law. Companies strove to keep it going for fear their competitors would succeed while they failed; sort of a self-fulfilling prophecy.

[1] https://medium.datadriveninvestor.com/moores-law-is-not-real...


Even if Moore's law is dead, it lived way longer than any law claiming a doubling every two years has a right to. Compound annual growth of 41% per year for 40+ years is astonishing.
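A quick sanity check of that figure:

    # Doubling every 2 years, expressed as an annual rate, and what 40 years of it buys.
    growth_per_year = 2 ** (1 / 2) - 1        # ~0.414, i.e. ~41% per year
    factor_over_40_years = 2 ** (40 / 2)      # 2^20
    print(f"{growth_per_year:.1%}")           # 41.4%
    print(f"{factor_over_40_years:,.0f}x")    # 1,048,576x

Doubling every two years works out to about 41% per year, and forty years of it is roughly a millionfold increase.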


I think the spirit of Moore's Law (the context in which it is most often used) is better referred to as the Law of Accelerating Returns [1] [2] (e.g. by Ray Kurzweil). Moore's Law is a sub-section (in time and technology) of the Law of Accelerating Returns.

[1] https://www.kurzweilai.net/the-law-of-accelerating-returns

[2] https://link.springer.com/chapter/10.1007/978-3-662-05642-4_...


Naw, this is from a quarter century after Moore’s predictions, and Moore’s law had already started slowing down by then. Kurzweil was also selling snake oil (immortality) here, and the proof is in the final figure in the article, it’s 100% misleading and has left me with mistrust for all the rest of what Kurzweil said. Why did he leave out all the population data between 1920 and 2000, data which we have available at our fingertips? Because then he couldn’t have drawn a completely fabricated plot showing continuous lifespan acceleration, acceleration that doesn’t exist. Why does he conflate life expectancy with longevity to imply to the reader that genetics are improving as opposed to, say, the truth about poverty, wars, sanitation, and medicine? Because then it wouldn’t tell an amazing sounding (but completely untrue) story about humans being able to live longer and see the “singularity”. This article proved to me that Kurzweil is a charlatan, which is disappointing because I loved his keynote at Siggraph right around this time where he was pitching these ideas.


Really? I thought the exponential growth of calculations per second per dollar was rather well-established at least from 1900 through 2010 [1], a period of time that is much broader than Moore's Law?

[1] https://miro.medium.com/v2/resize:fit:1400/1*vlmZ_Y3a-PGsK0S...


Yes really, there is no law of accelerating returns as far as I’m concerned. Kurzweil clearly tried to opportunistically generalize Moore’s law and take credit for what he thought was a bigger idea, but sadly doesn’t produce a compelling argument, given that 1) Moore’s law died a decade ago, it no longer holds, and cost of calculations per second is no longer decreasing exponentially, and 2) Kurzweil claimed that things like US GDP and human life expectancy are evidence of his so-called law, and these claims are clearly bullshit. And not just wrong but intentionally misleading, that’s the part that bothers me. Did you look at the final figure? It’s not possible to make that diagram or write the accompanying text without knowing you’re lying. How about the first or second ones, wtf even are they? The majority of plots in Kurzweil’s piece, some 20-odd figures, are direct byproducts of Moore’s law from after Moore observed/predicted it, and nothing more.

In case you missed it, my point is not that flops didn't increase exponentially for most of the last century; it's that Kurzweil wasn't the first to observe it, and he didn't add anything rigorous to what Moore & others had already done. Kurzweil only made a mockery of it by fabricating some of the data and using the larger argument to claim we're headed toward immortality, joining an illustrious list of people who get attention by hawking the fountain of youth.

BTW feel free to re-read Kurzweil’s predictions about computation growth in the article and ask yourself whether they’re true. One of his scheduled predictions is up this year. He’s off by a few orders of magnitude according to his own definitions, and furthermore his definitions are off by a few more orders of magnitude according to other researchers, and on top of that his claim that enough flops equal human brain capability are, to date, false. These things put the remainder of his predictions really far behind schedule, and possibly slipping at an exponential rate, which kinda undermines the whole point, right?


Moore's Law has been dead for a while. The most common confusion I see is that people don't understand he was referring to the economics of scaling. These days we're getting more density and bigger chips, but prices are going up a lot too. More for more instead of more for less.


Prices of the chips may be rising, but power usage is falling and in the end, that's often the bigger cost.


Dennard scaling died two decades ago. Moore's law has been dead for less than a decade. It took 4 years for density to double (from N7 to N3), and with the density, the cost doubled too. It will only get worse from here.


Gordon built something amazing, and I was lucky to have started my career at his creation.

I was fortunate to be able to meet Gordon at his house in Hawaii a few years back, which was an amazing experience for an engineer that grew up learning from his wisdoms and accomplishments.

Rest in peace


I wouldn't be who I am today without Intel enabling what I did in my childhood years, so paying my respects isn't nearly enough to make up.

A big, sincere pressing of the F to pay respects. May he rest in peace.


Well, for me getting started it was Zilog (but the Z80 was based on the Intel 8080), Mostech and Motorola. But Intel had a big input too, and of course an overwhelming influence with the 8086.

And as John Donne said:

"any man's death diminishes me"

but some more than others.


In 2004 or 2005, Gordon came to UCD and gave a talk about computing — both history and future. It was unintentionally intimate: the overhang of the dotcom bust led to the CS department's lowest enrollment in history. Gordon had a really thoughtful response to my question about a particular chart. I remember walking away inspired and excited about the future of the industry I'd soon be working in. All these years later, I'm still just as excited. Thank you, Gordon. RIP


He definitely deserves the memorial banner. But wow, what a well lived life


We should all be so lucky. It's always sad when it happens though. RIP


RIP


I'm a little curious, is there a reason why the ultra rich, once they've cashed out, create foundations that hand out grants? Is it because they suddenly find themselves wanting to do something good for society with the money or just because that is what everyone in that situation does? Is there a ranking of what foundations have done the most good work? Is that ever evaluated?

Anyway rest in peace Gordon Moore, a true captain of industry.


Those donations can often be arranged to be tax-deductible, so instead of letting $Government spend the money, you get to direct its course a little more precisely. At larger income levels they often manage to make a donation pay off as "taxes", but also obligate others of their strata socially, and wield control over the recipients ("we'll sponsor your publication for a year if you don't talk about $controversy").


> why the ultra rich, once they've cashed out, create foundations that hand out grants?

Not qualified to say whether this is exactly the case here, but I think this video about charities/funds as power-wielding vehicles bears some relevance: https://m.youtube.com/watch?v=0Cu6EbELZ6I

Disclaimer: not a big fan of the delivery style in the video.


You've hit on a few reasons. Basically: taxes, no more need for money, desire to give back, desire to have a lasting legacy, societal pressure...

And no, there really is very little work done evaluating private foundations. They operate with very few guardrails and oversight is extremely minimal. It is difficult to evaluate the effectiveness of charities, so how do you evaluate foundations that give grants to charities?

I am an academic that studies foundations and would be happy to talk more about it if you have other questions.


They've been through the struggle themselves, and especially in the tech sector, they must've been very interested in their specific field to devote their life to building a large business around it.

Thus when they find themselves with the time and money to support new innovators, they end up handing out grants as that's an easy way to support fundamental research of the sort that they too once did.


RIP he funded a park near me. Did more than just computers.


His perspective must have been pretty gnarly. Starting from literally the ground floor when it comes to transistors and personal computers - and living to see stuff like the 13900k. RIP.


In 2 years time, he will have died at 188. (edit for maths failure)


Wow

>> By the 50th anniversary of Moore’s Law in 2015, Intel estimated that the pace of innovation engendered by Moore’s Law had ushered in some $3 trillion in additional value to the U.S. gross domestic product over the prior 20 years. Moore’s Law had become the cornerstone of the semiconductor industry, and of the constantly evolving technologies that depend on it.

>> Critical to that engine of growth had been U.S. investment in basic research and STEM education, ten percent of the U.S. Federal Budget in 1968. By 2015, however, that had been reduced to a mere four percent. To Gordon, investment in discovery-driven science was another key impetus behind creating the Gordon and Betty Moore Foundation in 2000, especially in the context of a widening funding gap for something he recognized as critical to human progress.


Oh man, I was just at the Monterey Bay Aquarium earlier today reading a sign that said, "This exhibit made possible by Dr. and Mrs. Gordon Moore".


No tapping on the glass. Moore's law.


Thank you for your contributions Mr. Moore. The world appreciates it. Rest on.


A true legend and a giant in the foundations of multiple tech industries and devices built on semiconductor technology, a co-creator and center of "Silicon Valley", and the creator of Moore's Law.

RIP.


Scroll to 2:20 in the video below

Moore admits as a kid he had an explosives lab at home (unlicensed) back in the day. Even Elon Musk admitted he made pipe bombs as a kid.

Wonder how many similarly promising science kids today would be, or have been, arrested and imprisoned by the goons at Homeland for experiments like these?

https://www.sciencehistory.org/files/scientists-you-must-kno...


Long live Moore's law


Sad to say, I think his law died with him.


One of the earlier scaling laws of computing. Now we're into AI scaling laws.


It has probably always been an exponential; any curve can look like a straight line if you zoom in far enough, and Moore's law was just that. Now we've got some perspective and are starting to see the actual exponential.


Model size will double every 18 months?


And number of jobs automated.


Do you have a source or further info you could share on this?


Moore's law is really dead now


I bet I'm not the only one who was looking through the comments for this.


Moore died so his law could live on.


In death, members of Project Mayhem have names. His was Robert... err, Gordon Moore.


As I grow older I witness that the people that built the computing industry are slowly but surely fading out. The next generation now stands on their shoulders and keeps on building the next generations of computing. Thank you Gordon Moore. RIP.


2500 points. Wow. I haven't seen upvotes like this... ever.

Unmentioned so far: I was watching the University of Utah's 50th Anniversary of CS sessions[1], and (apologies, I forget who) someone mentioned that it was Carver Mead who largely promoted "Moore's Law" as an idea. Just some random trivia, to highlight how such a strong piece of identity around a much more dynamic & wide-ranging man came about.

Hopefully not the "Douglas Engelbart: creator of the mouse" identity.

[1] https://www.youtube.com/watch?v=LUFp6sjKbkE


Oddly, www.intel.com has nothing about this on its Canadian frontpage (https://www.intel.com/content/www/ca/en/homepage.html), but their newsroom does: https://www.intel.com/content/www/us/en/newsroom/home.html



It isn't often that one person, or a group like the "Traitorous Eight" [1], goes on to make entire industries and new platforms. They did it, though, and that included Gordon Moore and Robert Noyce. Moore and Noyce later split from that and founded NM Electronics, which became Intel.

This was back when engineers/product people ran things and competition via skill not just funding was the driving force. Imagine a new company today fully controlled by the engineers/creatives/product people, it happens but not as often. We need to get back to that.

Moore's Law is an interesting case study in creating a term/law that supersedes you and serves your self-interest, but also the interest of the industry and of innovation. The root of Moore's Law was making more products, more cheaply, allowing more people to use computing. [2]

> Prior to establishing Intel, Moore and Noyce participated in the founding of Fairchild Semiconductor, where they played central roles in the first commercial production of diffused silicon transistors and later the world’s first commercially viable integrated circuits. The two had previously worked together under William Shockley, the co-inventor of the transistor and founder of Shockley Semiconductor, which was the first semiconductor company established in what would become Silicon Valley. Upon striking out on their own, Moore and Noyce hired future Intel CEO Andy Grove as the third employee, and the three of them built Intel into one of the world’s great companies. Together they became known as the “Intel Trinity,” and their legacy continues today.

> In addition to Moore’s seminal role in founding two of the world’s pioneering technology companies, he famously forecast in 1965 that the number of transistors on an integrated circuit would double every year – a prediction that came to be known as Moore’s Law.

> "All I was trying to do was get that message across, that by putting more and more stuff on a chip we were going to make all electronics cheaper," Moore said in a 2008 interview.

> With his 1965 prediction proven correct, in 1975 Moore revised his estimate to the doubling of transistors on an integrated circuit every two years for the next 10 years. Regardless, the idea of chip technology growing at an exponential rate, continually making electronics faster, smaller and cheaper, became the driving force behind the semiconductor industry and paved the way for the ubiquitous use of chips in millions of everyday products.

When he did become successful he also gave back.

Moore gave us more. Then when he made it he gave even more.

> During his lifetime, Moore also dedicated his focus and energy to philanthropy, particularly environmental conservation, science and patient care improvements. Along with his wife of 72 years, he established the Gordon and Betty Moore Foundation, which has donated more than $5.1 billion to charitable causes since its founding in 2000. [2]

[1] https://en.wikipedia.org/wiki/Traitorous_eight

[2] https://www.intel.com/content/www/us/en/newsroom/news/gordon...


Rest in peace.

Truly, one of the giants upon whose shoulders we all stand.


Poor guy considering the obesity epidemic.


Ah the creator of the famous Moore’s law. I didn’t realize they were the same person. RIP. An incredible innovator, responsible for so much.


Fuck me I feel old.


how's the scrolling so fast on this website


It's called scroll-jacking, and it made me close the tab within 3 seconds.


It's made by people who have never tested it on a Mac. They tried to "improve" the scrolling by making it "smoother".


Rest in peace sir

Your actions have powered billions of dreamers and saved trillions of hours for mankind.


The age of the titans is fading fast. Another titan's candle is out; I am sad.


I guess we can put Moore's law to its final rest beside him.


I mistakenly immediately thought of the (color)FORTH one.


What a giant in our field. Rest in peace.


RIP to a legend


True pioneer in backdooring CPUs


RIP. Nothing more to add.


*nothing Moore.


Thank you Mr.Moore, RIP.


Moore is no more. :(


May the earth rest lightly on him.


Rest in peace, legend.


RIP to a real one.


Incredibly, his eponymous law held out longer than him. RIP.


RIP to an absolute giant, on whose shoulders we all stand.


A legend. RIP


:(


well dead and dead


I really hope his casket has an Intel Inside sticker on it. RIP


I hope the company is not buried with his passing.


This seems like a good reason to change the bar color on HN.


Yeah, no question. I'm sure it is coming.


As of 00:44 EST, black bar is showing.

Please chill with your inane takes on why this "wasn't fast enough"


I was thinking the same thing. I'm surprised it's taking this long, honestly. He's such an icon and major player in the history of the computer industry.

Edit>> Though I suppose there's a confirmation process HN goes through before sporting the black band.


Let’s not assume badly of them. The real reason is likely that the person who changes it to black (dang?) isn’t around to do it right now.


dang is not permitted to sleep and it can't be reasoned with. It doesn't feel pity, or remorse, or fear. And it absolutely will not stop... ever, until you are dead!


how dare he not be on hacker news on a friday evening


> Edit>> Though I suppose there's a confirmation process HN goes through before sporting the black band.

I'm sure HN is willing to defer that research to someone like Intel in this case.


Wow, you downvoters are harsh. Okay I learned my lesson not to assume someone has a phone or PC next to them and can simply flip a switch in a matter of 30 seconds. Cool.


I once had a nightmare that Knuth died, and at first there was a thin black bar on the HN site. Clicking around, suddenly the whole topcolor turned black. Lastly the entire HN page turned black text on black background. I was just like "it's cool, they're taking this seriously". It was just so weird and believable at the same time.


Agreed


Can we get a black bar, please?


Tragic news


Wait, didn’t someone post about how he was still alive just today?! Way to jinx it!


So Moore's Law is literally dead?

RIP, we will continue to break your law.


So Moore's Law is officially dead.

RIP and we will continue to kill the law. Hope the humor reaches you. Rest well.


> We at Intel remain inspired by Moore’s Law, and intend to pursue it until the periodic table is exhausted.

Interesting way for them to put it; I think I like it.


LOL



