I have never had any brush with the giant, but I have my own little story to share. Way back in 1998, I was a newly minted engineer in India and was visiting Bangalore. While roaming around on MG Road, I picked up a book named “Inside Intel”. It is through that book that I learned about Gordon Moore. On the 48-hour train journey back to Delhi, I devoured the book. It left an impression and I was in awe. It also ignited something in me. I wanted to become a better engineer, so I bought my first computer soon after. I wanted to be in “Silicon Valley”. As fate would have it, 3 years later, I landed there. Little did I know that this serendipity and random inspiration would weave my life story. I remain awestruck by the person and the impact that he left. RIP
In 1999 on MG Road, it was more likely a roadside vendor with pirated books, Higginbotham, or, less likely, Gangarams. Blossom’s (Church St) wasn’t open yet, IIRC.
In the late 90s and early 2000s, I recall that Inside Intel, Lee Iacocca’s book, Who Moved My Cheese, The Seven Habits of Highly Effective People, Jack Welch’s book, etc. were commonly sold, in addition to the latest novels. I even remember buying Sylvia Nasar’s A Beautiful Mind (popular then because of the movie) from a roadside vendor.
On the contrary, the bookshop situation in Bengaluru is exceptional!
The old bookshops have been disrupted. Sapna is one that has adapted well, with a publishing house and a decent web presence, and it has grown too.
For the best technical books, you now go to Tata Book House (there may be others I am unaware of). For novels and such, Blossom's on Church Street is one of the best. Then there are cafe-bookshop types like Crossword, etc.
Kannada and other language book stores are also thriving.
How incredible is it that I'm writing software on a laptop, connected to the internet over wifi, all while being 38,000 feet in the air? And all of this enabled by billions of tiny transistors made from sand. To make it all the more surreal, I'm sitting next to someone who knew him.
I was pushing code while on a transatlantic flight the other day and I got a little overwhelmed with just how cool it was. There I was in a metal tube staying aloft with 2 jet engines, and had wifi at 30k feet while traveling at ~600 mph.
To be fair, "star dust" is not rigorously defined, and the cosmological definition of "dust" includes some pretty big chunks. So after the sun swallows the earth, there's a very good chance we'll qualify. :)
Also, we are more like patterns in the dust, given that most of the atoms in our bodies are swapped for new ones over a couple of years. The patterns may be able to continue separately from the dust.
Is this person someone who is (was) traveling with you, so you already knew that he/she knew Gordon Moore, or did you just get to know this person by random chance?
Given that he/she knew him, can you ask them to share some memories, and post them here?
I once got the opportunity to ask Gordon Moore a question, “what is the equivalent of Moore’s Law for software?” His response, “number of bugs doubles every year”. Great man.
Since IIRC bug count is roughly proportional to source code line count, this implies the aggregate complexity of software doubles too. That might sound implausible, but if you think of duplicated effort and accidental complexity rather than problem-domain complexity, it wouldn't surprise me if it were close to true.
Points to the value of sticking close to the problem domain.
I heard that error count is proportional to LoC plus the number of developers involved in writing it. So as well as the accidental complexity from LoC, the more people touch a code base, the greater the chance of broken communication and misunderstandings, i.e. bugs.
Maybe. In my experience it's common for developers who were not part of the original effort to maintain by addition, because they are worried that changing the existing code might break stuff. So as the maintainer count increases, the code size grows too.
Maybe I'm naive, but I think hardware can't afford such conditions. Even the name is a hint: software was termed "soft" because it was easier to modify, which allows more errors to get in, since you can fix them later at a lower cost.
I'm sure there are tons of bugs; I was told a few stories by people in the aerospace industry. But I think they cannot survive if they don't keep the number ultra low, because they can't fix it after the fact (well, except at the microcode level).
It's a testament to those unsung heroes that most high-level language users can remain blissfully unaware of how much is automatically worked around by compilers in the instructions they emit.
"When PA12 is used as GPIO or alternate function in input or output mode, the data read from flash memory (bank2, upper 1MB) might be corrupted."
I can't even imagine the head-scratching one might go through if hit by this, particularly late in the development process (when the firmware grows larger than 1MB).
Would a fabrication failure be considered a bug? Isn't the rate kind of high for chips that don't make it out of the factory? Not to mention there's also a dead-on-arrival failure rate.
But also, I don't know too much about this, but when Linus Tech Tips did a tour of a chip fabrication plant, I remember learning that apparently the high-end chips are the ones where all the little bits inside work, and if they detect, like, busted wee transistors or whatever, they can just sell that chip as a lower-tier processor. If that's the case, that's like... kinda a bug, but they just "code around it," right?
I would like to propose "Grove's Axiom" which basically states:
If you're failing, pivot to what your replacement will do before you fail.
"In the midst of the doldrums of 1985, Grove posed a hypothetical question to his colleague Gordon Moore: “If we got kicked out and the board brought in a new CEO, what do you think he would do?” Moore answered without hesitation, “He would get us out of memories.” To which Grove responded, “Why shouldn’t you and I walk out the door, come back and do it ourselves?” [1]
People ask me for advice and I immediately realize: why am I not following it myself? I have had multiple people come back years later and tell me that they would never have done X or achieved Y if it weren't for a discussion we had years earlier.
The analytical knowing mind has to drag all this other stuff around!
I would be careful applying this one to imposter syndrome. I think of imposter syndrome as a failure to calibrate your expectations of yourself with the reality of what your work requires. Asking yourself what the "real" person would do that you aren't doing seems to me like it would play into that miscalibration. You might be better off asking _someone else_ what your replacement would do.
I was a kid reading 'Popular Electronics' in the mid 70's in a rural Florida high school. I did an English class presentation on transistors, mentioning Moore's Law. My electronics never got past the Radio Shack 150-in-1 Electronics kit and I went into software. I feel like I owe my career there to the work of the geniuses like Moore who not only made something rare and special, they made it common, nearly universal.
Dad was an electrical automotive mechanic so I didn't have a choice there.
Did EE/CS and have all the standard electronic lab bits from a trinocular microscope w/ polarized ring light to the JBC knock-off soldering station. I keep meaning to buy an HP 54600B to get the Tetris easter egg.
Software and systems are where the money's at now (until the tech singularity ruins it), hardware is just for fun.
These kits were awesome, BTW. I had one too. I know you can find them still on Amazon, but I never got my kid, who's otherwise into computers, interested in them.
Wow, we have lost a true legend. As I get older, I marvel a bit at the giants I once shared time on the earth with. Sometimes it feels like we don't have giants in the 21st century in the same way.
I think it is hard to see giants when they are just innovators.
To everyone around them and especially themselves, they were just pushing the ball forward day-by-day; they weren't heroes or insanely prescient, they were just doing the next logical thing to do. It's only after they've climbed the mountain that you can appreciate what they did.
That's probably the hardest thing about innovation. Everyone who wants to innovate sees only the mountain. They have the grand idea of where they will end up, when the reality of innovation is just pushing the ball forward one step at a time. It's not necessarily heroism or genius; it's persistence and being able to think on the fly.
I think "genius" knows where the general destination is but I feel like most aren't doing it for the destination, they are doing it because they feel it is a calling and what they must do.
I think this is tautological: this is how it must always be, and I would guess it has been the common view at any given point in history.
You can only properly assess and appreciate an impact sufficient to assign 'legend' status to an individual or institution retroactively, decades after the actions that elevated them there.
Not at all. Moore himself has been recognized and celebrated for decades.
But many other such figures achieved 'legendary' status while they were very much alive and in their prime: Edison, Ford, Bell, Wright Bros, Disney, Carnegie, Einstein, Oppenheimer, etc.
Moore himself was a member of an elite crew: one of the "Traitorous Eight" who left their former mentor Shockley (co-inventor of the transistor) to found Fairchild Semiconductor, before spinning out of that enterprise to start their own companies and, in the process, create Silicon Valley proper.
I sometimes make myself a mental list of famous people who are expected to die soon. The following people will all die in the next few years: Jimmy Carter (98), Henry Kissinger (99), Dick Cheney (82), Pope Francis (86), Paul McCartney (80), John Williams (91), William Shatner (92), and Ali Khamenei (83).
Attenborough. That will be the end of our planet's conscience, symbolic of the era where we discovered how truly incredible a place we live in, and what we are doing to it.
I think it'd take more than picture quality to make classic films accessible. Acting styles and pacing especially have changed a ton since the b&w era. It's not to say they're not good, but they're a bit of an acquired taste.
A kid in this house has developed an appreciation for the original Twilight Zone, and enjoyed Casablanca before that. Gotta start early, I guess. TV is scarce in this house, which promotes reading and feeling lucky to see a great movie, no matter the era.
"Sometimes" is the right word. Usually, it's just inferior. B+W was used for artistic purposes only rarely, the rest of the time it was because it was cheap and easy.
Peter Parker for one.
> Hey guys, you ever see that really old movie, The Empire Strikes Back
As a kid I watched many films from my parents' generation. Mary Poppins (1964) and The Great Escape (1963) come to mind as Christmas staples -- The Great Escape on Boxing Day especially.
A generation before that people watched "It's a Wonderful Life" (1946) for much the same reason.
I've subjected my kids to a few old films from well before they were born, but they were films from my youth -- Bill and Ted, Back to the Future, Men in Black, Muppet Christmas Carol etc. And yes they've seen Star Wars, which was from before I was born (just), but that's an ongoing franchise.
I don't see them watching films from the 1960s like I did though, and I doubt they'll make their kids sit through films from the 90s.
> A generation before that people watched "It's a Wonderful Life" (1946) for much the same reason.
Some of us still watch it!
Of course the vast majority of old movies are long forgotten (and no great loss), just as the vast majority of what we're producing today will be. But there are a minority of classics that endure. Just as there's still a market for Shakespeare, Dickens, and Mozart, despite all the plays, books and music that have been written since.
They'll survive due to geopolitics if nothing else; the US government has no choice but to make sure they stay viable, given the potential issues in Taiwan.
This. Anything happening to Taiwan would shake the IT and electronics industries severely. In my opinion we are headed for a "chip COVID" event, as IT wasn't shaken enough by SARS-CoV-2 or Ukraine; I hope I am wrong. Intel's stock price would skyrocket then.
It takes time to understand who the true giants are. Robert Metcalfe just received the ACM Turing Award last week, and his contributions were from the '70s. It's the same with Nobel laureates: it takes years for your work to be recognized.
Not really, no. He may have just recently received an award (he's received many awards) but Metcalfe has been recognized for his achievements for decades. He got the ACM Grace Hopper award in 1980: https://en.m.wikipedia.org/wiki/Grace_Murray_Hopper_Award
People attempted to use dog sleds to traverse up the Antarctic Plateau, which is nearly the same altitude as Tibet, and reach the pole.
Many died in the process; their names are remembered (though they have slid out of public discourse, as it was over a century ago).
In our industry, fortunately, people don't die, but the trailblazers had to do a similar thing: they used the proverbial dog sleds to build a camp in the middle of the Antarctic Plateau.
I've thought about that a lot. Right now, I think it's just not what drives me. I'm motivated by novelty and certain types of experiences. Not really by years-long projects trying to break new ground.
But my comment doesn't come out of a sense of desire to emulate. Just out of a sense that especially in the early-to-mid 20th century, we had these titanic characters: Einstein, Tesla, Von Neumann, The Beatles, and so forth.
In a lot of ways, I think it comes down to these folks existing in the beginnings of the information age, and how many advances were unlocked by the increase in communication speed.
That was probably a lot more groundbreaking than the digital revolution. For all the progress we've seen since I was born in the 80s, not a lot has fundamentally changed in terms of human capability. Although we may see, in the next decade or two, the same sort of paradigm shift.
A giant typically means famous. There are many giants we will never know the names of. For me, I’d rather die famous to my friends and family and have changed the world in my silent way. I’ve been fortunate enough to do it a few times already, and I hope I’ll get another chance before I die, as most people never get the first chance. That’s, I think, a better goal: when a chance comes to change everything, jump into it head first. But don’t bother trying to be famous. It’s meaningless.
Regardless, I bid Dr Moore a fond farewell. He lived to see the horizon of his law.
> There are many giants that we will never know the name of.
And sadly, there are millions living out their lives in poorer parts of the world who might have become giants, if only they'd had access to nutrition, good parenting, education, connections, etc.
When you zoom out on the timeline of humanity, this man, among others, will have left a huge spike in our technological progress. When I look back at the last 50 years and compare it to all known previous human history, I'm quite humbled to be alive during this timeline with them, despite some things I also greatly abhor.
Can you please make your substantive points without name-calling? It's not hard, and we're trying for something a bit different from internet default here.
"When disagreeing, please reply to the argument instead of calling names. 'That is idiotic; 1 + 1 is 2, not 3' can be shortened to '1 + 1 is 2, not 3."
(a bit more explanation if anyone cares: aggressive pejoratives tend to poison curious conversation and are easy to cut, so this isn't a hard one. I think mostly people write like that online because it's how many of us would talk in person with people we're friendly with—and there it's totally normal, because there's already a 'contract' that takes care of understanding each other's good intention. That's absent online, so many commenters are on a hair trigger with this kind of thing and will react in ways that take the conversation to less interesting, more conflictual places.)
Though the basic point that computers keep increasing in processing power still seems to hold, even if it's not just by having more transistors per area.
I don't understand what you linked. Don't they believe in the technological singularity?
I would think an intelligence capable of self-modification could use the scientific method to increase its performance (even if that means scaling in size, like a Matrioshka brain).
> Don't they believe in technological singularity?
“In fact, in a 2005 interview, Moore personally stated that his prediction of exponential growth would not hold indefinitely, simply due to 'the nature of exponentials.'”
What a respect-worthy man, and a lucky man to have been born at the beginning of an age of exponential innovation. Also, from all that I've read about him, he gave back to the world in equal measure and was the best of a past generation of Silicon Valley entrepreneurs.
For anyone who has the interest/time and lives in the Bay Area, watch this PBS program on the history of the "Fairchildren" and the roots of Silicon Valley (Gordon Moore was one of the "Traitorous Eight" who contributed to making SV what it is today). If you live, drive, or work around the area, there are so many landmarks and legacies of this phase of history that you can still see today (but only if you keep an eye out and take a little effort to learn the past).
Weird how this works sometimes… I was just at the Monterey Bay Aquarium for the first time today, saw the name on the auditorium and thought “that Gordon (and Betty) Moore?”
Truly an incredible place, and thanks in very large part to the Moores. RIP Gordon.
The Monterey Bay Aquarium was founded by two daughters of David Packard of Hewlett-Packard fame (along with Stanford professor Chuck Baxter who died last year.) Deep tech roots there.
Wow, this has made me realise that the maths library at Cambridge was also sponsored by them. Never really gave it a thought until now; I just assumed they were some random rich alumni!
RIP. As an ex-Intel employee, this one stings. He was a godly figure internally, and even during Intel's deteriorating years, engineers and technologists looked up to him as a constant north star of inspiration and engineering culture. I hope his passing gives Intel's leadership a long pause and brings back some of the good parts of the engineering culture from the '80s and '90s.
Before anyone says that Moore's law is dead, remember that transistors are still getting smaller and more dense (not to mention more expensive, which was another part of his prediction).
In every thread people confuse Moore's law with Dennard scaling (which states roughly that, as transistors get smaller, their power density stays constant, so that the power use stays in proportion with area; both voltage and current scale (downward) with length.)
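For anyone who hasn't seen it spelled out, here is a rough back-of-envelope sketch of the textbook constant-field (Dennard) scaling argument, with k as the factor by which linear dimensions shrink (my notation, not quoted from the original paper):

  Per-transistor switching power:  P = C * V^2 * f
  Scale linear dimensions by 1/k:  C -> C/k,  V -> V/k,  f -> k*f
  New per-transistor power:        (C/k) * (V/k)^2 * (k*f) = P/k^2
  Transistor area:                 A -> A/k^2
  Power density:                   P/A -> unchanged

Which is roughly why, once supply voltage stopped scaling down with feature size, power density started climbing and clock speeds hit a wall, even as density gains continued for a while.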
Actually, Moore's law is dead. His prediction was not that transistors would get smaller. His initial prediction was that density would double every year for 10 years; later he revised it to a doubling every 18 months.
And this held longer than even he thought. But it has ended. While things still get denser, it's no longer on that same trend.
The prediction wasn't that density would double. It was that "number of transistors on a chip" would double every year, and it was basically just marketing. That actually held up somewhat longer than the density doubling did, since chip area grew to fill the gap.
Smaller transistors is one way to get more transistors per chip. Chips could also get bigger, or the transistors could be packed in more densely (my VLSI is rusty, but not all transistors are minimally sized; a wider transistor can be used to push more power for example).
Chips are not really getting bigger. It's just that very large 'chips' are multiple chiplets packaged into one bigger part. And that's not what Moore was talking about.
I’m just saying it is possible to follow Moore’s law. Physics didn’t limit us, engineering and economics did. Multi-chip packaging is a fantastic response to the engineering/economic problem of low yields.
And how sustainable is making chips bigger, or packing transistors more densely without making them smaller, as a way of getting chips with more transistors?
This is more of an engineering question than a physics one I think.
GPU dies are quite large compared to CPU, ballpark 2X to 3X, so I guess we could get a generation to generation and a half of transistor doubling out of increasing the size of the chips at least.
But the question is, do workloads benefit from all these transistors? Server workloads, I guess so, you can get some pretty big Xeons, or lots of transistors per package with AMD and multi-chip packaging. Consumer workloads, I think the issue is more that consumers don’t have these massively parallel workloads that servers and workstations have. Consumers have stated their preference by not paying for more than a handful of cores per chip.
The obvious way to get more transistors per core is fewer, bigger cores per chip. But changing the cores is a big engineering project. And yield issues are already a big problem, which would only become worse with more complex circuits. But until we end up with one core per chip, I think we have hit an engineering/economics problem rather than a physics one.
And the engineering/economics problem is a really big one, I mean AMD has had great success by going in the opposite direction; breaking things up further with multi-chip packages.
> But the question is, do workloads benefit from all these transistors?
That's not the question at all; this was about transistors getting smaller. Now you're way out in left field, talking about how GPU chips being bigger than CPUs somehow means that transistors don't need to get smaller, plus benefits, workloads, cores per chip, and a lot of other irrelevant, strange goalpost-shifting.
All I said was that transistors are still shrinking and density is increasing, I don't know where all this other stuff is coming from.
Moore himself said (it's mentioned in the link) that the “law” was just supposed to convey that the way to cheaper electronics at the time was to focus on transistors. I don’t think he ever meant it to be a law of physics that stays true 50 or 100 years later. In fact, he said it would hold for 10 years.
This is interesting, I never knew this. Is there somewhere I can read more on his thoughts? I was under the impression (naively so) that this was some invariant that had been somehow proven to an extent.
I searched on "Moore's law is not a law" and found [1], "Moore’s Law Is Not Really A Law But It Is Still Obeyed", from someone who was a semiconductor engineer at Intel 40 years ago. As he says, Moore's law was more a roadmap than a law. Companies strove to keep it going for fear their competitors would succeed while they failed; sort of a self-fulfilling prophecy.
Even if Moore’s law is dead, it lived way longer than any law claiming a doubling every two years has a right to. Compound annual growth of 41% per year for 40+ years is astonishing.
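For anyone who wants the back-of-envelope arithmetic behind that 41% figure (my numbers, not anything from Moore):

  Doubling every 2 years  =>  yearly factor = 2^(1/2) ≈ 1.41, i.e. ~41% per year
  Over 40 years           =>  2^(40/2) = 2^20 ≈ 1,000,000x overall growth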
I think the spirit of Moore's Law (the context in which it is most often used) is better referred to as the Law of Accelerating Returns [1] [2] (e.g. by Ray Kurzweil). Moore's Law is a sub-section (in time and technology) of the Law of Accelerating Returns.
Naw, this is from a quarter century after Moore’s predictions, and Moore’s law had already started slowing down by then. Kurzweil was also selling snake oil (immortality) here, and the proof is in the final figure in the article, it’s 100% misleading and has left me with mistrust for all the rest of what Kurzweil said. Why did he leave out all the population data between 1920 and 2000, data which we have available at our fingertips? Because then he couldn’t have drawn a completely fabricated plot showing continuous lifespan acceleration, acceleration that doesn’t exist. Why does he conflate life expectancy with longevity to imply to the reader that genetics are improving as opposed to, say, the truth about poverty, wars, sanitation, and medicine? Because then it wouldn’t tell an amazing sounding (but completely untrue) story about humans being able to live longer and see the “singularity”. This article proved to me that Kurzweil is a charlatan, which is disappointing because I loved his keynote at Siggraph right around this time where he was pitching these ideas.
Really? I thought the exponential growth of calculations per second per dollar was rather well-established at least from 1900 through 2010 [1], a period of time that is much broader than Moore's Law?
Yes really, there is no law of accelerating returns as far as I’m concerned. Kurzweil clearly tried to opportunistically generalize Moore’s law and take credit for what he thought was a bigger idea, but sadly doesn’t produce a compelling argument, given that 1) Moore’s law died a decade ago, it no longer holds, and cost of calculations per second is no longer decreasing exponentially, and 2) Kurzweil claimed that things like US GDP and human life expectancy are evidence of his so-called law, and these claims are clearly bullshit. And not just wrong but intentionally misleading, that’s the part that bothers me. Did you look at the final figure? It’s not possible to make that diagram or write the accompanying text without knowing you’re lying. How about the first or second ones, wtf even are they? The majority of plots in Kurzweil’s piece, some 20-odd figures, are direct byproducts of Moore’s law from after Moore observed/predicted it, and nothing more.
In case you missed it, my point is not that flops didn’t increased exponentially for most of the last century, it’s that Kurzweil wasn’t the first to observe it, and didn’t add anything rigorous to what Moore & others had already done. Kurzweil only made a mockery of it by fabricating some of the data and using the larger argument to claim we’re headed toward immortality, joining an illustrious list of people who get attention by hawking the fountain of youth.
BTW feel free to re-read Kurzweil’s predictions about computation growth in the article and ask yourself whether they’re true. One of his scheduled predictions is up this year. He’s off by a few orders of magnitude according to his own definitions, and furthermore his definitions are off by a few more orders of magnitude according to other researchers, and on top of that his claim that enough flops equal human brain capability are, to date, false. These things put the remainder of his predictions really far behind schedule, and possibly slipping at an exponential rate, which kinda undermines the whole point, right?
Moore's Law has been dead for a while. The most common confusion I see is that people don't understand he was referring to economical scaling. These days we're getting more density and bigger chips but prices are going up a lot too. More for more instead of more for less.
Dennard scaling died two decades ago. Moore's law has been dead for less than a decade. It took four years for density to double (from N7 to N3), and the cost doubled along with the density. It will only get worse from here.
Gordon built something amazing, and I was lucky to have started my career at his creation.
I was fortunate to be able to meet Gordon at his house in Hawaii a few years back, which was an amazing experience for an engineer that grew up learning from his wisdoms and accomplishments.
Well, for me, getting started, it was Zilog (but the Z80 was based on the Intel 8080), Mostech, and Motorola. But Intel had a big influence too, and of course an overwhelming one with the 8086.
In 2004 or 2005, Gordon came to UCD and gave a talk about computing — both history and future. It was unintentionally intimate: the overhang of the dotcom bust led to the CS department’s lowest enrollment in history. Gordon had a really thoughtful response to my question about a particular chart. I remember walking away inspired and excited about the future of the industry I’d soon be working in. All these years later, I’m still just as excited. Thank you, Gordon. RIP
I'm a little curious: is there a reason why the ultra rich, once they've cashed out, create foundations that hand out grants? Is it because they suddenly find themselves wanting to do something good for society with the money, or just because that is what everyone in that situation does?
Is there a ranking of what foundations have done the most good work? Is that ever evaluated?
Anyway rest in peace Gordon Moore, a true captain of industry.
Those donations can often be arranged to be tax deductible, so instead of letting $Government spend the money you get to direct its course a little more precisely. At larger income levels they often manage to make a donation pay off as "taxes", but also obligate others of their strata socially, and wield control over the recipients ("we'll sponsor your publication for a year if you don't talk about $controversy").
> why the ultra rich, once they've cashed out, create foundations that hand out grants?
Not qualified to say whether this is exactly the case here, but I think this video about charities/funds as power-wielding vehicles bears some relevance:
https://m.youtube.com/watch?v=0Cu6EbELZ6I
Disclaimer: not a big fan of the delivery style in the video.
You've hit on a few reasons. Basically: taxes, no more need for money, desire to give back, desire to have a lasting legacy, societal pressure...
And no, there really is very little work done evaluating private foundations. They operate with very few guardrails and oversight is extremely minimal. It is difficult to evaluate the effectiveness of charities, so how do you evaluate foundations that give grants to charities?
I am an academic who studies foundations and would be happy to talk more about it if you have other questions.
They've been through the struggle themselves, and especially in the tech sector, they must've been very interested in their specific field to devote their lives to building a large business around it.
Thus when they find themselves with the time and money to support new innovators, they end up handing out grants as that's an easy way to support fundamental research of the sort that they too once did.
His perspective must have been pretty gnarly. Starting from literally the ground floor when it comes to transistors and personal computers - and living to see stuff like the 13900k. RIP.
> By the 50th anniversary of Moore’s Law in 2015, Intel estimated that the pace of innovation engendered by Moore’s Law had ushered in some $3 trillion in additional value to the U.S. gross domestic product over the prior 20 years. Moore’s Law had become the cornerstone of the semiconductor industry, and of the constantly evolving technologies that depend on it.
> Critical to that engine of growth had been U.S. investment in basic research and STEM education, ten percent of the U.S. Federal Budget in 1968. By 2015, however, that had been reduced to a mere four percent. To Gordon, investment in discovery-driven science was another key impetus behind creating the Gordon and Betty Moore Foundation in 2000, especially in the context of a widening funding gap for something he recognized as critical to human progress.
A true legend and a giant: foundational to multiple tech industries and devices built on semiconductor technology, a co-creator and central figure of "Silicon Valley", and the creator of Moore's Law.
Moore admitted that as a kid he had an (unlicensed) explosives lab at home back in the day. Even Elon Musk admitted he made pipe bombs as a kid.
I wonder how many similarly promising science kids today would be, or have been, arrested and imprisoned by the goons at Homeland Security for experiments like these?
It has probably always been an exponential; any curve can look like a straight line if you zoom in far enough, and Moore's law was just that. Now we've got some perspective and are starting to see the actual exponential.
As I grow older I notice that the people who built the computing industry are slowly but surely fading away.
The next generation now stands on their shoulders and keeps on building the next generations of computing.
Thank you Gordon Moore. RIP.
2500 points. Wow. I haven't seen upvotes like this... ever.
Unmentioned so far: I was watching the University of Utah's 50th Anniversary of CS sessions [1], and (apologies, I forget who) someone mentioned that it was Carver Mead who largely promoted "Moore's Law" as an idea. Just some random trivia, to highlight how such a strong piece of identity came to be attached to a far more dynamic and wide-ranging man.
Hopefully not the "Douglas Engelbart: creator of the mouse" identity.
It isn't often that one person, or a group like the "Traitorous Eight" [1], goes on to create entire industries and new platforms. They did it, though, and that included Gordon Moore and Robert Noyce. Moore and Noyce later split off and founded NM Electronics, which became Intel.
This was back when engineers/product people ran things and competition via skill, not just funding, was the driving force. Imagine a new company today fully controlled by the engineers/creatives/product people; it happens, but not as often. We need to get back to that.
Moore's Law is an interesting case study in creating a term/law that outgrows you and serves your self-interest but also the interest of the industry and of innovation. The root of Moore's Law was making more products, more cheaply, allowing more people to use computing. [2]
> Prior to establishing Intel, Moore and Noyce participated in the founding of Fairchild Semiconductor, where they played central roles in the first commercial production of diffused silicon transistors and later the world’s first commercially viable integrated circuits. The two had previously worked together under William Shockley, the co-inventor of the transistor and founder of Shockley Semiconductor, which was the first semiconductor company established in what would become Silicon Valley. Upon striking out on their own, Moore and Noyce hired future Intel CEO Andy Grove as the third employee, and the three of them built Intel into one of the world’s great companies. Together they became known as the “Intel Trinity,” and their legacy continues today.
> In addition to Moore’s seminal role in founding two of the world’s pioneering technology companies, he famously forecast in 1965 that the number of transistors on an integrated circuit would double every year – a prediction that came to be known as Moore’s Law.
> "All I was trying to do was get that message across, that by putting more and more stuff on a chip we were going to make all electronics cheaper," Moore said in a 2008 interview.
> With his 1965 prediction proven correct, in 1975 Moore revised his estimate to the doubling of transistors on an integrated circuit every two years for the next 10 years. Regardless, the idea of chip technology growing at an exponential rate, continually making electronics faster, smaller and cheaper, became the driving force behind the semiconductor industry and paved the way for the ubiquitous use of chips in millions of everyday products.
When he did become successful he also gave back.
Moore gave us more. Then when he made it he gave even more.
> During his lifetime, Moore also dedicated his focus and energy to philanthropy, particularly environmental conservation, science and patient care improvements. Along with his wife of 72 years, he established the Gordon and Betty Moore Foundation, which has donated more than $5.1 billion to charitable causes since its founding in 2000. [2]
I was thinking the same thing. I'm surprised it's taking this long, honestly. He's such an icon and major player in the history of the computer industry.
Edit>> Though I suppose there's a confirmation process HN goes through before sporting the black band.
dang is not permitted to sleep and it can't be reasoned with. It doesn't feel pity, or remorse, or fear. And it absolutely will not stop... ever, until you are dead!
Wow, you downvoters are harsh. Okay I learned my lesson not to assume someone has a phone or PC next to them and can simply flip a switch in a matter of 30 seconds. Cool.
I once had a nightmare that Knuth died, and at first there was a thin black bar on the HN site. Clicking around, suddenly the whole topcolor turned black. Lastly the entire HN page turned black text on black background. I was just like "it's cool, they're taking this seriously". It was just so weird and believable at the same time.