
I generally agree although I do think there's too much mythology around Alpha.

What if Intel had said to HP "That's a really dumb plan" when HP brought up the idea of EPIC? I agree: probably not much would have changed. Intel would presumably have accelerated what Gelsinger called, in a magazine article, a 64-bit RAS Xeon. Probably good for Intel and not clearly negative for HP. (Both companies went that route in the end anyway.)

Less money would have been wasted in general but it's not like a company spending $100s of millions on Itanium would have told its engineers and other employees to go fiddle with some other exotic architecture. It probably just wouldn't have hired them in the first place.




Seems almost certain that we would have ended up with something that looks exactly like x86_64 a few years earlier in that case, no? I mean, it's not like AMD invented fundamentally new concepts or anything. The doubled GPR set was likely to happen regardless. The REX mechanism might have worked differently (like moving to a 3-address encoding). Maybe Intel wouldn't have dropped segmentation and we'd have a new selector record in long mode to worry about. Maybe we wouldn't have the sign bit rule in addresses. None of those are design-breakers.
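(For anyone unfamiliar with the "sign bit rule": x86_64 requires virtual addresses to be "canonical," meaning the unused high bits must replicate the top implemented bit. A minimal sketch of that check, assuming 48-bit virtual addresses:)

```python
def is_canonical(addr: int, vbits: int = 48) -> bool:
    """Check the x86_64 canonical-address rule: a 64-bit address is
    valid only if it equals the sign-extension of its low `vbits` bits
    (i.e., bits 63..vbits all copy bit vbits-1)."""
    mask64 = (1 << 64) - 1
    low = addr & ((1 << vbits) - 1)
    # Sign-extend the low vbits bits out to 64 bits.
    if (low >> (vbits - 1)) & 1:
        sign_extended = low - (1 << vbits)
    else:
        sign_extended = low
    return (sign_extended & mask64) == (addr & mask64)
```

So `0x00007fffffffffff` (top of user space) and `0xffff800000000000` (bottom of kernel space) are canonical, while `0x0000800000000000` faults: it's in the "non-canonical hole" the rule carves out of the middle of the address space.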

Again, if no one (including giants like Apple and NVIDIA) has invented new ways to run code in the last 30 years either, it seems like wishful thinking to argue that Intel would have somehow done it had they been less wrong about VLIW.


>Seems almost certain that we would have ended up with something that looks exactly like x86_64 a few years earlier in that case, no?

You're probably right. If Intel had said (internally), "Screw it. Let's just extend x86. It's not like we have anything to fear from AMD," you'd probably have had it a bit earlier. (Though it's not like most buyers of x86-based systems were screaming for 64-bit until into the 2000s anyway, so it's not clear that there would have been a big push to pull the timeframes in by much.)

And Intel would probably have still pushed too hard on frequency and ILP--because of Microsoft, if a senior Intel exec I knew who isn't prone to lying is to be believed.

Also, yeah. Apple Silicon is great but it's not new paradigm great. If anything, the revolution has been in heterogeneous computing (esp. GPUs to date). And a lot of that has been partly enabled by open source.



