RISC-V has terrible instructions-per-clock, and that's where Arm shines over x86. The idea that specialty hardware is an Apple invention, and that you can't get it with x86, is also ludicrous. Take an x86 CPU from AMD or Intel, add an RTX 3090, and you have specialized hardware for ML, graphics, ray tracing, and video encoding/decoding that runs circles around an M1.
I think you’re fundamentally misunderstanding what the article is saying.
The article literally gives several examples of co-processors for x86 going right back to the original 8086 and 8087. Where you get the idea it’s saying Apple invented the concept is kind of confusing.
Also it’s not about RISC-V versus ARM. It’s about ARM chips containing RISC-V cores as local controllers inside the accelerator hardware, and why that makes sense.
RISC-V is designed to be super simple to implement; this means simple instructions that do one thing, like add or multiply. In a more complex design you have add and multiply, but also a combined add-multiply that does both. Having that more complex instruction lets you get more work done in a single instruction, but it requires more silicon.
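As a rough C sketch of that difference (my own illustration, not from the article): fma() from <math.h> computes a*b + c in one step, and on an ISA with a fused multiply-add instruction a compiler can usually lower it to a single instruction, whereas the plain expression may become a separate multiply followed by an add.

```c
/* Sketch only: contrasting separate multiply+add with a fused multiply-add.
 * fma() computes a*b + c with a single rounding; on ISAs with an FMA
 * instruction compilers can typically emit it as one instruction. */
#include <math.h>
#include <stdio.h>

double separate_mul_add(double a, double b, double c) {
    double t = a * b;    /* one multiply instruction */
    return t + c;        /* plus one add instruction */
}

double fused_mul_add(double a, double b, double c) {
    return fma(a, b, c); /* one fused multiply-add */
}

int main(void) {
    printf("%f %f\n", separate_mul_add(2.0, 3.0, 1.0),
                      fused_mul_add(2.0, 3.0, 1.0));
    return 0;
}
```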
If your goal is to build a CPU that is as simple as possible, with as few gates as possible, then RISC-V is great. But if you are trying to get good performance, then RISC-V is a bad design. I think RISC-V is great, and I think it will see a lot of success, because there are many applications for a slow, simple, and free CPU, but it won't compete with x86 and ARM on performance.
I don't think you quite appreciate how cleverly RISC-V has been designed. RISC-V is very well designed to be used with both complex and simple micro-architectures. By combining instruction compression with macro- and micro-op fusion you get everything you claim only x86 and ARM can do, and more. Here is a proper explanation of how:
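To give a rough idea of what macro-op fusion buys you, here is a toy sketch of my own (nothing like a real decoder): the front end pattern-matches an adjacent multiply/add pair that writes the same register and issues it as one fused multiply-add micro-op, so the simple ISA still executes the hot pattern as a single internal operation.

```c
/* Toy model of macro-op fusion: it only shows the idea that a front end can
 * spot an adjacent "mul rd,a,b ; add rd,rd,c" pair and issue it as a single
 * fused multiply-add micro-op, without adding a combined instruction to the
 * ISA itself. */
#include <stdio.h>

typedef enum { OP_MUL, OP_ADD, OP_FUSED_MADD } Op;

typedef struct {
    Op  op;
    int rd, rs1, rs2, rs3; /* rs3 is only used by the fused micro-op */
} Insn;

/* If the pair matches the fusible pattern, write the fused micro-op and
 * return 1; otherwise return 0 and the pair is issued as two ops. */
int try_fuse(const Insn *mul, const Insn *add, Insn *out) {
    if (mul->op == OP_MUL && add->op == OP_ADD &&
        add->rd == mul->rd &&
        (add->rs1 == mul->rd || add->rs2 == mul->rd)) {
        out->op  = OP_FUSED_MADD;
        out->rd  = add->rd;
        out->rs1 = mul->rs1;
        out->rs2 = mul->rs2;
        out->rs3 = (add->rs1 == mul->rd) ? add->rs2 : add->rs1;
        return 1;
    }
    return 0;
}

int main(void) {
    Insn mul = { OP_MUL, 10, 11, 12, 0 }; /* x10 = x11 * x12 */
    Insn add = { OP_ADD, 10, 10, 13, 0 }; /* x10 = x10 + x13 */
    Insn fused;
    if (try_fuse(&mul, &add, &fused))
        printf("fused: x%d = x%d * x%d + x%d (one micro-op)\n",
               fused.rd, fused.rs1, fused.rs2, fused.rs3);
    return 0;
}
```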
How many architectures bother with a non-SIMD multiply-accumulate? I don't think missing the scalar form is a big issue. RISC-V's vector instructions have it.
Sure, if your goal is the longest battery life then you want something else, but this is also the problem with Apple's design: it is designed to be low power.
If you want the best graphics/ML/compute/ray-tracing/video encoding/decoding and you are not running on a battery, then the M1 is not for you. I think this is a big deal, because it means all the innovation on the bleeding edge will happen on platforms other than the Mac.
In Windows/Linux land loads of people are innovating on hardware for all these things, OS features, applications, and peripherals (like VR). On the Mac side almost nothing can happen unless one company makes it happen. A single company has limited resources and, most of all, limited attention span.
The problem with controlling everything and doing everything yourself is that you have to be the best at everything. In the long run you will fail at something. My guess is that the GPU is where Apple will fall behind first, and where it will be most obvious. They simply don't have the muscle or the inclination to go head to head with nVidia on GPUs.
The whole impressive thing about the M1 is that it is low power AND fast. With every other current-gen processor you can only pick one.
> They simply don't have the muscle or the inclination to go head to head with nVidia on GPUs.
I wouldn't bet on this; Apple has a history of barging into markets and becoming a tech leader in them, mainly because they have enough money and clout to attract the best talent in those areas.
No, they have a terrible history of catering to high-end, B2B, and specialty needs.
Remember the Xserve? They killed off most of the high-end video market with Final Cut Pro X. No one in the CGI space will forgive Apple for what they did to Shake. Dropping OpenGL made loads of CAD apps give up on Apple. AAA game developers gave up long ago on Apple making something competitive. Add to that the debacle of the trashcan and the cheese grater Mac Pros, neither of which was anywhere near competitive.
Apple is great at figuring out what the mass market wants; for everyone else they have a very poor track record.
I agree that on the software front it’s a whole different story.
I was thinking about broader categories like music players (initially derided) and phones (that Ballmer interview aged really poorly, and all the large phone manufacturers couldn't believe what was happening). Watches: the media portrayed the Apple Watch as a failure even years after it was a market leader. The M1 is the same story: all the benchmarks were showing that A-series chips had great performance compared to x86, and yet people still somehow managed to be surprised that the M1 is fast.
That's why I wouldn't bet against Apple when it comes to graphics cards. Now, the software story is different. Even if they have the greatest graphics card, the question is "what for?" As you mentioned, they managed to scare off the professionals. And they chased gaming away years ago.