* For a new software approach to be recognized as valuable, it needs appropriate hardware to exist at the right time
* As Moore's law fades away, we can no longer rely on steady performance improvements for generalist CPUs, so specialized HW will make a comeback
An interesting note:
> Hardware is only economically viable if the lifetime of the use case lasts more than 3 years
I admit I stopped reading at section 5.
I think that was meant to be the main point of the article, but I do think section 5 is way more interesting than 6.
That wasn't the only typo there, but this one does a lot of damage.
I.e. you can make some great hardware and software that perform really efficiently today. But if you later try to make a v2 of that system and want to reuse the same software, then sorry, you're out of luck!
Also, while compilers today are very complex, they are still far from what's necessary to really make good use of a VLIW machine.
If someone manages to solve the above two problems, then VLIW will 100% wipe the floor with current architectures. Being able to throw out all the logic that tries to extract parallelism from a serial instruction stream would bring massive power and area savings for the same computation throughput.
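To make the "v2" problem above concrete, here's a toy sketch (my own illustration, not from the article) of why VLIW binaries are brittle: the compiler packs independent ops into bundles assuming fixed functional-unit latencies, and the hardware has no interlocks to catch a wrong assumption. The latencies and bundle format here are invented for the example.

```python
LOAD_LATENCY_V1 = 2  # cycles the compiler assumed when it built the schedule

# Each bundle issues in one cycle; slots are (op, dest, srcs).
bundles = [
    [("load", "r1", ["a"]), ("load", "r2", ["b"])],  # cycle 0
    [("nop",)],                                       # cycle 1: wait for loads
    [("add", "r3", ["r1", "r2"])],                    # cycle 2: loads done... on v1
]

def run(bundles, load_latency, mem):
    regs, ready_at = {}, {}
    for cycle, bundle in enumerate(bundles):
        for slot in bundle:
            if slot[0] == "load":
                _, dest, (addr,) = slot
                regs[dest] = mem[addr]
                ready_at[dest] = cycle + load_latency
            elif slot[0] == "add":
                _, dest, srcs = slot
                # A real VLIW has no interlocks: reading a register before its
                # latency has elapsed silently yields a stale/garbage value.
                if any(ready_at.get(s, 0) > cycle for s in srcs):
                    regs[dest] = None  # models garbage on mismatched hardware
                else:
                    regs[dest] = sum(regs[s] for s in srcs)
    return regs["r3"]

mem = {"a": 1, "b": 2}
print(run(bundles, load_latency=2, mem=mem))  # v1 silicon: 3, schedule is valid
print(run(bundles, load_latency=3, mem=mem))  # "v2" silicon: None, must recompile
```

A superscalar CPU hides this by rediscovering the dependencies at runtime, which is exactly the logic (and power/area) a VLIW design hopes to delete.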
Similarly, the high cost of memory page systems and context switches is an artifact of executing unvetted code. When everything untrusted is compiled in a way that it physically cannot breach memory boundaries, the need for the OS and programs to live in separate memory spaces evaporates, along with its overhead.
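One concrete form of "compiled so it physically cannot breach memory boundaries" is software fault isolation, where the compiler emits an address mask before every memory access, roughly as WebAssembly-style runtimes do. This is my own minimal sketch (the sandbox size and function names are invented), not anything from the comment above:

```python
SANDBOX_BITS = 16                     # toy 64 KiB sandbox
MASK = (1 << SANDBOX_BITS) - 1

memory = bytearray(1 << SANDBOX_BITS)

def sandboxed_store(addr: int, value: int) -> None:
    # The compiler inserts this mask on every access, so even a wild
    # pointer can only ever land inside the sandbox's own bytes.
    memory[addr & MASK] = value & 0xFF

def sandboxed_load(addr: int) -> int:
    return memory[addr & MASK]

sandboxed_store(0x12345, 42)          # "out of bounds" address...
print(sandboxed_load(0x2345))         # ...wraps to 0x2345 inside the sandbox: 42
```

With every untrusted access fenced this way at compile time, containment no longer needs the MMU, page tables, or a privilege-switching kernel boundary.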
Hardware and software co-evolve! There is no such thing as people creating hardware ex-nihilo. This reads more like a researcher venting about his limited budget and development capacities.
Obviously they co-evolve, and the consequence is deeply sub-optimal systems, as a result of essentially random, momentary conditions.
Though since not all the optimization parameters are known for complex systems with multiple stakeholders, or at least not known outside a select few (which is exactly the position academic researchers are in), we shouldn't be too hasty in declaring it definitely deeply sub-optimal.
For example, there could be unknown, private criteria that have been highly optimized for.
Those last maintain backward compatibility with a different 1980s design, albeit with another break at 64 bits that sacrificed the worst aspects.
Essentially all mainstream processors emulate what amounts to a hypertrophied PDP-11, in order to produce good scores on benchmarks coded in C. A better language that does not attempt to model C might be able to use better processor designs that C cannot fully exploit, but we have no practical way to do the experiment.
Today machine learning is used inside servers to produce multiple responses at the same time.
This batching exists because it's easier to make money selling plenty of low-quality decisions (ads) than a few high-quality ones.
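A back-of-envelope sketch of why servers batch inference at all: a forward pass has a large fixed cost (kernel launches, streaming weights) that gets amortized across the batch, so throughput rises while each individual request waits longer. The numbers below are invented purely for illustration:

```python
FIXED_OVERHEAD_MS = 10.0   # per-forward-pass cost: kernel launch, weight streaming
PER_ITEM_MS = 1.0          # marginal cost of one extra request in the batch

def serve(batch_size: int):
    total = FIXED_OVERHEAD_MS + PER_ITEM_MS * batch_size
    throughput = batch_size / total * 1000   # requests per second
    latency = total                          # every request waits for the whole batch
    return throughput, latency

for b in (1, 8, 64):
    tput, lat = serve(b)
    print(f"batch={b:3d}  throughput={tput:7.1f} req/s  latency={lat:5.1f} ms")
```

The trade-off is visible immediately: bigger batches multiply requests served per second at the cost of per-request latency, which suits high-volume, latency-tolerant workloads like ad ranking.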
But to grasp the bigger picture, there is also the fact that silicon chips are hard to develop in a DIY fashion. Our economic models have made it so that the whole silicon industry is based on trade secrets to create barriers and incremental improvements following Moore's Law for more than fifty years.
You see, to build a chip, you need everything to be perfect: pure sand crystals, dangerous chemicals, very small features. Everything engineered to the atom and orchestrated to perfection. It makes great products to sell for years.
But you see, this perfection has a price. Everything must lie flat in 2D. Everything must be built. And this is where the flash crash happens. Because the alternative route is vastly superior and evident in hindsight. So vastly superior and evident that the secret is harder to keep. We are even purging our biological ecosystem to keep the secret.
The technological future of computing is in self-assembling nano computing units. You tell the computer units how to build more of themselves. That's how nature has done it for millions of years. Science fiction calls them nanites. You can even reuse existing DNA factories. Or you can bootstrap from scratch. When you see that a typical virus is like 32 kB, how can you imagine that, with the right resources, you couldn't write a self-replicating 3D grey-goo liquid?
If you need more computing power it's just a matter of giving it more energy and material. Chemistry scales a lot better. And in the battle of exponential curves, it's a winner takes all market.
Computing power is a resource the same way oil, rare-earth minerals, and steel are. It has been kept under control for fifty years. You need to understand that there is a balance to be struck between enjoying the benefits of technology and the stability of the economy.
That's why Moore's law died, we killed it to keep control.