I'm frankly amazed that the mitigations so far have not already been disastrous to performance. Either we accept drastically less performance from now until basically forever, or we adopt a more fine-grained security model.
How can you trust any of them?
Spectre and the like need to be dealt with on everything that isn't strictly an embedded fixed-code device.
Spectre and pals can jump between processes so easily that it's no longer funny.
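That cross-boundary reach all starts from the same basic gadget shape. Here's a minimal sketch of the Spectre v1 (bounds-check bypass) pattern — names like `victim_function`, `array1`, and `array2` are invented for illustration, and there's no timing side channel here, just the code shape the CPU mis-speculates on:

```c
#include <stdint.h>
#include <stddef.h>

#define ARRAY1_SIZE 16
static uint8_t array1[ARRAY1_SIZE] = {1, 2, 3, 4, 5, 6, 7, 8,
                                      9, 10, 11, 12, 13, 14, 15, 16};
/* Probe array: one cache line (512 bytes here) per possible byte value. */
static uint8_t array2[256 * 512];

/* The classic vulnerable pattern: an attacker trains the branch
   predictor so that, while `x < ARRAY1_SIZE` is still resolving, the
   CPU speculatively runs the body with an out-of-bounds x.  The
   secret-dependent load from array2 leaves a cache footprint that a
   later timing measurement (flush+reload etc.) can recover, even
   though the architectural result is discarded. */
uint8_t victim_function(size_t x) {
    if (x < ARRAY1_SIZE) {
        return array2[array1[x] * 512];
    }
    return 0;
}
```

Architecturally the function is perfectly correct — in-bounds calls return the probe-array byte, out-of-bounds calls return 0 — which is exactly why this class of bug is so hard to fix in software.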
Those microcode updates and kernel fixes are anything but easy, and anything but lightweight. Context switching overhead is a huge contributor to lag and general inefficiency in interactive systems. Every kind of Spectre mitigation increases that overhead. Every further mitigation wrecks more of the performance advantages that caches have historically provided.
From a performance, power consumption and efficiency perspective, Spectre is a disaster. Mitigations just move the problem from a security and privacy one to an efficiency and performance one.
I highly doubt that the market is big enough to make such a product viable.
This would have an obvious impact on code size/cache efficiency, and on the number of registers used to store intermediate results, but how much would this hurt performance, depending on the use case?
That's really not that big a deal compared to slowing everything down again. Modern computers are slow enough thanks to the mountains of abstraction developers insist on using (web included!), the realtime AV scanner, ads, user tracking, and just general software bloat.