I do agree that this won't end soon, though. It appears to me that many of the methods CPUs use for better performance are fundamentally flawed from a security standpoint, and it's not like we can expect the millions of affected machines to be upgraded to mitigate this.
I don't think you're giving them enough credit. The actual microarchitecture isn't documented much in those manuals, so looking hard at them wouldn't help without making a series of assumptions about how it all works. The authors of recent exploits have been diligently reverse engineering and making sensible guesses.
Your argument sounds like an argument against responsible disclosure altogether.
Don't forget that Hope was also released when the box was opened.
(FYI, I've been a security researcher for 15+ years and work as the head of hacker education for HackerOne; I am very, very pro-disclosure. :) )
My biggest worry is that all currently known classical "secure" data sets, including encrypted but recorded internet communication, will become an open book a few decades from now. What insights will the powers that be choose to draw from it then, and how will that impact our future society? Food for thought.
You can have unintentional exploits/vulnerabilities in free/open source software or hardware too.
All “secrets” are eventually revealed; security is about managing the risks and timing associated with those revelations.
"Obscurity" general refers to situations where "everything" is confidential. And when everything is confidential priority one, nothing is, since people can't work like that.
Cryptography attempts to sequester the confidential data into a small number of bytes that can be protected, leaving the larger body of data (say, the algorithm) non-confidential.
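To make that concrete, here's a minimal Python sketch (standard library only; the 32-byte key size and the message are just placeholder choices) of confining the confidential material to a small key while the algorithm itself, HMAC-SHA256 in this case, stays completely public:

    import hmac
    import hashlib
    import secrets

    # Kerckhoffs-style separation: the algorithm (HMAC-SHA256) is published
    # and assumed known to everyone; the only thing that must stay
    # confidential is this small key.
    key = secrets.token_bytes(32)

    message = b"publicly visible protocol data"

    # Anyone can read the message and knows exactly how the tag is computed,
    # but without the 32 secret key bytes they cannot forge a valid tag.
    tag = hmac.new(key, message, hashlib.sha256).hexdigest()

    def verify(key: bytes, message: bytes, tag: str) -> bool:
        # Constant-time comparison to avoid leaking information via timing.
        expected = hmac.new(key, message, hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    print(verify(key, message, tag))  # True

The point being that an attacker can read the message, the code, and the spec and still learn nothing useful without those 32 bytes, which is exactly the part small enough to realistically keep confidential.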
Security _ONLY_ through obscurity is not security. Obscurity is a perfectly valid layer to add to a system to help improve your overall security.