I'm sure we agree, but a caveat like "if properly implemented" is quite a big one to make in this business - bad stuff happens largely because of implementation bugs, and we don't know how to eliminate implementation bugs. That's one reason we see successful attacks on systems with multiple "sound if properly implemented" layers of security.
I remember reading elsewhere that bad stuff happens largely because of incomplete or conflicting requirements, and that implementation bugs are secondary. My own experience confirms this, even though I usually deal with systems where a problem at worst results in lost sales or orders, and where sloppy coding and implementation bugs are accordingly much more common.
Makes me wonder how well-polished the requirements analysis for (nuclear) power plant software is...
There is no perfect security. What I mean is that if you design a plant so that its nuclear safety and security don't rely on cybersecurity measures or digital automation, it's secure from cyberattacks.
We know how to eliminate almost all implementation bugs. I hope nuclear power plants are one of the areas where people are willing to pay for that level of correctness.