I don't know, there may come a time. If we were just replacing the current system, then yes, that would be impossible or a waste of time. But what if an architecture came about built on AI? Maybe quantum computing? DNA-based computers? Eventually there will be new hardware platforms that force the very change you are dismissing, even if it's 25-50 years from now, which is a blink of an eye in the grand scheme of things.
I'm a bit confused. The architecture is just abstracted away, so why does a programmer care whether it's optical or DNA or whatever? For a less extreme example, look at the tools people use to develop for a single x86 machine vs. CUDA vs. massive clusters vs. huge FPGA clusters vs. ARM, and so on. I honestly think a revolutionary architecture will just lead to some new libraries and dev tools that get kludged onto existing dev systems. But I welcome more information, because I know I could be horribly wrong.
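To make that concrete, here's a minimal sketch (in Python, with made-up class names, not any real library's API) of what "abstracted away" means in practice: the user code talks to a small interface, and only the backend behind it would change if the hardware did.

    # Hypothetical sketch: the programmer targets a tiny interface, and the
    # hardware-specific part lives behind it. CpuBackend / ExoticBackend are
    # invented names for this example, not a real library.
    from abc import ABC, abstractmethod

    class Backend(ABC):
        @abstractmethod
        def dot(self, a, b):
            """Dot product of two equal-length sequences."""

    class CpuBackend(Backend):
        # Plain Python on whatever CPU we happen to run on (x86, ARM, ...).
        def dot(self, a, b):
            return sum(x * y for x, y in zip(a, b))

    class ExoticBackend(Backend):
        # Placeholder for some future device (optical, DNA, quantum, ...);
        # only this class would change, the user-facing call stays the same.
        def dot(self, a, b):
            raise NotImplementedError("no exotic hardware attached")

    def user_code(backend: Backend):
        # The "programmer" side never mentions the architecture at all.
        return backend.dot([1, 2, 3], [4, 5, 6])

    print(user_code(CpuBackend()))  # 32

Swapping in a new backend is exactly the "new libraries and dev tools kludged onto existing dev systems" scenario: the interface survives, the implementation underneath gets replaced.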
I think you're on the right track; software tries to layer itself as much as possible. New architectures and capabilities will only change the concepts and tools exposed by the glue between the layers.
Unless the new thing isn't Turing-complete and can't be emulated by a Turing-complete system, it will be abstracted away at first, just so we have an environment to start building with and can start using it without reinventing every single wheel we have.
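For illustration, here's a toy version of that "abstract it away first" step, assuming a made-up instruction set: any new machine whose behaviour we can describe can be emulated on the hardware we already have, so the first dev environment for it is usually just an interpreter like this.

    # Toy emulator for a hypothetical machine, hosted on existing hardware.
    # The instruction set (set/add/jnz) is invented for this example.
    def run(program, registers=None):
        regs = dict(registers or {})
        pc = 0
        while pc < len(program):
            op, *args = program[pc]
            if op == "set":        # set register to a constant
                regs[args[0]] = args[1]
            elif op == "add":      # add one register into another
                regs[args[0]] += regs[args[1]]
            elif op == "jnz":      # jump to target if register is nonzero
                if regs[args[0]] != 0:
                    pc = args[1]
                    continue
            pc += 1
        return regs

    # Sum 1..5 on the "new" machine, running entirely on the old one.
    prog = [
        ("set", "n", 5),
        ("set", "acc", 0),
        ("add", "acc", "n"),
        ("set", "minus1", -1),
        ("add", "n", "minus1"),
        ("jnz", "n", 2),
    ]
    print(run(prog)["acc"])  # 15

The real hardware would be vastly more capable than this, but the pattern is the same: emulate or wrap it in an environment we can already program, then let native tooling grow out of that.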