>> There will come a point where the accumulated complexity of our existing systems is greater than the complexity of creating a new one. When that happens all of this shit will be trashed.
Amen! (Include in the complexity above the complexities of real life: project schedules, time-to-market, ..., and ultimately economics.)
Ryan claims that the systems are still complex, which suggests that, in general, the accumulated complexity (including project schedules) has NOT yet exceeded the complexity of creating a new one.
Having said this, what is Ryan really saying that tells us something we did not know before?
A tacit assumption in the whole argument is that people are intelligent enough to judge complexity and make rational decisions, and that they would be able to find a simpler solution when creating a new system. Yet even with all the understanding gained from experience, the new solution will still be very complex (just simpler than the existing one). This is to an extent analogous to the rational-market hypothesis, and I doubt it is true.
Next, Ryan may propose a new system written from scratch to satisfy his no-overly-complex goal, only to find that the new software runs on top of existing hardware, which is immensely complex. So he thinks about redeveloping the hardware too, only to find that hardware development is itself immensely complex (EDA tools, for example). So he thinks about redeveloping those as well. He then concludes that the accumulated complexity hasn't become too high after all.
After taking all of that into account, and if that is not complex in itself, find something at an intermediate level (say, a programming language) that has less complexity at that level (going deeper would only increase complexity) and build something on top of it. But isn't this what all of us already do?