The problem with this approach is that it does not scale to large systems. If you don't spend much time thinking in the abstract about how the system will work and what might go wrong, then, by the time you have written enough code to find out, you may have gone a long way down the wrong path, and not all architectural-level mistakes and oversights can be patched over.
No one does this perfectly -- even people using formal methods overlook things -- but on a big project, if you don't put much effort into thinking ahead about how it should work, and into identifying problems before you have coded them, you are likely to end up where many projects in fact find themselves: with something that is nominally close to completion but very far from working. Those that are not canceled end up looking like legacy code even when brand new.
Big projects should be cut into smaller pieces, each of which can be relatively easily rewritten.
To come up with the right smaller pieces, you have to think about how they will work together to achieve the big picture. That means interfaces and their contracts, and if you get those wrong, you end up with pieces that don't fit together and do not, collectively, get the job done.
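A toy sketch of the failure mode (all names hypothetical): two pieces can each be correct in isolation, yet a misunderstood contract between them means they don't collectively get the job done.

```python
# Hypothetical contract mismatch between two modules of a split-up system.
# The author of count_users assumed fetch_users returns ALL users;
# the author of fetch_users implemented it to return one page at a time.

def fetch_users(page=0, page_size=2):
    """Module B's contract: return a single page of users."""
    users = ["ann", "bob", "cid", "dee"]  # stand-in for a real data store
    return users[page * page_size:(page + 1) * page_size]

def count_users():
    """Module A: silently assumes fetch_users() returns everyone."""
    return len(fetch_users())  # undercounts: sees only the first page
```

Each module passes its own unit tests; the bug only exists at the interface, which is why the contracts have to be thought through before the pieces are built.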
Big problems cannot be effectively solved in a bottom-up manner, and perhaps the most pervasive fallacy in software development today is the notion that the principle of modularity means you only have to think about code in small pieces.
What do you think other engineering disciplines do? They create a proof of concept, verify that it works, and then create the real thing. That is why "real" engineering companies have hundreds of tools to test stuff.
I really don't understand why people want software to be different. If you're writing some shitty throwaway web app, then sure, go ahead and don't prototype anything; just hire a "software architect" who designs something and use that.
But if you want something that actually works, that approach is completely useless. Prototype, verify, start over if necessary. That is the way to write quality software.
That's beside the point. The point is that coding is not the only path to verification, especially at the architectural level.
> I really don't understand why people want software to be different.
It seems to be you who wants software to be different. Making physical prototypes is expensive and time-consuming, so engineers try to look ahead and anticipate problems. Prototyping in software is cheaper, but not so cheap (especially at the architectural level) that thinking ahead isn't beneficial.