When I asked about this, I was told it was "best practice" and that if they ever needed to scale, there were now many places where they could separate things. I pointed out that, for the target audience, they probably wouldn't need to scale, and that if they ever did, it would be because they were doing 7x the work necessary.
The code was certainly "tidy" from the perspective of the guy who got paid a lot of money to produce architecture diagrams. But it was a nightmare from the perspective of an individual programmer trying to add a feature. They would have been way better off without a lot of quasi-religious design theory slowing them down.
And really, it turns out that most of the time not only do you not need the complex abstractions designed early on, you actually need a different set entirely. That's why keeping the design process continuously grounded in reality is important.
My solution for not producing spaghetti code with this method? I don't release the first version that works. I don't mark the ticket as "done", and I don't even push it out of my local repo. Instead, I clean it, or even straight up rewrite it, until it reaches a sleek and acceptably elegant state. It's the responsibility of a programmer to decide when the code is ready, and it doesn't have to be at the first moment it passes all the tests.
Because that's the time when you've learned to understand the problem, and you already have working code to move around.
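A toy sketch of what that clean-up pass can look like (a hypothetical example, not code from this thread): the first version that passes the tests, followed by the version that actually leaves the local repo, with the same behavior but written with the understanding gained from the draft.

```python
from collections import Counter

# First draft: it works, but it grew organically while I was still
# figuring out the problem.
def top_words_draft(text, n):
    words = text.lower().split()
    counts = {}
    for w in words:
        if w in counts:
            counts[w] = counts[w] + 1
        else:
            counts[w] = 1
    pairs = []
    for w in counts:
        pairs.append((counts[w], w))
    pairs.sort()
    pairs.reverse()
    result = []
    for i in range(n):
        if i < len(pairs):
            result.append(pairs[i][1])
    return result

# The rewrite I'd actually push: same observable behavior, but the
# intent is now obvious at a glance.
def top_words(text: str, n: int) -> list[str]:
    counts = Counter(text.lower().split())
    return [word for word, _ in counts.most_common(n)]
```

The point isn't the specific code, it's that the second version is only easy to write once the first one exists and the problem is understood.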
On reflection, I've had a similar process, too, though now I'm at least consciously aware of it as an actual development strategy rather than just believing I'm a shitty programmer who's forced to rewrite things because he can't fix his prior horribly-broken implementations, lol.
It's a long way from printf to framebuffer pixels. There are good reasons for every layer in between, too. (I like C compilers, format strings, and buffered I/O; I like file descriptor semantics; I like having an operating system that provides terminal emulation and framebuffer text rendering services!)
So I'm fine with 15 layers of abstraction to print hello world.
I'm not saying abstraction is bad - just that it's constraining, and prematurely introducing a whole ladder of constraints will grind code evolution to a halt.