A "well-structured" database can lead to performance issues. Every join and sub-query has a cost, and they add up.
The funny thing is that on this project, I largely stepped back from anything to do with the database, because the target was only around 100 users on a deployment that could largely be handled by faster hardware. Now we're stuck trying to scale beyond that, and stuck with layers of procedures and functions that don't perform well at all. I wanted a few loosely coupled tables with some denormalized (JSON) data in them; we wouldn't be having half the issues today.
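A minimal sketch of the "few loosely coupled tables with JSON" idea, using stdlib sqlite3. The table and column names are invented for illustration, not from the original project.

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, profile TEXT)")

# Rarely-queried, frequently-changing attributes live in one JSON blob
# instead of being normalized into half a dozen side tables.
profile = {"display_name": "Ada", "theme": "dark", "tags": ["admin", "beta"]}
conn.execute("INSERT INTO users (email, profile) VALUES (?, ?)",
             ("ada@example.com", json.dumps(profile)))

# One single-table read replaces what would otherwise be a multi-join query.
row = conn.execute("SELECT email, profile FROM users WHERE id = 1").fetchone()
email, profile = row[0], json.loads(row[1])
print(email, profile["display_name"])  # ada@example.com Ada
```

The trade-off is that you give up querying inside the blob efficiently, so this only works for data you mostly read and write whole.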
Also, not a fan of ORMs... it's usually easier to do the mapping directly, or to use a lightweight data mapper that isn't a full-on ORM.
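What a "lightweight data mapper" could look like, as a sketch: plain functions that translate between rows and a dataclass, with no ORM layer in between. All names here are hypothetical.

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class User:
    id: int
    email: str

def row_to_user(row):
    # The entire "mapping layer": one explicit row-to-object function.
    return User(id=row[0], email=row[1])

def fetch_user(conn, user_id):
    row = conn.execute(
        "SELECT id, email FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    return row_to_user(row) if row else None

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users (email) VALUES (?)", ("ada@example.com",))
print(fetch_user(conn, 1))  # User(id=1, email='ada@example.com')
```

You write the SQL yourself, so there's no query generation to fight, and the mapping stays readable.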
Almost every app I have worked on had a database where a single object was spread across several tables, with several more joined in just for the WHERE clause. As logical as the table layout seemed, it just led to slow queries; like you said, joins and sub-queries come at a cost. Then again, I have seen clients balk at the idea of fixing the database and insist on a less readable but more performant query instead. But it's their time and money, so I don't fight back.
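A hedged sketch of that pattern: one "object" spread across tables, with another table joined in only for filtering. The schema is invented; real ones tend to have many more joins.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders     (id INTEGER PRIMARY KEY, customer_id INTEGER);
CREATE TABLE order_meta (order_id INTEGER, note TEXT);
CREATE TABLE customers  (id INTEGER PRIMARY KEY, region TEXT);
INSERT INTO orders     VALUES (1, 10);
INSERT INTO order_meta VALUES (1, 'rush');
INSERT INTO customers  VALUES (10, 'EU');
""")

# Reassembling one order already touches three tables; each join has a cost.
sql = """
SELECT o.id, m.note
FROM orders o
JOIN order_meta m ON m.order_id = o.id
JOIN customers  c ON c.id = o.customer_id  -- joined only for the WHERE
WHERE c.region = 'EU'
"""
print(conn.execute(sql).fetchall())  # [(1, 'rush')]
```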
Sometimes, a really good DBA is what a project needs because so few programmers are good at performant design.
Moving from normalization to denormalization can be hard, and you end up with a sui generis application. With a normalized database, by contrast, so much has been written about how to deal with problems up to a reasonably large scale. Then again, the document-store style does seem reasonably popular, so maybe I'm missing something.